STAT 380 Lecture 4

Reading for Today's Lecture: Chapters 2, 3

Goals of Today's Lecture:

Example 3: Mean values

$Z_n$ = total number of sons in generation $n$.

$Z_0=1$ for convenience.

Compute ${\rm E}(Z_n)$.

Recall definition of expected value:

If X is discrete then

\begin{displaymath}{\rm E}(X) = \sum_x x P(X=x)
\end{displaymath}

If X is absolutely continuous then

\begin{displaymath}{\rm E}(X) = \int_{-\infty}^\infty x f(x) dx
\end{displaymath}
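For instance, if X has the exponential density $f(x)=\lambda e^{-\lambda x}$ for $x \ge 0$, then

\begin{displaymath}{\rm E}(X) = \int_0^\infty x \lambda e^{-\lambda x} dx = \frac{1}{\lambda}
\end{displaymath}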

Theorem: If Y=g(X) and X has density f, then

\begin{displaymath}{\rm E}(Y) = {\rm E}(g(X)) =\int g(x) f(x) dx
\end{displaymath}
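For instance, if X is uniform on (0,1) (so f(x)=1 there) and $Y=g(X)=X^2$, then

\begin{displaymath}{\rm E}(X^2) = \int_0^1 x^2 dx = \frac{1}{3}
\end{displaymath}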

Key properties of ${\rm E}$:

1: If $X\ge 0$ then ${\rm E}(X) \ge 0$, with equality iff P(X=0)=1.

2: ${\rm E}(aX+bY) = a{\rm E}(X) +b{\rm E}(Y)$.

3: If $0 \le X_1 \le X_2 \le \cdots$ then

\begin{displaymath}{\rm E}(\lim X_n) = \lim {\rm E}(X_n)
\end{displaymath}

4: ${\rm E}(1) = 1$.
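In particular, applying property 2 repeatedly: if $X_1,\ldots,X_k$ each have mean $\mu$ then

\begin{displaymath}{\rm E}(X_1 + \cdots + X_k) = {\rm E}(X_1) + \cdots + {\rm E}(X_k) = k\mu
\end{displaymath}

This is the sort of step needed for Example 3, where the $X_i$ will be the numbers of sons of the individuals in a generation.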

Conditional Expectations

If X and Y are two discrete random variables then

\begin{displaymath}{\rm E}(Y\vert X=x) = \sum_y y P(Y=y\vert X=x)
\end{displaymath}
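A sketch for Example 3: if the $Z_1=k$ members of generation 1 have numbers of sons $X_1,\ldots,X_k$, each with mean $\mu = {\rm E}(Z_1)$ (as in the usual branching process set-up, where each individual reproduces independently of the others), then

\begin{displaymath}{\rm E}(Z_2 \vert Z_1 = k) = {\rm E}(X_1 + \cdots + X_k) = k\mu
\end{displaymath}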

Extension to absolutely continuous case:

Joint pmf of X and Y is defined as

\begin{displaymath}p(x,y) = P(X=x,Y=y)
\end{displaymath}

Notice: The pmf of X is

\begin{displaymath}p_X(x) = \sum_y p(x,y)
\end{displaymath}

Analogue for densities: joint density of X,Y is

\begin{displaymath}f(x,y) dx dy \approx P(x \le X \le x+dx, y \le Y \le y+dy)
\end{displaymath}

Interpretation is that

\begin{displaymath}P(X \in A, Y \in B) = \int_A \int_B f(x,y) dy dx
\end{displaymath}

Property: if X,Y have joint density f(x,y) then X has density

\begin{displaymath}f_X(x) = \int_y f(x,y) dy
\end{displaymath}
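For instance, if X and Y have joint density $f(x,y) = x+y$ on the unit square $0 \le x,y \le 1$ (and 0 elsewhere), then

\begin{displaymath}f_X(x) = \int_0^1 (x+y) dy = x + \frac{1}{2}, \qquad 0 \le x \le 1
\end{displaymath}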

Sums for discrete rvs are replaced by integrals.
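Back to Example 3, a sketch of where these definitions lead, assuming each individual's number of sons has mean $\mu$ independently of all the others: conditioning on $Z_{n-1}$ as above and using the double expectation formula ${\rm E}(Y) = {\rm E}[{\rm E}(Y\vert X)]$,

\begin{displaymath}{\rm E}(Z_n) = {\rm E}\left[{\rm E}(Z_n \vert Z_{n-1})\right]
= {\rm E}(\mu Z_{n-1}) = \mu {\rm E}(Z_{n-1})
\end{displaymath}

so that, by induction and $Z_0=1$, ${\rm E}(Z_n) = \mu^n$.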


Richard Lockhart
2000-10-02