STAT 380 Lecture 5

Reading for Today's Lecture: Chapter 3

Goals of Today's Lecture:

Conditional Expectations

If X and Y are two discrete random variables then

\begin{displaymath}{\rm E}(Y\vert X=x) = \sum_y y P(Y=y\vert X=x)
\end{displaymath}

Extension to absolutely continuous case:

Joint pmf of X and Y is defined as

p(x,y) = P(X=x,Y=y)

Notice: The pmf of X is

\begin{displaymath}p_X(x) = \sum_y p(x,y)
\end{displaymath}
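
As an illustration (not part of the original notes), here is a small Python sketch with a made-up joint pmf; it computes the marginal pmf of X and ${\rm E}(Y\vert X=x)$ directly from the two formulas above:

\begin{verbatim}
# Hypothetical joint pmf p(x,y) on a small grid, stored as a dict.
p = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def p_X(x):
    # Marginal pmf of X: sum the joint pmf over y.
    return sum(prob for (xx, y), prob in p.items() if xx == x)

def cond_exp_Y_given_X(x):
    # E(Y | X = x) = sum_y y * P(Y = y | X = x)
    #              = sum_y y * p(x, y) / p_X(x)
    return sum(y * prob for (xx, y), prob in p.items() if xx == x) / p_X(x)

print(p_X(0))                  # 0.3
print(cond_exp_Y_given_X(0))   # 0.2 / 0.3 = 2/3
\end{verbatim}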

Analogue for densities: joint density of X,Y is

\begin{displaymath}f(x,y) dx dy \approx P(x \le X \le x+dx, y \le Y \le y+dy)
\end{displaymath}

Interpretation is that

\begin{displaymath}P(X \in A, Y \in B) = \int_A \int_B f(x,y) dy dx
\end{displaymath}

Property: if X,Y have joint density f(x,y) then X has density

\begin{displaymath}f_X(x) = \int_y f(x,y) dy
\end{displaymath}

Sums for discrete rvs are replaced by integrals.

Example:

\begin{displaymath}f(x,y) = \begin{cases}x+y & 0 \le x,y \le 1
\\
0 & \text{otherwise}
\end{cases}\end{displaymath}

is a density because
\begin{align*}\iint f(x,y)dx dy & = \int_0^1\int_0^1 (x+y) dy dx
\\
& = \int_0^1 x dx + \int_0^1 y dy
\\
& = \frac{1}{2} + \frac{1}{2} = 1
\end{align*}
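
A numerical sketch of this calculation, assuming scipy is available; it also illustrates the interpretation $P(X \in A, Y \in B) = \int_A\int_B f(x,y)\,dy\,dx$ for this density:

\begin{verbatim}
from scipy.integrate import dblquad

f = lambda y, x: x + y          # dblquad integrates over the first argument (y) inside

# Total mass over the unit square: should be 1.
total, _ = dblquad(f, 0, 1, lambda x: 0, lambda x: 1)
print(total)                     # 1.0 (up to numerical error)

# P(X <= 1/2, Y <= 1/2) = int_0^{1/2} int_0^{1/2} (x+y) dy dx = 1/8
prob, _ = dblquad(f, 0, 0.5, lambda x: 0, lambda x: 0.5)
print(prob)                      # 0.125
\end{verbatim}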

The marginal density of X is, for $0 \le x \le 1$,
\begin{align*}f_X(x) & = \int_{-\infty}^\infty f(x,y)dy
\\
& = \int_0^1 (x+y) dy
\\
& = \left.(xy+y^2/2)\right\vert _0^1
= x+\frac{1}{2}
\end{align*}

For $x \notin [0,1]$ the integral is 0, so

\begin{displaymath}f_X(x) = \begin{cases}x+\frac{1}{2} & 0 \le x \le 1
\\
0 & \text{otherwise}
\end{cases}\end{displaymath}
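
A quick sanity check (again a sketch assuming scipy is available): integrating the joint density over y at a chosen x reproduces $x+\frac{1}{2}$, and the marginal itself integrates to 1.

\begin{verbatim}
from scipy.integrate import quad

joint = lambda x, y: x + y if (0 <= x <= 1 and 0 <= y <= 1) else 0.0

# f_X(x) obtained by integrating out y; compare with x + 1/2 on [0,1].
f_X = lambda x: quad(lambda y: joint(x, y), 0, 1)[0]
print(f_X(0.3))                          # 0.8

# The marginal density should integrate to 1.
print(quad(f_X, 0, 1)[0])                # 1.0
\end{verbatim}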

Conditional Densities

If X and Y have joint density $f_{X,Y}(x,y)$ then we define the conditional density of Y given X=x by analogy with our interpretation of densities. We take limits:
\begin{multline*}f_{Y\vert X}(y\vert x)dy \\
\approx \frac{ P(x \le X \le x+dx, y \le Y \le y+dy)}{P(x
\le X \le x+dx)}
\end{multline*}
in the sense that if we divide through by dy and let dx and dy tend to 0, the conditional density is the limit

\begin{displaymath}\frac{\lim_{dx, dy \to 0} \frac{ P(x \le X \le
x+dx, y \le Y \le y+dy)}{dx\,dy}}{
\lim_{dx\to 0} \frac{P(x \le X \le x+dx)}{dx}}
\end{displaymath}

Going back to our interpretation of joint densities and ordinary densities, we see that our definition is just

\begin{displaymath}f_{Y\vert X}(y\vert x) = \frac{f_{X,Y}(x,y)}{f_X(x)}
\end{displaymath}

When talking about a pair X and Y of random variables we refer to $f_{X,Y}$ as the joint density and to $f_X$ as the marginal density of X.

Example: For the density f of the previous example, the conditional density of Y given X=x is defined only for $0 \le x \le 1$:

\begin{displaymath}f_{Y\vert X}(y\vert x) = \begin{cases}
\frac{x+y}{x+\frac{1}{2}} & 0 \le x \le 1, 0 \le y \le 1
\\
0 & 0 \le x \le 1, y > 1 \text{ or } y < 0
\\
\text{undefined} & \text{otherwise}
\end{cases}\end{displaymath}
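
As a check (not in the original notes), for each fixed x in [0,1] this conditional density integrates to 1 over y:

\begin{displaymath}\int_0^1 \frac{x+y}{x+\frac{1}{2}}\, dy
= \frac{x+\frac{1}{2}}{x+\frac{1}{2}} = 1
\end{displaymath}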

Example: X is a Poisson$(\lambda)$ random variable. Observe X, then toss a coin X times. Y is the number of heads; P(H) = p.
\begin{align*}f_Y(y) & = \sum_x f_{X,Y}(x,y)
\\
& = \sum_x f_{Y\vert X}(y\vert x) f_X(x)
\\
& = \sum_x \binom{x}{y} p^y(1-p)^{x-y} \times
\frac{\lambda^x}{x!} e^{-\lambda}
\end{align*}

WARNING: in the sum $0 \le y \le x$ is required and x, y are integers, so the sum really runs from x=y to $\infty$:
\begin{align*}f_Y(y) &= \frac{(p\lambda)^ye^{-\lambda}}{y!} \sum_{x=y}^\infty
\frac{[(1-p)\lambda]^{x-y}}{(x-y)!}
\\
& = \frac{(p\lambda)^ye^{-\lambda}}{y!}e^{(1-p)\lambda}
\\
& = e^{-p\lambda} (p\lambda)^y/y!
\end{align*}
which is a Poisson($p\lambda$) distribution.
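
This thinning result is easy to check by simulation. A minimal sketch (the values $\lambda = 4$ and p = 0.3 are arbitrary choices, not from the notes):

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)
lam, p, n = 4.0, 0.3, 100_000

X = rng.poisson(lam, size=n)        # observe X ~ Poisson(lambda)
Y = rng.binomial(X, p)              # toss a coin X times, count heads

# If Y ~ Poisson(p*lambda), its mean and variance both equal p*lambda = 1.2.
print(Y.mean(), Y.var())
\end{verbatim}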

Conditional Expectations

If X and Y are continuous random variables with joint density $f_{X,Y}$ we define:

\begin{displaymath}E(Y\vert X=x) = \int y f_{Y\vert X}(y\vert x) dy
\end{displaymath}
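
For the density of the running example this gives, for $0 \le x \le 1$ (a worked instance added here, not in the original notes):

\begin{align*}{\rm E}(Y\vert X=x) & = \int_0^1 y\, \frac{x+y}{x+\frac{1}{2}}\, dy
\\
& = \frac{\frac{x}{2}+\frac{1}{3}}{x+\frac{1}{2}}
= \frac{3x+2}{6x+3}
\end{align*}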

Key properties of conditional expectation

1: If $Y\ge 0$ then ${\rm E}(Y\vert X=x) \ge 0$. Equality holds iff $P(Y=0\vert X=x)=1$.

2: ${\rm E}(A(X)Y+B(X)Z\vert X=x) = A(x){\rm E}(Y\vert X=x) +B(x){\rm E}(Z\vert X=x)$.

3: If Y and X are independent (see the simulation sketch after this list) then

\begin{displaymath}{\rm E}(Y\vert X=x) = {\rm E}(Y)
\end{displaymath}

4: ${\rm E}(1\vert X=x) = 1$.
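
A small simulation sketch of property 3; the discrete choice of X (uniform on {0, 1, 2}) is arbitrary and only makes conditioning on X=x easy to do empirically:

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
X = rng.integers(0, 3, size=n)          # X uniform on {0, 1, 2}
Y = rng.exponential(2.0, size=n)        # Y independent of X, E(Y) = 2

# With independence, E(Y | X = x) should match E(Y) for every x.
for x in range(3):
    print(x, Y[X == x].mean())          # all close to 2.0
\end{verbatim}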


Richard Lockhart
2000-10-02