Reading: Ch 1 Sec 3, Ch 4 Sec 2

STAT 450: Statistical Theory

Independence, conditional distributions

So far the density of $X$ has been specified explicitly. Often, modelling leads instead to a specification in terms of marginal and conditional distributions.


Def'n: Events $A$ and $B$ are independent if
\[
P(AB) = P(A)P(B) \, .
\]
(Notation: $AB$ is the event that both $A$ and $B$ happen, also written $A\cap B$.)


Def'n: Events $A_i$, $i=1,\ldots,p$, are independent if
\[
P(A_{i_1} \cdots A_{i_r}) = \prod_{j=1}^r P(A_{i_j})
\]
for any $1 \le i_1 < \cdots < i_r \le p$.


Example: $p=3$. Independence requires
\begin{align*}
P(A_1A_2A_3) &= P(A_1)P(A_2)P(A_3) \\
P(A_1A_2) &= P(A_1)P(A_2) \\
P(A_1A_3) &= P(A_1)P(A_3) \\
P(A_2A_3) &= P(A_2)P(A_3) \, .
\end{align*}

All these equations are needed for independence!

Example: Toss a coin twice. Let
\begin{align*}
A_1 &= \{\text{first toss is a Head}\} \\
A_2 &= \{\text{second toss is a Head}\} \\
A_3 &= \{\text{first toss and second toss different}\} \, .
\end{align*}
Then $P(A_i) = 1/2$ for each $i$ and, for $i \neq j$,
\[
P(A_i \cap A_j) = \frac{1}{4} \, ,
\]
but
\[
P(A_1 \cap A_2 \cap A_3) = 0 \neq P(A_1)P(A_2)P(A_3) \, .
\]
So $A_1, A_2, A_3$ are pairwise independent but not independent.
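A quick enumeration over the four equally likely outcomes confirms these probabilities. This is a small sketch, not part of the original notes; the sets A1, A2, A3 below simply mirror the events $A_1, A_2, A_3$ above.

    from itertools import product

    # Sample space: the four equally likely outcomes of two fair coin tosses.
    outcomes = set(product("HT", repeat=2))

    def prob(event):
        """Probability of an event (a subset of outcomes) under the uniform measure."""
        return len(event & outcomes) / len(outcomes)

    A1 = {w for w in outcomes if w[0] == "H"}    # first toss is a Head
    A2 = {w for w in outcomes if w[1] == "H"}    # second toss is a Head
    A3 = {w for w in outcomes if w[0] != w[1]}   # the two tosses differ

    # Each pairwise intersection has probability 1/4 = (1/2)(1/2) ...
    print(prob(A1 & A2), prob(A1 & A3), prob(A2 & A3))   # 0.25 0.25 0.25
    # ... but the triple intersection is empty, so the events are not independent.
    print(prob(A1 & A2 & A3))                            # 0.0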

Def'n: $X$ and $Y$ are independent if
\[
P(X \in A, Y \in B) = P(X\in A)P(Y\in B)
\]
for all $A$ and $B$.

Def'n: Rvs $X_1,\ldots,X_p$ are independent if
\[
P(X_1 \in A_1, \ldots , X_p \in A_p ) = \prod_{i=1}^p P(X_i \in A_i)
\]
for any $A_1,\ldots,A_p$.

Theorem:

  1. If $X$ and $Y$ are independent then for all $x,y$,
     \[
     F_{X,Y}(x,y) = F_X(x)F_Y(y) \, .
     \]

  2. If $X$ and $Y$ are independent with joint density $f_{X,Y}(x,y)$ then $X$ and $Y$ have densities $f_X$ and $f_Y$, and
     \[
     f_{X,Y}(x,y) = f_X(x) f_Y(y) \, .
     \]

  3. If $X$ and $Y$ are independent with marginal densities $f_X$ and $f_Y$ then $(X,Y)$ has joint density
     \[
     f_{X,Y}(x,y) = f_X(x) f_Y(y) \, .
     \]

  4. If $F_{X,Y}(x,y) = F_X(x)F_Y(y)$ for all $x,y$ then $X$ and $Y$ are independent.

  5. If $(X,Y)$ has density $f(x,y)$ and there exist $g(x)$ and $h(y)$ such that $f(x,y) = g(x) h(y)$ for (almost) all $(x,y)$, then $X$ and $Y$ are independent with densities
     \[
     f_X(x) = g(x)\Big/\int_{-\infty}^\infty g(u)\, du , \qquad
     f_Y(y) = h(y)\Big/\int_{-\infty}^\infty h(u)\, du \, ;
     \]
     see the worked example after the proof.

Proof: See STAT 802
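As a worked illustration of item 5 (an example not in the original notes): suppose $(X,Y)$ has joint density
\[
f(x,y) = 2 e^{-x-2y} , \qquad x > 0, \; y > 0 .
\]
Take $g(x) = e^{-x}$ for $x>0$ and $h(y) = 2e^{-2y}$ for $y>0$; each integrates to $1$, so $X$ and $Y$ are independent with densities $f_X(x) = e^{-x}$ and $f_Y(y) = 2e^{-2y}$, i.e. $X$ and $Y$ are Exponential with rates $1$ and $2$ respectively.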

Theorem: If $X_1,\ldots,X_p$ are independent and $Y_i = g_i(X_i)$ then $Y_1,\ldots,Y_p$ are independent. Moreover, for any $1 \le q < p$, the vectors $(X_1,\ldots,X_q)$ and $(X_{q+1},\ldots,X_p)$ are independent.
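For example (an illustration not in the original notes): if $X_1, X_2, X_3$ are independent then so are $X_1^2$, $e^{X_2}$ and $2X_3+1$, and the pair $(X_1,X_2)$ is independent of $X_3$.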

Conditional probability

Def'n: $P(A\vert B) = P(AB)/P(B)$ if $P(B) \neq 0$.
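For example (not in the original notes): roll a fair die and let $A=\{\text{get a 6}\}$, $B=\{\text{get an even number}\}$. Then $P(A\vert B) = P(AB)/P(B) = (1/6)/(1/2) = 1/3$.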

Def'n: For discrete $X$ and $Y$ the conditional probability mass function of $Y$ given $X$ is
\begin{align*}
f_{Y\vert X}(y\vert x) &= P(Y=y\vert X=x) \\
 &= f_{X,Y}(x,y)/f_X(x) \\
 &= f_{X,Y}(x,y)\Big/\sum_t f_{X,Y}(x,t) \, .
\end{align*}
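For a joint pmf given as a table, the conditional pmf of $Y$ given $X=x$ is just the row for $x$ renormalized by its row sum. A small numerical sketch (not from the original notes; the dictionary joint and the helper conditional_pmf_y_given_x are purely illustrative):

    # Hypothetical joint pmf f_{X,Y}(x,y) on {0,1} x {0,1,2}, chosen for illustration.
    joint = {
        (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
        (1, 0): 0.15, (1, 1): 0.15, (1, 2): 0.30,
    }

    def conditional_pmf_y_given_x(x):
        """f_{Y|X}(y|x) = f_{X,Y}(x,y) / sum_t f_{X,Y}(x,t)."""
        f_x = sum(p for (u, _), p in joint.items() if u == x)   # marginal f_X(x)
        return {y: p / f_x for (u, y), p in joint.items() if u == x}

    print(conditional_pmf_y_given_x(0))   # {0: 0.25, 1: 0.5, 2: 0.25}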

For absolutely continuous $X$, $P(X=x) = 0$ for all $x$. What then is $P(A\vert X=x)$ or $f_{Y\vert X}(y\vert x)$? Solution: use a limit:
\[
P(A\vert X=x) = \lim_{\delta x \to 0} P(A\vert x \le X \le x+\delta x) \, .
\]

If, e.g., $X,Y$ have joint density $f_{X,Y}$ then with $A=\{ Y \le y\}$ we have
\begin{align*}
P(A\vert x \le X \le x+\delta x)
 &= \frac{P(A \cap \{ x \le X \le x+\delta x\} ) }{P(x \le X \le x+\delta x)} \\
 &= \frac{ \int_{-\infty}^y \int_x^{x+\delta x} f_{X,Y}(u,v)\,du\,dv }{ \int_x^{x+\delta x} f_X(u)\, du } \, .
\end{align*}

Divide top and bottom by $\delta x$ and let $\delta x \to 0$. The denominator converges to $f_X(x)$; the numerator converges to
\[
\int_{-\infty}^y f_{X,Y}(x,v)\, dv \, .
\]

Define the conditional cdf of $Y$ given $X=x$ by
\[
P(Y \le y \vert X=x) = \frac{\int_{-\infty}^y f_{X,Y}(x,v)\, dv}{f_X(x)} \, .
\]

Differentiate with respect to $y$ to get the definition of the conditional density of $Y$ given $X=x$:
\[
f_{Y\vert X}(y\vert x) = f_{X,Y}(x,y)/f_X(x) \, ;
\]
in words, ``conditional = joint/marginal''.
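A quick worked example (not in the original notes): suppose $(X,Y)$ has joint density $f_{X,Y}(x,y) = x + y$ for $0<x<1$, $0<y<1$. Then
\[
f_X(x) = \int_0^1 (x+v)\, dv = x + \tfrac12 , \qquad
f_{Y\vert X}(y\vert x) = \frac{x+y}{x+\tfrac12} , \quad 0<y<1 ,
\]
which for each fixed $x$ integrates to $1$ in $y$, as a conditional density must.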




Richard Lockhart
2002-09-09