
STAT 380 Lecture 7

Reading for Today's Lecture: Chapter 4 sections 1-3.

Goals of Today's Lecture:


Summary of Probability Review

We have reviewed the following definitions:

Tactics:

Tactics for expected values:

Markov Chains

The last names example has the following structure: if, at generation $n$, there are $m$ individuals, then the number of sons in the next generation is distributed as the sum of $m$ independent copies of the random variable $X$, the number of sons of a single individual. This distribution does not depend on $n$, only on the value $m$ of $Z_n$. We call $Z_n$ a Markov Chain.
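The one-step mechanism just described can be sketched in a few lines of Python. This is not part of the notes; the offspring distribution used at the bottom (0, 1 or 2 sons, equally likely) is an arbitrary illustration, not the Poisson law discussed below:

```python
import random

def next_generation(z_n, offspring_sampler):
    """One step of the branching process: each of the z_n current
    individuals contributes an independent draw from the offspring law."""
    return sum(offspring_sampler() for _ in range(z_n))

def simulate(z0, offspring_sampler, generations):
    """Return the path Z_0, Z_1, ..., Z_generations."""
    path = [z0]
    for _ in range(generations):
        path.append(next_generation(path[-1], offspring_sampler))
    return path

# Hypothetical offspring law: 0, 1 or 2 sons with equal probability.
rng = random.Random(1)
path = simulate(3, lambda: rng.randrange(3), generations=10)
```

Note that once the population hits 0 it stays at 0: the sum over an empty generation is empty, so state 0 is absorbing, exactly as in the transition matrix below.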

Ingredients of a Markov Chain:

The stochastic process $X_0,X_1,\ldots$ is called a Markov chain if

\begin{displaymath}P\left(X_{k+1} =i_{k+1}\vert X_k=i_k,\ldots,X_0=i_0\right)
=
{\bf P}_{i_k,i_{k+1}}
\end{displaymath}

for all $i_0,i_1,\ldots,i_{k+1}$ and all $k$.

The matrix ${\bf P}$ is called a transition matrix.
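Each row of a transition matrix is the conditional distribution of the next state given the current one, so the entries must be nonnegative and each row must sum to 1. A quick Python check of this property (a hypothetical helper, not part of the notes; the 2-state matrix below is an arbitrary example):

```python
def is_stochastic(P, tol=1e-12):
    """True if every entry is nonnegative and every row (a conditional
    distribution over the next state) sums to 1 within tolerance."""
    return all(
        all(p >= 0 for p in row) and abs(sum(row) - 1.0) < tol
        for row in P
    )

# Hypothetical 2-state transition matrix.
P = [[0.6, 0.4],
     [0.2, 0.8]]
```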

Example: If $X$ in the last names example has a Poisson($\lambda$) distribution then, given $Z_n=k$, $Z_{n+1}$ is the sum of $k$ independent Poisson($\lambda$) random variables, which has a Poisson($k\lambda$) distribution. So

\begin{displaymath}{\bf P}= \left[\begin{array}{llll}
1 & 0 & 0 & \cdots
\\
e^{-\lambda} & \lambda e^{-\lambda} & \frac{\lambda^2 e^{-\lambda}}{2} & \cdots
\\
e^{-2\lambda} & 2\lambda e^{-2\lambda} & \frac{(2\lambda)^2 e^{-2\lambda}}{2} & \cdots
\\
\vdots &
\vdots &
\vdots &
\ddots
\end{array}\right]
\end{displaymath}
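The entry $P_{k,j}$ is the Poisson($k\lambda$) probability of $j$, with row 0 absorbing. A short Python sketch of these entries (illustrative code, not from the notes; the value of $\lambda$ in the test is chosen arbitrarily):

```python
from math import exp, factorial

def poisson_pmf(j, mu):
    """P(Y = j) for Y ~ Poisson(mu)."""
    return exp(-mu) * mu**j / factorial(j)

def transition(k, j, lam):
    """P_{k,j} = P(Z_{n+1} = j | Z_n = k).

    Row 0 is absorbing (once the name dies out it stays dead);
    for k >= 1 the next generation is Poisson(k * lam)."""
    if k == 0:
        return 1.0 if j == 0 else 0.0
    return poisson_pmf(j, k * lam)
```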

Example: Weather: each day is dry (D) or wet (W).

$X_n$ is the weather on day $n$.

Suppose dry days tend to be followed by dry days, say 3 times in 5, and wet days by wet days 4 times in 5.

Markov assumption: given today's weather, yesterday's weather is irrelevant to predicting tomorrow's.

Transition Matrix:

\begin{displaymath}{\bf P}= \left[\begin{array}{cc} \frac{3}{5} & \frac{2}{5}
\\
\frac{1}{5} & \frac{4}{5}
\end{array} \right]
\end{displaymath}
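The weather chain is easy to simulate from this matrix; here is a minimal Python sketch (not part of the notes), in which tomorrow's weather is drawn from the row of ${\bf P}$ indexed by today's:

```python
import random

STATES = ["D", "W"]
P = {"D": [3/5, 2/5],   # P(D -> D), P(D -> W)
     "W": [1/5, 4/5]}   # P(W -> D), P(W -> W)

def simulate_weather(x0, n, rng=None):
    """Simulate X_0, ..., X_n: tomorrow's state depends only on
    today's, through the transition-matrix row P[today]."""
    rng = rng or random.Random(0)
    path = [x0]
    for _ in range(n):
        path.append(rng.choices(STATES, weights=P[path[-1]])[0])
    return path
```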

Now suppose it is wet today. What is the probability of wet weather in 2 days?
\begin{align*}P(X_2=W\vert X_0=W) & = P(X_2=W,X_1=D \vert X_0=W)
\\
& \qquad + P(X_2=W,X_1=W \vert X_0=W)
\\
& = P_{W,D} P_{D,W} + P_{W,W} P_{W,W}
\\
& = \frac{1}{5}\cdot\frac{2}{5} + \frac{4}{5}\cdot\frac{4}{5} = \frac{18}{25}
\end{align*}
Notice that every factor in the last line is an entry of ${\bf P}$.

Look at the matrix product ${\bf P}{\bf P}$:

\begin{displaymath}\left[\begin{array}{ll} \frac{3}{5} & \frac{2}{5}
\\
\frac{1}{5} & \frac{4}{5}
\end{array} \right]
\left[\begin{array}{ll} \frac{3}{5} & \frac{2}{5}
\\
\frac{1}{5} & \frac{4}{5}
\end{array} \right]
=
\left[\begin{array}{ll} \frac{11}{25} & \frac{14}{25}
\\
\frac{7}{25} & \frac{18}{25}
\end{array} \right]
\end{displaymath}

Notice that the probability $P(X_2=W\vert X_0=W)$ is exactly the formula for the $W,W$ entry of ${\bf P}{\bf P}$.
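This observation, that two-step probabilities are entries of ${\bf P}{\bf P}$, can be checked numerically. The following Python sketch (illustrative, not part of the original notes) multiplies ${\bf P}$ by itself in plain Python and compares the $W,W$ entry with the direct sum over the intermediate state:

```python
def matmul(A, B):
    """Plain n x n matrix product, no libraries."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P = [[3/5, 2/5],
     [1/5, 4/5]]   # rows/columns ordered D, W
P2 = matmul(P, P)

# Two-step probability by summing over the intermediate state X_1:
two_step_WW = P[1][0] * P[0][1] + P[1][1] * P[1][1]
```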





Richard Lockhart
2000-10-02