

STAT 870 Lecture 10

Initial Distributions

Meaning of unconditional expected values?

Markov property specifies only conditional probabilities; no way to deduce marginal distributions.

For every distribution $\pi$ on S and transition matrix ${\bf P}$ there is a stochastic process $X_0,X_1,\ldots$ with

\begin{displaymath}P(X_0=k) = \pi_k
\end{displaymath}

and which is a Markov Chain with transition matrix ${\bf P}$; the law of this process is the corresponding probability measure.

Notice Strong Markov Property proof used only conditional expectations.

Notation: $\pi$ is a probability on S. ${\rm E}^\pi$ and $P^\pi$ are expected values and probabilities for the chain with initial distribution $\pi$.

Summary of easily verified facts:

$\bullet$ For any sequence of states $i_0,\ldots,i_k$

\begin{displaymath}P(X_0=i_0,\ldots,X_k=i_k) = \pi_{i_0}{\bf P}_{i_0i_1}\cdots {\bf P}_{i_{k-1}i_k}
\end{displaymath}

$\bullet$ For any event A:

\begin{displaymath}P^\pi(A) = \sum_k \pi_k P^k(A)
\end{displaymath}

$\bullet$ For any bounded rv $Y=f(X_0,\ldots)$

\begin{displaymath}{\rm E}^\pi(Y) = \sum_k \pi_k {\rm E}^k(Y)
\end{displaymath}
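These facts are easy to check numerically. A sketch in Python; the 3-state transition matrix and the initial distribution are illustrative choices of mine, not from the lecture:

```python
import numpy as np

# Illustrative 3-state chain and initial distribution (any stochastic P works).
P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])
pi = np.array([0.2, 0.3, 0.5])   # initial distribution on S = {0, 1, 2}

def path_prob(pi, P, path):
    """P(X_0=i_0, ..., X_k=i_k) = pi_{i_0} P_{i_0 i_1} ... P_{i_{k-1} i_k}."""
    p = pi[path[0]]
    for a, b in zip(path, path[1:]):
        p *= P[a, b]
    return p

# P^pi(X_2 = j) = sum_k pi_k P^k(X_2 = j) = (pi P^2)_j
marginal = pi @ np.linalg.matrix_power(P, 2)

print(path_prob(pi, P, [0, 1, 2]))  # pi_0 * P_01 * P_12 = 0.2 * 0.5 * 0.5 ~ 0.05
print(marginal, marginal.sum())     # a probability vector: entries sum to 1
```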

Recurrence and Transience

Now consider a transient state k, that is, a state for which

\begin{displaymath}f_k = P^k(T_k < \infty) < 1
\end{displaymath}

Note that $T_k= \min\{n>0:X_n=k\}$ is a stopping time. Let $N_k$ be the number of visits to state k. That is

\begin{displaymath}N_k = \sum_{n=0}^\infty 1(X_n=k)
\end{displaymath}

Notice that if we define the function

\begin{displaymath}f(x_0,x_1,\ldots) = \sum_{n=0}^\infty 1(x_n=k)
\end{displaymath}

then

\begin{displaymath}N_k = f(X_0,X_1,\ldots)
\end{displaymath}

Notice, also, that on the event $T_k < \infty$

\begin{displaymath}N_k = 1+f(X_{T_k},X_{T_k+1},\ldots)
\end{displaymath}

and on the event $T_k=\infty$ we have $N_k=1$.

In short:

\begin{displaymath}N_k =1 +f(X_{T_k},X_{T_k+1},\ldots)1(T_k < \infty)
\end{displaymath}

Hence
\begin{align*}P^k( N_k=r)
& = {\rm E}^k\left\{ P ( N_k=r\,\vert\,{\cal F}_{T_k})\right\}
\\
& = {\rm E}^k\left\{1(T_k<\infty)P^k(N_k=r-1)\right\}
\\
& = {\rm E}^k\left\{1(T_k<\infty)\right\}P^k(N_k=r-1)
\\
& = f_k P^k(N_k=r-1)
\end{align*}

It is easily verified by induction, then, that

\begin{displaymath}P^k( N_k=r) = f_k^{r-1}P^k(N_k=1)
\end{displaymath}

But $N_k=1$ if and only if $T_k=\infty$ so

\begin{displaymath}P^k( N_k=r) = f_k^{r-1}(1-f_k)
\end{displaymath}

so, when the chain starts from k, $N_k$ has a Geometric distribution with mean $1/(1-f_k)$. The same argument shows that if $f_k=1$ then

\begin{displaymath}P^k(N_k=1)=P^k(N_k=2) = \cdots
\end{displaymath}

which can only happen if all these probabilities are 0. Thus if $f_k=1$

\begin{displaymath}P^k(N_k=\infty) = 1
\end{displaymath}
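As a sanity check on the geometric law, here is a small Monte Carlo sketch in Python. The two-state chain is an illustrative example of mine: from state 0 stay at 0 with probability 1/2, otherwise jump to the absorbing state 1, so $f_0 = 1/2$ and $N_0$ should have mean $1/(1-f_0)=2$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative chain on {0, 1}: from 0 stay w.p. 1/2, else jump to the
# absorbing state 1, which the chain never leaves.  So f_0 = 1/2.
def visits_to_zero():
    n0 = 0
    x = 0
    while x == 0:            # once the chain hits 1 it never returns to 0
        n0 += 1
        x = rng.integers(2)  # next state: 0 or 1, each w.p. 1/2
    return n0

samples = [visits_to_zero() for _ in range(100_000)]
# N_0 should satisfy P(N_0 = r) = (1/2)^{r-1} (1/2), with mean 2.
print(np.mean(samples))                  # close to 2
print(np.mean(np.array(samples) == 1))   # close to 1/2
```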

Since

\begin{displaymath}N_k = \sum_{n=0}^\infty 1(X_n=k)
\end{displaymath}


\begin{displaymath}{\rm E}^k(N_k) = \sum_{n=0}^\infty ({\bf P}^n)_{kk}
\end{displaymath}

So: State k is transient if and only if

\begin{displaymath}\sum_{n=0}^\infty ({\bf P}^n)_{kk} < \infty
\end{displaymath}

and this sum is $1/(1-f_k)$.
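The criterion can be checked numerically for a small example. A sketch in Python; the two-state chain with an absorbing state is my own illustration, with $f_0 = 1/2$ so the sum should be $1/(1-f_0)=2$:

```python
import numpy as np

# Illustrative transient chain: from state 0, stay w.p. 1/2 or jump to the
# absorbing state 1.  Here f_0 = P^0(T_0 < infinity) = 1/2.
P = np.array([[0.5, 0.5],
              [0.0, 1.0]])

# sum_n (P^n)_{00} = sum_n (1/2)^n converges, so state 0 is transient;
# the limit should be 1/(1 - f_0) = 2.
total = sum(np.linalg.matrix_power(P, n)[0, 0] for n in range(200))
print(total)  # ~ 2.0
```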

Proposition 1   Recurrence (or transience) is a class property. That is, if i and j are in the same communicating class then i is recurrent (respectively transient) if and only if j is recurrent (respectively transient).

Proof: Suppose i is recurrent and $i{\leftrightarrow}j$. There are integers m and n such that

\begin{displaymath}({\bf P}^m)_{ji} > 0 \quad \text{ and } \quad ({\bf P}^n)_{ij} > 0
\end{displaymath}

Then
\begin{align*}\sum_k ({\bf P}^k)_{jj} &\ge \sum_{k\ge 0} ({\bf P}^{m+k+n})_{jj}
\\
& \ge ({\bf P}^m)_{ji}\left\{\sum_{k\ge 0}({\bf P}^k)_{ii}\right\}({\bf P}^n)_{ij}
\end{align*}
The middle term is infinite and the two outside terms positive so

\begin{displaymath}\sum_k ({\bf P}^k)_{jj}=\infty
\end{displaymath}

which shows j is recurrent.

A finite state space chain has at least one recurrent state:

If all states were transient we would have, for each k, $P(N_k < \infty) = 1$. This would mean $P(\forall k\; N_k <\infty)=1$. But for any $\omega$ there must be at least one k for which $N_k=\infty$: the chain is in some state at each of infinitely many times, and the total of a finite list of finite numbers is finite.

Infinite state space chain may have all states transient:

The chain $X_n$ satisfying $X_{n+1}=X_n+1$ on the integers has all states transient.

More interesting example:

$\bullet$ Toss a coin repeatedly.

$\bullet$ Let $X_n$ be $X_0$ plus the number of heads minus the number of tails in the first n tosses.

$\bullet$ Let p denote the probability of heads on an individual trial.

$X_n-X_0$ is a sum of n iid random variables $Y_i$ where $P(Y_i=1)=p$ and $P(Y_i=-1)=1-p$.

The SLLN shows $X_n/n$ converges almost surely to $2p-1$. If $p\neq 1/2$ this is not 0.

For $p>1/2$ the limit $2p-1$ is positive, so we must have $X_n \to \infty$ almost surely and every state is visited only finitely many times. That is, all states are transient. Similarly, for $p<1/2$, $X_n \to -\infty$ almost surely and again all states are transient.

Now look at p=1/2. The law of large numbers argument no longer shows anything. I will show that all states are recurrent.

Proof: We evaluate $\sum_n ({\bf P}^n)_{00}$ and show the sum is infinite. If n is odd then $({\bf P}^n)_{00} = 0$, so we consider

\begin{displaymath}\sum_m({\bf P}^{2m})_{00}
\end{displaymath}

Now

\begin{displaymath}({\bf P}^{2m})_{00} = \dbinom{2m}{m} 2^{-2m}
\end{displaymath}

According to Stirling's approximation

\begin{displaymath}\lim_{m\to \infty} \frac{m!}{m^{m+1/2} e^{-m}\sqrt{2\pi}} = 1
\end{displaymath}

Hence

\begin{displaymath}\lim_{m\to \infty} \sqrt{m} ({\bf P}^{2m})_{00} = \frac{1}{\sqrt{\pi}}
\end{displaymath}

Since

\begin{displaymath}\sum \frac{1}{\sqrt{m}} = \infty
\end{displaymath}

we are done.
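The Stirling asymptotics are easy to verify numerically. A sketch in Python; the ratio $(2m+1)/(2m+2)$ between successive diagonal entries follows from the binomial formula above, and the helper name `p_diag` is mine:

```python
import math

def p_diag(M):
    """Return [(P^{2m})_{00} for m = 1..M] for the simple random walk.

    Uses (P^{2m})_{00} = binom(2m, m) 2^{-2m}; successive terms satisfy
    (P^{2m+2})_{00} = (2m+1)/(2m+2) * (P^{2m})_{00}.
    """
    vals = []
    p = 0.5                      # m = 1: binom(2, 1) / 4 = 1/2
    for m in range(1, M + 1):
        vals.append(p)
        p *= (2 * m + 1) / (2 * m + 2)
    return vals

vals = p_diag(10_000)
# sqrt(m) (P^{2m})_{00} -> 1/sqrt(pi) ~ 0.56419
print(math.sqrt(10_000) * vals[-1])
# the partial sums grow like 2 sqrt(M/pi), hence diverge: state 0 is recurrent
print(sum(vals))
```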

Mean return times

Compute expected times to return. For $x\in S$ let $T_x$ denote the hitting time of x.

Suppose x is recurrent and the chain is irreducible (only one communicating class).

We derive equations for the expected values of the various $T_x$.

Each $T_x$ is a certain function $f_x$ applied to $X_1,\ldots$. Setting $\mu_{ij}= {\rm E}^i(T_j)$ we find

\begin{displaymath}\mu_{ij} = \sum_{k} {\rm E}^i(T_j1(X_1=k))
\end{displaymath}

Note that if $X_1=j$ then $T_j=1$ so

\begin{displaymath}{\rm E}^i(T_j1(X_1=j)) = {\bf P}_{ij}
\end{displaymath}

For $k \neq j$

\begin{displaymath}T_j=1+f_j(X_2,X_3,\ldots)
\end{displaymath}

and, by conditioning on $X_1=k$, we find

\begin{displaymath}{\rm E}^i(T_j1(X_1=k)) = {\bf P}_{ik}\left\{1+{\rm E}^k(T_j)\right\}
\end{displaymath}

This gives

 \begin{displaymath}
\mu_{ij} = 1+\sum_{k \neq j} {\bf P}_{ik}\mu_{kj}
\end{displaymath} (1)

Technically, I should check that the expectations in (1) are finite. All the random variables involved are non-negative, however, and the equation actually makes sense even if some terms are infinite. (To prove this you actually study

\begin{displaymath}T_{x,n} = \min(T_x,n)
\end{displaymath}

deriving an identity for a fixed n, letting $n \to \infty$ and applying the monotone convergence theorem.)

Here is a simple example:

\begin{displaymath}{\bf P}= \left[\begin{array}{ccc}
0 & \frac{1}{2} & \frac{1}{2}
\\
\frac{1}{2} & 0 & \frac{1}{2}
\\
\frac{1}{2} & \frac{1}{2} & 0
\end{array}\right]
\end{displaymath}

The identity (1) becomes
\begin{align*}\mu_{1,1} & = 1+ \frac{1}{2} \mu_{2,1} + \frac{1}{2} \mu_{3,1}
\\
\mu_{2,1} & = 1 + \frac{1}{2} \mu_{3,1}
\\
\mu_{3,1} & = 1 + \frac{1}{2} \mu_{2,1}
\\
\mu_{1,2} & = 1 + \frac{1}{2} \mu_{3,2}
\\
\mu_{2,2} & = 1 + \frac{1}{2} \mu_{1,2} + \frac{1}{2} \mu_{3,2}
\\
\mu_{3,2} & = 1 + \frac{1}{2} \mu_{1,2}
\\
\mu_{1,3} & = 1 + \frac{1}{2} \mu_{2,3}
\\
\mu_{2,3} & = 1 + \frac{1}{2} \mu_{1,3}
\\
\mu_{3,3} & = 1 + \frac{1}{2} \mu_{1,3} + \frac{1}{2} \mu_{2,3}
\end{align*}

The equations for $\mu_{2,1}$ and $\mu_{3,1}$ show $\mu_{2,1}=\mu_{3,1}$. Similar calculations lead to $\mu_{ii}=3$ and, for $i\neq j$, $\mu_{i,j} = 2$.
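The system (1) can also be solved by linear algebra: for each target state j, zero out column j of ${\bf P}$ (call the result $\tilde{\bf P}$, dropping the $k=j$ terms) and solve $(I-\tilde{\bf P})\mu_{\cdot,j} = {\bf 1}$. A sketch in Python for the matrix above:

```python
import numpy as np

P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])
n = P.shape[0]
mu = np.zeros((n, n))
for j in range(n):
    Pt = P.copy()
    Pt[:, j] = 0.0   # drop the k = j terms in identity (1)
    # mu_{i,j} = 1 + sum_{k != j} P_{ik} mu_{k,j}  <=>  (I - Pt) mu_{.,j} = 1
    mu[:, j] = np.linalg.solve(np.eye(n) - Pt, np.ones(n))
print(mu)  # diagonal entries 3, off-diagonal entries 2
```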

Example: The coin tossing Markov Chain with p=1/2 shows that the situation can be different when S is infinite. The equations above become:
\begin{align*}m_{0,0} & = 1 + \frac{1}{2}m_{1,0}+\frac{1}{2} m_{-1,0}
\\
m_{1,0} & = 1 + \frac{1}{2} m_{2,0}
\end{align*}
and many more.

Some observations:

You have to go through 1 to get to 0 from 2 so

$m_{2,0}=m_{2,1}+m_{1,0}$

Symmetry (switching H and T):

$m_{1,0}=m_{-1,0}$

The transition probabilities are homogeneous:

$m_{2,1}=m_{1,0}$

Conclusion:


\begin{align*}m_{0,0} & = 1 + m_{1,0}
\\
& = 1+ 1 + \frac{1}{2} m_{2,0}
\\
& = 2 + \frac{1}{2}\left(m_{2,1}+m_{1,0}\right)
\\
& = 2 + m_{1,0}
\end{align*}
Comparing the first and last lines gives $1+m_{1,0} = 2+m_{1,0}$: there are no finite solutions!
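One way to see this concretely is via the truncation idea mentioned earlier: restrict the walk to $\{0,1,\ldots,N\}$ with reflection at N (an illustrative finite approximation of mine, not from the lecture), compute ${\rm E}^1(T_0)$ exactly by solving the finite linear system, and watch it diverge as N grows. A sketch in Python; the function name is hypothetical:

```python
import numpy as np

def expected_hit_zero_from_one(N):
    """E^1(T_0) for simple random walk on {0, ..., N}, reflecting at N.

    Unknowns are E_1, ..., E_N with E_0 = 0:
      E_i = 1 + (E_{i-1} + E_{i+1}) / 2   for 1 <= i <= N-1,
      E_N = 1 + E_{N-1}                   (reflection at N).
    """
    A = np.zeros((N, N))
    b = np.ones(N)
    for i in range(1, N):            # interior states, row index i-1
        A[i - 1, i - 1] = 1.0
        if i >= 2:
            A[i - 1, i - 2] = -0.5   # coefficient of E_{i-1} (absent if i=1)
        A[i - 1, i] = -0.5           # coefficient of E_{i+1}
    A[N - 1, N - 1] = 1.0            # reflecting boundary: E_N - E_{N-1} = 1
    A[N - 1, N - 2] = -1.0
    return np.linalg.solve(A, b)[0]

# The exact answer is 2N - 1, which grows without bound: no finite m_{1,0}.
for N in (5, 50, 500):
    print(N, expected_hit_zero_from_one(N))
```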

Summary of the situation:

Every state is recurrent.

All the expected hitting times $m_{ij}$ are infinite.

All entries $({\bf P}^n)_{ij}$ converge to 0 as $n\to\infty$.

Jargon: The states in this chain are null recurrent.





Richard Lockhart
2000-10-17