
STAT 870 Lecture 12

Hitting Times

Start an irreducible recurrent chain $X_0, X_1, \ldots$ in state $i$. Let $T_j$ be the first $n > 0$ such that $X_n = j$. Define

$$ m_{ij} = E(T_j \mid X_0 = i). $$

First step analysis:

$$ m_{ij} = P_{ij} \cdot 1 + \sum_{k \ne j} P_{ik}(1 + m_{kj}) = 1 + \sum_{k \ne j} P_{ik} m_{kj}. $$

Example: consider, say, the transition matrix

$$ P = \begin{pmatrix} 1/3 & 1/3 & 1/3 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix}. $$

The equations, for the hitting times of state 1, are

$$ \begin{aligned} m_{11} &= 1 + \tfrac{1}{3} m_{21} + \tfrac{1}{3} m_{31} \\ m_{21} &= 1 \\ m_{31} &= 1 + m_{21}. \end{aligned} $$

The second and third equations give immediately

$$ m_{21} = 1, \qquad m_{31} = 2. $$

Then plug in to the others to get

$$ m_{11} = 1 + \tfrac{1}{3} \cdot 1 + \tfrac{1}{3} \cdot 2 = 2. $$

Notice the stationary initial distribution is

$$ \pi = (1/2,\ 1/3,\ 1/6), $$

so $m_{11} = 2 = 1/\pi_1$.
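The first-step equations can also be solved numerically. A minimal sketch, assuming numpy and an illustrative 3-state transition matrix (my own choice; any irreducible chain would do):

```python
import numpy as np

# Illustrative 3-state transition matrix (an assumption for this sketch).
P = np.array([[1/3, 1/3, 1/3],
              [1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

def mean_hitting_times(P, j):
    """Solve m_{ij} = 1 + sum_{k != j} P_{ik} m_{kj} for every starting state i."""
    n = P.shape[0]
    Q = P.copy()
    Q[:, j] = 0.0                          # transitions into j end the clock
    return np.linalg.solve(np.eye(n) - Q, np.ones(n))

m = mean_hitting_times(P, 0)               # expected times to reach state 0; m[0] is the mean return time

# Stationary distribution: left eigenvector of P for eigenvalue 1, normalized to sum 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi = pi / pi.sum()

print(m, pi, pi[0] * m[0])                 # pi_i * m_{ii} should equal 1
```

The key observation is that zeroing column $j$ of $P$ turns the first-step system into the linear system $(I - Q)m = \mathbf{1}$, which one call to `np.linalg.solve` handles.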

Consider the fraction of time spent in state $j$:

$$ \frac{1}{n} \sum_{k=1}^{n} 1(X_k = j). $$

Imagine the chain starts in state $i$; take expected value:

$$ E\left[\frac{1}{n} \sum_{k=1}^{n} 1(X_k = j)\right] = \frac{1}{n} \sum_{k=1}^{n} P^k_{ij}. $$

If the rows of $P^n$ converge to $\pi$ then this fraction converges to $\pi_j$; i.e., the limiting fraction of time in state $j$ is $\pi_j$.

Heuristic: start the chain in $i$. Expect to return to $i$ every $m_{ii}$ time units, so the chain is in state $i$ about once every $m_{ii}$ time units; i.e., the limiting fraction of time in state $i$ is $1/m_{ii}$.

Conclusion: for an irreducible recurrent finite state space Markov chain

$$ \pi_i = \frac{1}{m_{ii}}. $$

Real proof: Renewal theorem or variant.

Idea: let $T_1 < T_2 < \cdots$ be the times of visits to $i$. Segment $k$ is

$$ (X_{T_{k-1}+1}, \ldots, X_{T_k}). $$

Segments are iid by the Strong Markov property.

The number of visits to $i$ by time $T_k$ is exactly $k$.

The total elapsed time is $T_k = W_1 + \cdots + W_k$, where the $W_j = T_j - T_{j-1}$ are iid.

The fraction of time in state $i$ by time $T_k$ is

$$ \frac{k}{T_k} = \frac{k}{W_1 + \cdots + W_k} \to \frac{1}{E(W_1)} = \frac{1}{m_{ii}} $$

by the SLLN. So if the fraction converges to $\pi_i$ we must have

$$ \pi_i = \frac{1}{m_{ii}}. $$
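This renewal argument is easy to check by simulation. A sketch, assuming numpy and an illustrative 3-state chain of my own choosing whose stationary distribution is $\pi = (1/2, 1/3, 1/6)$, i.e. whose mean return times are $(2, 3, 6)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 3-state chain (an assumption for this sketch) with
# stationary distribution pi = (1/2, 1/3, 1/6), so m_11 = 2, m_22 = 3, m_33 = 6.
P = np.array([[1/3, 1/3, 1/3],
              [1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

n_steps = 100_000
x = 0
visits = np.zeros(3)
for _ in range(n_steps):
    x = rng.choice(3, p=P[x])      # one transition of the chain
    visits[x] += 1

frac = visits / n_steps
print(frac)                        # long-run fractions; close to (1/2, 1/3, 1/6) = (1/m_11, 1/m_22, 1/m_33)
```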

Summary of Theoretical Results:

For an irreducible aperiodic positive recurrent Markov Chain:

  1. $P^n$ converges to a stochastic matrix $\Pi$.
  2. Each row of $\Pi$ is $\pi$, the unique stationary initial distribution.
  3. The stationary initial distribution is given by

     $$ \pi_i = \frac{1}{m_{ii}}, $$

     where $m_{ii}$ is the mean return time to state $i$ from state $i$.

If the state space is finite, an irreducible chain is automatically positive recurrent.
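Result 1 in the summary is easy to see numerically: high powers of $P$ have (nearly) identical rows, each equal to $\pi$. A sketch assuming numpy, with an illustrative irreducible aperiodic matrix of my choosing:

```python
import numpy as np

# Illustrative irreducible aperiodic chain; by detailed balance its
# stationary distribution is pi = (1/4, 1/2, 1/4).
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])

Pn = np.linalg.matrix_power(P, 100)
print(Pn)          # every row is (essentially) (0.25, 0.5, 0.25)
```

For this matrix the second-largest eigenvalue is $1/2$, so the rows agree with $\pi$ to within about $2^{-100}$ after 100 steps.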

Ergodic Theorem

Notice the sleight of hand: I showed

$$ E\left[\frac{1}{n} \sum_{k=1}^{n} 1(X_k = i)\right] \to \pi_i $$

but claimed

$$ \frac{1}{n} \sum_{k=1}^{n} 1(X_k = i) \to \pi_i $$

almost surely, which is also true. This is a step in the proof of the ergodic theorem: for an irreducible positive recurrent Markov chain and any $f$ on $S$ such that $\sum_i |f(i)| \pi_i < \infty$,

$$ \frac{1}{n} \sum_{k=1}^{n} f(X_k) \to \sum_{i \in S} f(i) \pi_i $$

almost surely. The limit works in other senses, too. You also get

$$ \frac{1}{n} \sum_{k=1}^{n} g(X_{k-1}, X_k) \to \sum_{i,j} \pi_i P_{ij} \, g(i,j) $$

almost surely. E.g. the fraction of transitions from $i$ to $j$ goes to

$$ \pi_i P_{ij}. $$
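The pair version of the ergodic theorem can also be checked empirically. A sketch assuming numpy, counting $0 \to 1$ transitions in an illustrative 2-state chain of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative 2-state chain; pi = (1/3, 2/3) solves pi P = pi.
P = np.array([[0.50, 0.50],
              [0.25, 0.75]])

n = 100_000
x = 0
count_01 = 0
for _ in range(n):
    x_new = rng.choice(2, p=P[x])
    if x == 0 and x_new == 1:
        count_01 += 1               # count transitions from state 0 to state 1
    x = x_new

print(count_01 / n)                 # approaches pi_0 * P_01 = (1/3)(1/2) = 1/6
```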

For an irreducible positive recurrent chain of period d:

  1. $P^d$ has $d$ communicating classes, each of which forms an irreducible aperiodic positive recurrent chain.
  2. $P^{nd}$ has a limit $\Pi$ as $n \to \infty$.
  3. Each row of $\Pi$ is the stationary initial distribution of the class in 1 that contains the corresponding state.
  4. The stationary initial distribution of the original chain places probability $1/d$ on each of the communicating classes in 1.
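A quick numerical illustration of the periodic case, assuming numpy: simple random walk on a 4-cycle (my choice of example) has period 2, so $P^n$ itself does not converge, but $P^{2n}$ does, and the Cesàro averages of $P^n$ converge to equal rows $(1/4, 1/4, 1/4, 1/4)$.

```python
import numpy as np

# Simple random walk on a 4-cycle: period d = 2 (illustrative example).
P = np.zeros((4, 4))
for i in range(4):
    P[i, (i - 1) % 4] = 0.5
    P[i, (i + 1) % 4] = 0.5

P_even = np.linalg.matrix_power(P, 100)   # mass on the same-parity states
P_odd = np.linalg.matrix_power(P, 101)    # mass on the opposite-parity states
print(P_even[0])   # (1/2, 0, 1/2, 0): the stationary law of the class {0, 2} for P^2
print(P_odd[0])    # (0, 1/2, 0, 1/2): P^n oscillates and has no limit

# Cesaro (time-average) limit: rows equal to the stationary distribution.
avg = sum(np.linalg.matrix_power(P, k) for k in range(1, 201)) / 200
print(avg[0])      # (1/4, 1/4, 1/4, 1/4)
```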

For an irreducible null recurrent chain:

  1. $P^n$ converges to 0 (pointwise).
  2. there is no stationary initial distribution.

For an irreducible transient chain:

  1. $P^n$ converges to 0 (pointwise).
  2. there is no stationary initial distribution.

For a chain with more than 1 communicating class:

  1. If $C$ is a recurrent class, the submatrix $P_C$ of $P$ made by picking out the rows $i$ and columns $j$ for which $i, j \in C$ is a stochastic matrix. The corresponding entries in $P^n$ are just $(P_C)^n$, so you can apply the conclusions above.
  2. For any transient or null recurrent class the corresponding columns in $P^n$ converge to 0.
  3. If there are multiple positive recurrent communicating classes then the stationary initial distribution is not unique.

Poisson Processes

Particles arrive over time at a particle detector. There are several ways to describe the most common model.

Approach 1: the number of particles arriving in an interval has a Poisson distribution with mean proportional to the length of the interval, and the numbers in several non-overlapping intervals are independent.

For $s < t$, denote the number of arrivals in $(s,t]$ by $N(s,t)$. Jargon: $N(A)$ = number of points in $A$ is a counting process. The model is

  1. $N(s,t)$ has a Poisson($\lambda(t-s)$) distribution.
  2. For $0 \le s_1 \le t_1 \le s_2 \le t_2 \le \cdots \le s_k \le t_k$ the variables $N(s_1,t_1), \ldots, N(s_k,t_k)$ are independent.

Approach 2:

Let $0 < S_1 < S_2 < \cdots$ be the times at which the particles arrive.

Let $W_k = S_k - S_{k-1}$, with $S_0 = 0$ by convention. The $W_k$ are called interarrival times.

Then $W_1, W_2, \ldots$ are independent Exponential random variables with mean $1/\lambda$.

Note $P(W_k > t) = e^{-\lambda t}$ is called the survival function of $W_k$.

The two approaches are equivalent. Both can be deduced from a model based on the local behaviour of the process.
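The equivalence can be checked empirically. A sketch assuming numpy, with rate $\lambda = 3$ (an illustrative choice): build many paths from Exponential interarrivals as in Approach 2, then verify the Approach 1 predictions that counts in unit intervals behave like Poisson(3) and are uncorrelated across disjoint intervals.

```python
import numpy as np

rng = np.random.default_rng(2)
lam = 3.0                      # rate; an illustrative choice
n_paths = 20_000

# Approach 2: iid Exponential interarrival times W_k; arrival times S_k = W_1 + ... + W_k.
W = rng.exponential(1 / lam, size=(n_paths, 40))   # 40 gaps easily covers [0, 2]
S = np.cumsum(W, axis=1)

# Approach 1 predicts: N(0,1] ~ Poisson(lam), and N(0,1], N(1,2] independent.
N1 = (S <= 1.0).sum(axis=1)
N2 = ((S > 1.0) & (S <= 2.0)).sum(axis=1)

print(N1.mean(), N1.var())          # both should be near lam = 3 (Poisson: mean = variance)
print(np.corrcoef(N1, N2)[0, 1])    # should be near 0
```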

Approach 3: Assume:

  1. given all the points in $[0,t]$, the probability of 1 point in the interval $(t,t+h]$ is of the form

     $$ \lambda h + o(h); $$

  2. given all the points in $[0,t]$, the probability of 2 or more points in the interval $(t,t+h]$ is of the form

     $$ o(h). $$

Notation: given functions $f$ and $g$ we write

$$ f(h) = o(g(h)) $$

provided

$$ \lim_{h \to 0} \frac{f(h)}{g(h)} = 0. $$

[Aside: if there is a constant $M$ such that

$$ \limsup_{h \to 0} \left| \frac{f(h)}{g(h)} \right| \le M $$

we say

$$ f(h) = O(g(h)). $$

The notation is due to Landau. Another form:

$$ f(h) = O(g(h)) $$

means there are $\delta > 0$ and $M$ such that for all $|h| \le \delta$,

$$ |f(h)| \le M |g(h)|. $$

Idea: $o(h)$ is tiny compared to $h$ while $O(h)$ is (very) roughly the same size as $h$; e.g. $h^2 = o(h)$ while $\sin h = O(h)$.]

Generalizations:

  1. The first (Poisson) model generalizes to $N(s,t]$ having a Poisson distribution with parameter $\Lambda(t) - \Lambda(s)$ for some non-decreasing non-negative function $\Lambda$ (called the cumulative intensity). The result is called an inhomogeneous Poisson process.
  2. The exponential interarrival model generalizes to independent non-exponential interarrival times. The result is a renewal process or semi-Markov process.
  3. The infinitesimal probability model generalizes to other infinitesimal jump rates. The model specifies an infinitesimal generator. This yields other continuous time Markov chains.
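Generalization 1 can be simulated by thinning, a standard trick: generate a homogeneous process at a dominating rate $\lambda_{\max}$ and keep each point $t$ independently with probability $\lambda(t)/\lambda_{\max}$. A sketch assuming numpy, with the intensity $\lambda(t) = 1 + \sin^2 t$ chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

lam_max = 2.0                       # dominating rate; lambda(t) <= lam_max everywhere
t_max = 10.0

def intensity(t):
    return 1.0 + np.sin(t) ** 2     # illustrative intensity, between 1 and 2

def one_path():
    # Homogeneous Poisson(lam_max) on [0, t_max]: Poisson count, uniform locations.
    n_cand = rng.poisson(lam_max * t_max)
    cand = rng.uniform(0.0, t_max, size=n_cand)
    # Thinning: keep each candidate with probability lambda(t) / lam_max.
    keep = rng.uniform(size=n_cand) < intensity(cand) / lam_max
    return np.sort(cand[keep])

counts = np.array([len(one_path()) for _ in range(4000)])
# E N(0, t_max] = Lambda(t_max) = integral of (1 + sin^2 t) dt = 15 - sin(20)/4
print(counts.mean())
```

The kept points form an inhomogeneous Poisson process with cumulative intensity $\Lambda(t) = \int_0^t \lambda(s)\,ds$, so the average count should be near $\Lambda(10) \approx 14.77$.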



Richard Lockhart
Friday October 20 14:09:40 PDT 2000