
STAT 870 Lecture 16

Kolmogorov's Equations

The Chapman-Kolmogorov equations are
\[
P(t+h) = P(t)P(h) .
\]
Subtract $P(t)$ from both sides, divide by $h$ and let $h \downarrow 0$. Remember that $P(0)$ is the identity. We find
\[
\lim_{h \downarrow 0} \frac{P(t+h) - P(t)}{h}
= P(t) \lim_{h \downarrow 0} \frac{P(h) - P(0)}{h} ,
\]
which gives
\[
P'(t) = P(t) P'(0) .
\]
The Chapman-Kolmogorov equations can also be written
\[
P(t+h) = P(h)P(t) .
\]
Now subtracting $P(t)$ from both sides, dividing by $h$ and letting $h \downarrow 0$ gives
\[
P'(t) = P'(0) P(t) .
\]

Look at these equations in component form:
\[
P'(t) = P'(0) P(t)
\]
becomes
\[
P'_{ik}(t) = \sum_j P'_{ij}(0) P_{jk}(t) .
\]
For $i \neq k$ our calculations of instantaneous transition rates give
\[
P'_{ik}(0) = \lambda_i p_{ik} ,
\]
where $\lambda_i$ is the rate of the exponential holding time in state $i$ and $p_{ik}$ is the probability that a jump from $i$ goes to $k$. For $i = k$ we have
\[
P_{ii}(h) = e^{-\lambda_i h} + o(h) .
\]
($X(h)=i$ either means $T_1 > h$, which has probability $e^{-\lambda_i h}$, or there have been two or more transitions in $[0,h]$, a possibility of probability $o(h)$.) Thus
\[
P'_{ii}(0) = -\lambda_i .
\]

Let $Q$ be the matrix with entries
\[
Q_{ik} = P'_{ik}(0) =
\begin{cases}
\lambda_i p_{ik} & i \neq k \\
-\lambda_i & i = k .
\end{cases}
\]
$Q$ is the infinitesimal generator of the chain.

Thus
\[
P'(t) = P'(0) P(t)
\]
becomes
\begin{align*}
P'(t) & = Q P(t) \\
P'_{ik}(t) & = \sum_j Q_{ij} P_{jk}(t) .
\end{align*}
Called Kolmogorov's backward equations.

On the other hand
\[
P'(t) = P(t) P'(0)
\]
becomes
\begin{align*}
P'(t) & = P(t) Q \\
P'_{ik}(t) & = \sum_j P_{ij}(t) Q_{jk} .
\end{align*}
These are Kolmogorov's forward equations.

Remark: When the state space is infinite the forward equations may not be justified. In deriving them we interchanged a limit with an infinite sum; the interchange is always justified for the backward equations but not for the forward equations.
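For a finite state space both systems have the same solution $P(t)$. The sketch below (an illustration, not part of the original notes) integrates the backward system $P' = QP$ and the forward system $P' = PQ$ by Euler steps for an assumed 3-state generator and checks that they agree; all the rates in it are made-up values.

```python
# Numerical sketch: for a finite state space the backward system
# P'(t) = Q P(t) and the forward system P'(t) = P(t) Q have the same
# solution.  The 3-state generator below is an assumed example
# (off-diagonal entries are rates; each row sums to 0).

def mat_mul(A, B):
    """Product of two square matrices stored as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def euler_solve(Q, t=1.0, steps=5000, backward=True):
    """Euler integration of P' = QP (backward) or P' = PQ (forward),
    starting from P(0) = I."""
    n = len(Q)
    h = t / steps
    P = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for _ in range(steps):
        D = mat_mul(Q, P) if backward else mat_mul(P, Q)
        P = [[P[i][j] + h * D[i][j] for j in range(n)] for i in range(n)]
    return P

Q = [[-3.0, 2.0, 1.0],
     [1.0, -4.0, 3.0],
     [2.0, 2.0, -4.0]]

Pb = euler_solve(Q, backward=True)
Pf = euler_solve(Q, backward=False)
max_diff = max(abs(Pb[i][j] - Pf[i][j]) for i in range(3) for j in range(3))
row_sums = [sum(row) for row in Pb]
```

Both iterations compute powers of $I + hQ$, so the agreement is essentially exact; the rows of the computed matrix also sum to 1, as a transition matrix's must.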

Example: $S = \{0, 1\}$. Then
\[
p_{01} = p_{10} = 1
\]
and the chain is otherwise specified by $\lambda_0$ and $\lambda_1$, the rates of leaving states 0 and 1. The matrix $Q$ is
\[
Q = \begin{pmatrix} -\lambda_0 & \lambda_0 \\ \lambda_1 & -\lambda_1 \end{pmatrix} .
\]

The backward equations become
\begin{align*}
P_{00}'(t) & = -\lambda_0 P_{00}(t) + \lambda_0 P_{10}(t) \\
P_{01}'(t) & = -\lambda_0 P_{01}(t) + \lambda_0 P_{11}(t) \\
P_{10}'(t) & = \lambda_1 P_{00}(t) - \lambda_1 P_{10}(t) \\
P_{11}'(t) & = \lambda_1 P_{01}(t) - \lambda_1 P_{11}(t)
\end{align*}
while the forward equations are
\begin{align*}
P_{00}'(t) & = -\lambda_0 P_{00}(t) + \lambda_1 P_{01}(t) \\
P_{01}'(t) & = \lambda_0 P_{00}(t) - \lambda_1 P_{01}(t) \\
P_{10}'(t) & = -\lambda_0 P_{10}(t) + \lambda_1 P_{11}(t) \\
P_{11}'(t) & = \lambda_0 P_{10}(t) - \lambda_1 P_{11}(t)
\end{align*}

Multiply the first backward equation by $\lambda_1$ and the third by $\lambda_0$, then add, to get
\[
\lambda_1 P_{00}'(t) + \lambda_0 P_{10}'(t) = 0
\]
so
\[
\lambda_1 P_{00}(t) + \lambda_0 P_{10}(t) = c .
\]
Put $t=0$ to get $c = \lambda_1$. This gives
\[
P_{10}(t) = \frac{\lambda_1 \left( 1 - P_{00}(t) \right)}{\lambda_0} .
\]
Plug this back in to the first equation and get
\[
P_{00}'(t) = -(\lambda_0 + \lambda_1) P_{00}(t) + \lambda_1 .
\]
Multiply by $e^{(\lambda_0 + \lambda_1) t}$ and get
\[
\frac{d}{dt} \left[ e^{(\lambda_0 + \lambda_1) t} P_{00}(t) \right]
= \lambda_1 e^{(\lambda_0 + \lambda_1) t} ,
\]
which can be integrated (using $P_{00}(0) = 1$) to get
\[
P_{00}(t) = \frac{\lambda_1}{\lambda_0 + \lambda_1}
+ \frac{\lambda_0}{\lambda_0 + \lambda_1} e^{-(\lambda_0 + \lambda_1) t} .
\]
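This closed form can be checked numerically. The sketch below (illustrative, not part of the original notes; the rates $\lambda_0 = 2$, $\lambda_1 = 3$ are assumed values) integrates the two-state backward equations with Euler steps and compares with the formula at $t = 1$.

```python
import math

# Numerical check of the closed form for P00(t) in the two-state chain.
# The rates l0 = 2, l1 = 3 are assumed illustration values.  We integrate
#   P00' = -l0*P00 + l0*P10,   P10' = l1*P00 - l1*P10
# with Euler steps from P00(0) = 1, P10(0) = 0 and compare at t = 1.

l0, l1 = 2.0, 3.0
t, steps = 1.0, 200000
h = t / steps
p00, p10 = 1.0, 0.0
for _ in range(steps):
    d00 = -l0 * p00 + l0 * p10
    d10 = l1 * p00 - l1 * p10
    p00 += h * d00
    p10 += h * d10

rho = l0 + l1
closed00 = l1 / rho + (l0 / rho) * math.exp(-rho * t)
closed10 = l1 * (1.0 - closed00) / l0  # from l1*P00 + l0*P10 = l1
```

The combination $\lambda_1 P_{00} + \lambda_0 P_{10}$ is preserved exactly by the Euler step, mirroring the conservation identity used in the derivation.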

Alternative calculation:
\[
Q = \begin{pmatrix} -\lambda_0 & \lambda_0 \\ \lambda_1 & -\lambda_1 \end{pmatrix}
\]
can be written as
\[
Q = A \Lambda A^{-1}
\]
where
\[
A = \begin{pmatrix} 1 & \lambda_0 \\ 1 & -\lambda_1 \end{pmatrix} ,
\]
\[
\Lambda = \begin{pmatrix} 0 & 0 \\ 0 & -(\lambda_0 + \lambda_1) \end{pmatrix}
\]
and
\[
A^{-1} = \frac{1}{\lambda_0 + \lambda_1}
\begin{pmatrix} \lambda_1 & \lambda_0 \\ 1 & -1 \end{pmatrix} .
\]
Then
\begin{align*}
P(t) = e^{Qt} & = \sum_{n=0}^\infty \frac{(Qt)^n}{n!}
= \sum_{n=0}^\infty \frac{t^n A \Lambda^n A^{-1}}{n!} \\
& = A \left( \sum_{n=0}^\infty \frac{(\Lambda t)^n}{n!} \right) A^{-1}
= A e^{\Lambda t} A^{-1} .
\end{align*}
Now
\[
e^{\Lambda t} = \begin{pmatrix} 1 & 0 \\ 0 & e^{-(\lambda_0 + \lambda_1) t} \end{pmatrix}
\]
so we get
\begin{align*}
P(t) & = A \begin{pmatrix} 1 & 0 \\ 0 & e^{-(\lambda_0 + \lambda_1) t} \end{pmatrix} A^{-1} \\
& = \Pi + e^{-(\lambda_0 + \lambda_1) t} \left( I - \Pi \right)
\end{align*}
where
\[
\Pi = \frac{1}{\lambda_0 + \lambda_1}
\begin{pmatrix} \lambda_1 & \lambda_0 \\ \lambda_1 & \lambda_0 \end{pmatrix} .
\]
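The decomposition $P(t) = \Pi + e^{-(\lambda_0+\lambda_1)t}(I - \Pi)$ can be checked directly. The sketch below (illustrative, with assumed rates $\lambda_0 = 2$, $\lambda_1 = 3$) verifies the Chapman-Kolmogorov property $P(s)P(t) = P(s+t)$ and the convergence of $P(t)$ to $\Pi$.

```python
import math

# Check that P(t) = Pi + exp(-(l0+l1) t)(I - Pi) satisfies
# Chapman-Kolmogorov, P(s) P(t) = P(s+t), and that P(t) -> Pi for
# large t.  The rates l0 = 2, l1 = 3 are assumed illustration values.

l0, l1 = 2.0, 3.0
rho = l0 + l1
Pi = [[l1 / rho, l0 / rho],
      [l1 / rho, l0 / rho]]
I2 = [[1.0, 0.0], [0.0, 1.0]]

def P(t):
    """Transition matrix from the spectral decomposition."""
    e = math.exp(-rho * t)
    return [[Pi[i][j] + e * (I2[i][j] - Pi[i][j]) for j in range(2)]
            for i in range(2)]

def mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

s, t = 0.3, 0.9
lhs, rhs = mul(P(s), P(t)), P(s + t)
ck_gap = max(abs(lhs[i][j] - rhs[i][j]) for i in range(2) for j in range(2))
tail_gap = max(abs(P(10.0)[i][j] - Pi[i][j])
               for i in range(2) for j in range(2))
```

The key algebraic facts behind the check are $\Pi^2 = \Pi$ and $\Pi(I-\Pi) = (I-\Pi)\Pi = 0$.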

Notice: the rows of $\Pi$ are a stationary initial distribution. If the rows are $\pi = (\pi_0, \pi_1)$ then
\[
\pi \Pi = \pi \quad \text{and} \quad \pi \left( I - \Pi \right) = 0 ,
\]
so
\[
\pi P(t) = \pi \quad \text{for all } t .
\]
Moreover
\[
\lim_{t \to \infty} P(t) = \Pi .
\]

Fact: $\pi_0 = \lambda_1 / (\lambda_0 + \lambda_1)$ is the long run fraction of time in state 0.

Fact:
\[
\lim_{t \to \infty} \frac{1}{t} \int_0^t 1(X(s) = 0) \, ds = \pi_0 .
\]
This is the Ergodic Theorem in continuous time.

Potential Pathologies

Suppose that for each $k$ you have a sequence
\[
T_1^{(k)}, T_2^{(k)}, \ldots
\]
such that all the $T_n^{(k)}$ are independent exponential random variables and $T_n^{(k)}$ has rate parameter $\lambda_n$. We can use these times to make a Markov chain with state space $\{1, 2, 3, \ldots\}$:

Start the chain in state 1. At time $T_1^{(1)}$ move to 2, then at $T_1^{(1)} + T_2^{(1)}$ move to 3, and so on. The chain progresses through the states in the order $1, 2, \ldots$. We have
\[
P\left( X(t+h) = n+1 \mid X(t) = n \right) = \lambda_n h + o(h)
\]
and
\[
P\left( X(t+h) = m \mid X(t) = n \right) = o(h) \qquad \text{for } m \neq n, n+1 .
\]

Does this define a process?

It depends on $\sum_n 1/\lambda_n$.

Case 1: if $\sum_n 1/\lambda_n = \infty$ then
\[
P\left( \sum_n T_n^{(1)} = \infty \right) = 1
\]
(converse to Borel-Cantelli) and our construction defines a process $X(t)$ for all $t$.

Case 2: if $\sum_n 1/\lambda_n < \infty$ then for each $k$
\[
P\left( \sum_n T_n^{(k)} < \infty \right) = 1 .
\]
In this case put $S_1 = \sum_n T_n^{(1)}$. Our definition above defines a process $X(t)$ for $0 \le t < S_1$. We put $X(S_1) = 1$ and then begin the process over with the set of holding times $T_1^{(2)}, T_2^{(2)}, \ldots$; this defines $X$ for $S_1 \le t < S_2$, where $S_2 = S_1 + \sum_n T_n^{(2)}$. Again we put $X(S_2) = 1$ and continue the process.
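The finiteness in Case 2 can be seen by Monte Carlo. The sketch below (illustrative, not part of the original notes) uses the assumed rates $\lambda_n = n^2$, for which $\sum_n 1/\lambda_n = \pi^2/6 < \infty$, and checks that the simulated explosion time is finite with mean close to $\sum_n 1/n^2$.

```python
import random

# Monte Carlo sketch for Case 2: with assumed rates lambda_n = n**2 we
# have sum 1/lambda_n < infinity, so S = sum_n T_n should be finite,
# with E[S] = sum 1/n**2 (~ pi**2/6).  The truncation level N is an
# illustration choice, not part of the notes.

random.seed(0)
N = 1000      # truncate the infinite sum of holding times here
reps = 1000

def explosion_time():
    """One realisation of sum_{n=1}^{N} T_n with T_n ~ Exp(n**2)."""
    return sum(random.expovariate(n * n) for n in range(1, N + 1))

samples = [explosion_time() for _ in range(reps)]
mean_S = sum(samples) / reps
target = sum(1.0 / (n * n) for n in range(1, N + 1))  # ~ 1.645
```

Every simulated path runs through all $N$ states in finite time, which is exactly the pathology: the chain uses up infinitely many states before time $S_1$.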

Result: $X$ is a Markov chain with the specified transition rates.

Problem: what if we put $X(S_1) = 2$ and continued?

What if we used a probability vector $(p_1, p_2, \ldots)$ to pick a value for $X(S_1)$ and continued?

All yield Markov processes with the same infinitesimal generator $Q$.

Birth and Death Processes

Consider a population of $X(t)$ individuals at time $t$. Suppose that in the next time interval $(t, t+h)$ the probability of a population increase of 1 (called a birth) is $\lambda_{X(t)} h + o(h)$ and the probability of a decrease of 1 (a death) is $\mu_{X(t)} h + o(h)$.

Jargon: X is a birth and death process.

Special cases:

All $\mu_n = 0$: called a pure birth process.

All $\lambda_n = 0$ (0 is absorbing): pure death process.

$\lambda_n = n\lambda$ and $\mu_n = n\mu$ is a linear birth and death process.

$\lambda_n \equiv \lambda$, $\mu_n \equiv 0$: Poisson Process.

$\lambda_n = n\lambda + \theta$ and $\mu_n = n\mu$ is a linear birth and death process with immigration.
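The linear case is easy to simulate. The sketch below (illustrative, not part of the original notes) simulates a linear birth and death process and checks the standard fact, not derived here, that $E[X(t)] = X(0)\, e^{(\lambda - \mu)t}$; the rates, horizon and starting size are assumed values.

```python
import math
import random

# Simulation sketch for a linear birth and death process
# (lambda_n = n*lambda, mu_n = n*mu).  A standard fact (not derived in
# these notes) is E[X(t)] = X(0) * exp((lambda - mu) t); we check it by
# simulating the jump chain.  Rates and horizon are assumed values.

random.seed(1)
lam, mu = 1.0, 0.5
x0, t_end, reps = 10, 1.0, 4000

def simulate(x0, t_end):
    """Run the chain to time t_end; state 0 is absorbing."""
    t, n = 0.0, x0
    while n > 0:
        t += random.expovariate(n * (lam + mu))  # holding time in state n
        if t > t_end:
            break
        # the jump is a birth with probability lam / (lam + mu)
        n += 1 if random.random() < lam / (lam + mu) else -1
    return n

mean_X = sum(simulate(x0, t_end) for _ in range(reps)) / reps
target = x0 * math.exp((lam - mu) * t_end)
```

The simulation uses the jump-chain description: in state $n$ the total jump rate is $n(\lambda + \mu)$ and each jump is $+1$ or $-1$.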

Queuing Theory

Ingredients of Queuing Problem:

1: Queue input process.

2: Number of servers

3: Queue discipline: first come first serve? last in first out? pre-emptive priorities?

4: Service time distribution.

Example: Imagine customers arriving at a facility at the times of a Poisson Process $N$ with rate $\lambda$. This is the input process, denoted M (for Markov) in the queuing literature.

Single server case:

Service distribution: exponential service times, rate $\mu$.

Queue discipline: first come first serve.

$X(t)$ = number of customers in line at time $t$.

$X$ is a Markov process called the M/M/1 queue:
\[
\lambda_n = \lambda , \qquad n \ge 0 ,
\]
\[
\mu_n = \mu , \qquad n \ge 1 .
\]
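The M/M/1 rates above can be simulated directly. The sketch below (illustrative, not part of the original notes) checks the standard fact, not derived here, that for $\lambda < \mu$ the long run fraction of time the queue is empty is $1 - \lambda/\mu$; the rates and horizon are assumed values.

```python
import random

# Simulation sketch for the M/M/1 queue (lambda_n = lam for n >= 0;
# mu_n = mu for n >= 1).  A standard fact (not derived in these notes)
# is that for lam < mu the long run fraction of time in state 0 is
# 1 - lam/mu.  Rates lam = 1, mu = 2 and the horizon are assumed.

random.seed(2)
lam, mu = 1.0, 2.0
t_end = 50000.0

t, n, time_empty = 0.0, 0, 0.0
while t < t_end:
    rate = lam + (mu if n > 0 else 0.0)  # total event rate in state n
    hold = random.expovariate(rate)
    if t + hold >= t_end:
        time_empty += (t_end - t) if n == 0 else 0.0
        break
    if n == 0:
        time_empty += hold
    t += hold
    # an arrival with probability lam/rate, otherwise a departure
    n = n + 1 if random.random() < lam / rate else n - 1

frac_empty = time_empty / t_end
target = 1.0 - lam / mu  # = 0.5
```

When $n = 0$ the only possible event is an arrival, which the code captures because then $\lambda/\text{rate} = 1$.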

Example: M/M/$\infty$ queue:

Customers arrive according to a Poisson Process of rate $\lambda$. Each customer begins service immediately. $X(t)$ is the number being served at time $t$. $X$ is a birth and death process with
\[
\lambda_n = \lambda , \qquad n \ge 0
\]
and
\[
\mu_n = n \mu .
\]
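These rates can also be simulated. The sketch below (illustrative, not part of the original notes) checks the standard fact, not derived here, that the stationary distribution of the M/M/$\infty$ queue is Poisson with mean $\lambda/\mu$, so the long run time-average of $X(t)$ is $\lambda/\mu$; the rates and horizon are assumed values.

```python
import random

# Simulation sketch for the M/M/infinity queue (lambda_n = lam,
# mu_n = n*mu).  A standard fact (not derived in these notes) is that
# the stationary distribution is Poisson with mean lam/mu, so the long
# run time-average of X(t) is lam/mu.  Rates lam = 3, mu = 1 and the
# horizon are assumed illustration values.

random.seed(3)
lam, mu = 3.0, 1.0
t_end = 20000.0

t, n, area = 0.0, 0, 0.0  # area accumulates the integral of X(s) ds
while t < t_end:
    rate = lam + n * mu
    hold = random.expovariate(rate)
    if t + hold >= t_end:
        area += n * (t_end - t)
        break
    area += n * hold
    t += hold
    # an arrival with probability lam/rate, otherwise a departure
    n = n + 1 if random.random() < lam / rate else n - 1

avg_X = area / t_end
target = lam / mu  # = 3.0
```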





Richard Lockhart
Friday November 3 10:44:51 PST 2000