STAT 380 Week 7

Properties of Poisson Processes

1)
If $ N_1$ and $ N_2$ are independent Poisson processes with rates $ \lambda_1$ and $ \lambda_2$, respectively, then $ N=N_1+N_2$ is a Poisson process with rate $ \lambda_1+\lambda_2$.
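A quick numerical illustration of this superposition property (a sketch, not part of the notes; it assumes numpy, and the helper name poisson_arrivals is made up here): simulate both processes via exponential interarrival gaps, merge their points, and check that counts per unit interval average $\lambda_1+\lambda_2$.

import numpy as np

rng = np.random.default_rng(0)

def poisson_arrivals(rate, horizon, rng):
    # Arrival times on [0, horizon]: cumulative sums of Exp(rate) gaps.
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / rate)
        if t > horizon:
            return np.array(times)
        times.append(t)

lam1, lam2, T = 2.0, 3.0, 1000.0
merged = np.sort(np.concatenate([poisson_arrivals(lam1, T, rng),
                                 poisson_arrivals(lam2, T, rng)]))
# Counts of merged points in unit intervals; for a rate-5 Poisson process
# the mean and the variance of each count are both lam1 + lam2 = 5.
counts = np.histogram(merged, bins=np.arange(0.0, T + 1.0))[0]
print(counts.mean(), counts.var())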

2)
Suppose $N$ is a Poisson process with rate $\lambda$. Suppose each point is marked with a label, say one of $L_1,\ldots,L_r$, independently of all other occurrences. Suppose $p_i$ is the probability that a given point receives label $L_i$. Let $N_i$ count the points with label $i$ (so that $N=N_1+\cdots+N_r$). Then $N_1,\ldots,N_r$ are independent Poisson processes with rates $p_i\lambda$.
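A minimal sketch of this thinning property (assuming numpy; the label probabilities are illustrative choices): repeatedly draw $N(T)$, label each point independently, and check that the label counts have means $p_i\lambda T$ and are nearly uncorrelated.

import numpy as np

rng = np.random.default_rng(1)
lam, T, reps = 4.0, 1.0, 50_000
p = np.array([0.5, 0.3, 0.2])                   # label probabilities p_i

counts = np.zeros((reps, 3), dtype=int)
for r in range(reps):
    n = rng.poisson(lam * T)                    # N(T)
    labels = rng.choice(3, size=n, p=p)         # independent labels
    counts[r] = np.bincount(labels, minlength=3)

print(counts.mean(axis=0), p * lam * T)         # means are p_i * lam * T
print(np.corrcoef(counts, rowvar=False))        # off-diagonals near 0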

3)
Suppose $U_1, U_2,\ldots$ are independent rvs, each uniformly distributed on $[0,T]$. Suppose $M$ is a Poisson$(\lambda T)$ random variable independent of the $U$'s. Let

$\displaystyle N(t) = \sum_{i=1}^M 1(U_i \le t)$

Then $ N$ is a Poisson process on $ [0,T]$ with rate $ \lambda$.
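Property 3 is itself a simulation recipe. A minimal sketch (assuming numpy): draw $M$, scatter $M$ uniform points on $[0,T]$, and count.

import numpy as np

rng = np.random.default_rng(2)
lam, T = 3.0, 2.0

M = rng.poisson(lam * T)                 # M ~ Poisson(lam * T)
U = rng.uniform(0.0, T, size=M)          # M i.i.d. Uniform[0, T] points

def N(t):
    # N(t) counts the uniform points at or before t.
    return int(np.sum(U <= t))

# Over many repetitions, N(1) and N(2) - N(1) are independent Poisson(lam).
print(N(1.0), N(2.0) - N(1.0))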

4)
Suppose $N$ is a Poisson process with rate $\lambda$. Let $S_1 < S_2 < \cdots$ be the times at which points arrive. Given $N(T)=n$, the times $S_1,\ldots,S_n$ have the same distribution as the order statistics of a sample of size $n$ from the uniform distribution on $[0,T]$.

5)
Given $ S_{n+1}=T$, $ S_1,\ldots,S_n$ have the same distribution as the order statistics of a sample of size $ n$ from the uniform distribution on $ [0,T]$.

Indications of some proofs:

1) Let $N_1,\ldots,N_r$ be independent Poisson processes with rates $\lambda_1,\ldots,\lambda_r$ and let $N=\sum N_i$. Let $A_h$ be the event of 2 or more points of $N$ in the time interval $(t,t+h]$ and $B_h$ the event of exactly one point of $N$ in $(t,t+h]$.

Let $ A_{ih}$ and $ B_{ih}$ be the corresponding events for $ N_i$.

Let $ H_t$ denote the history of the processes up to time $ t$; we condition on $ H_t$.

We are given:

$\displaystyle P(A_{ih}\vert H_t) = o(h)$

and

$\displaystyle P(B_{ih}\vert H_t) = \lambda_i h+o(h)\, .$

Note that

$\displaystyle A_h \subset \bigcup_{i=1}^r A_{ih} \cup \bigcup_{i \neq j} \left(B_{ih}\cap B_{jh}\right)$

Since

\begin{align*}
P(B_{ih}\cap B_{jh}\vert H_t) &= P(B_{ih}\vert H_t)\, P(B_{jh}\vert H_t)\\
&= (\lambda_i h +o(h))(\lambda_j h+o(h))\\
&= O(h^2) = o(h),
\end{align*}

we see that $P(A_h\vert H_t) \le \sum_i P(A_{ih}\vert H_t) + \sum_{i\ne j} P(B_{ih}\cap B_{jh}\vert H_t) = o(h)$, checking one of the two infinitesimal conditions for a Poisson process.

Next let $ C_h$ be the event of no points in $ N$ in the time interval $ (t,t+h]$ and $ C_{ih}$ the same for $ N_i$. Then

\begin{align*}
P(C_h\vert H_t) &= P\Big(\bigcap_i C_{ih}\Big\vert H_t\Big)\\
&= \prod_i P(C_{ih}\vert H_t)\\
&= \prod_i (1-\lambda_i h +o(h))\\
&= 1-\Big(\sum_i \lambda_i\Big)h + o(h),
\end{align*}

so that

\begin{align*}
P(B_h\vert H_t) &= 1-P(C_h\vert H_t) -P(A_h\vert H_t)\\
&= \Big(\sum_i \lambda_i\Big) h +o(h).
\end{align*}

Hence $ N$ is a Poisson process with rate $ \sum \lambda_i$.

2) The infinitesimal approach used for 1) gives part of this; see the text for the rest. With events defined as in 1): the event $B_{ih}$, that there is exactly one point of $N_i$ in $(t,t+h]$, is the union of the part of $B_h$ in which the single point of $N$ is labelled $i$ and a subset of $A_h$ (two or more points of $N$ in $(t,t+h]$, of which exactly one is labelled $i$). Since $P(A_h\vert H_t)=o(h)$,

\begin{align*}
P(B_{ih}\vert H_t) &= p_i\, P(B_{h}\vert H_t)+o(h)\\
&= p_i (\lambda h+o(h)) + o(h)\\
&= p_i \lambda h + o(h).
\end{align*}

Similarly, $ A_{ih}$ is a subset of $ A_h$ so

$\displaystyle P(A_{ih}\vert H_t) = o(h)$

This shows each $N_i$ is a Poisson process with rate $\lambda p_i$. Getting independence requires more work; see the text for the algebraic method, which is easier.

3) Fix $s<t$. Let $N(s,t)$ be the number of points in $(s,t]$. Given $M=n$, the conditional distribution of $N(s,t)$ is Binomial$(n,p)$ with $p=(t-s)/T$. So

\begin{align*}
P(N(s,t)=k) &= \sum_{n=k}^\infty P(N(s,t)=k, M=n)\\
&= \sum_{n=k}^\infty P(N(s,t)=k\vert M=n)\, P(M=n)\\
&= \sum_{n=k}^\infty \frac{n!}{k!(n-k)!} p^k (1-p)^{n-k}\, \frac{(\lambda T)^n}{n!} e^{-\lambda T}\\
&= \frac{e^{-\lambda T}}{k!} (\lambda T p)^k \sum_{n=k}^\infty \frac{\{(1-p)\lambda T\}^{n-k}}{(n-k)!}\\
&= \frac{e^{-\lambda T}}{k!} (\lambda T p)^k \sum_{m=0}^\infty \frac{\{(1-p)\lambda T\}^m}{m!}\\
&= \frac{e^{-\lambda T}}{k!} (\lambda T p)^k e^{\lambda T(1-p)}\\
&= \frac{e^{-\lambda(t-s)}\{\lambda(t-s)\}^k}{k!},
\end{align*}

the Poisson$(\lambda(t-s))$ probability; a similar multinomial calculation gives independent increments.
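The series manipulation above can be checked numerically. A sketch assuming scipy.stats (the parameter values are arbitrary): marginalize the Binomial$(n,p)$ probability against Poisson$(\lambda T)$ weights and compare with the Poisson$(\lambda(t-s))$ pmf.

from scipy.stats import binom, poisson

lam, T, s, t, k = 2.0, 5.0, 1.0, 3.0, 4
p = (t - s) / T

# Truncate the sum over n; the Poisson(lam * T) tail beyond 200 is negligible.
lhs = sum(binom.pmf(k, n, p) * poisson.pmf(n, lam * T)
          for n in range(k, 200))
rhs = poisson.pmf(k, lam * (t - s))
print(lhs, rhs)                          # agree to numerical precision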

4) Fix $s_i, h_i$ for $i=1,\ldots, n$ such that

$\displaystyle 0 < s_1 < s_1+h_1 < s_2 < \cdots < s_n < s_n+h_n < T$

Given $ N(T)=n$ we compute the probability of the event

$\displaystyle A=\bigcap_{i=1}^n \{s_i < S_i < s_i+h_i\}$

The intersection of $A$ and $\{N(T)=n\}$ is (with $s_0=h_0=0$):

$\displaystyle B \equiv \bigcap_{i=1}^n \left\{N(s_{i-1}+h_{i-1},s_i]=0,\; N(s_i,s_i+h_i]=1\right\} \cap \left\{N(s_n+h_n,T]=0\right\}$

whose probability is

$\displaystyle \left(\prod \lambda h_i\right) e^{-\lambda T}$

So

\begin{align*}
P(A\vert N(T)=n) &= \frac{P(A, N(T)=n)}{P(N(T)=n)}\\
&= \frac{\lambda^n e^{-\lambda T} \prod h_i}{(\lambda T)^n e^{-\lambda T}/n!}\\
&= \frac{n!\prod h_i}{T^n}.
\end{align*}

Divide by $\prod h_i$ and let all the $h_i$ go to 0 to see that the joint density of $S_1,\ldots,S_n$ given $N(T)=n$ is

$\displaystyle \frac{n!}{T^n}\, 1(0 < s_1 < \cdots < s_n < T)$

which is the density of order statistics from a Uniform$ [0,T]$ sample of size $ n$.
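A Monte Carlo check of 4) (a sketch assuming numpy and scipy.stats): simulate paths, keep those with $N(T)=n$, and pool their arrival times. Since the ordered times are distributed as uniform order statistics, the pooled unordered times are i.i.d. Uniform$[0,T]$ and a Kolmogorov-Smirnov test should not reject.

import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(3)
lam, T, n = 2.0, 3.0, 6

pooled = []
while len(pooled) < 12_000:
    times = np.cumsum(rng.exponential(1.0 / lam, size=50))  # 50 gaps suffice
    arrivals = times[times <= T]
    if len(arrivals) == n:               # condition on N(T) = n by rejection
        pooled.extend(arrivals)

print(kstest(np.array(pooled) / T, "uniform"))   # large p-value expected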

5) Replace the event $ S_{n+1}=T$ with $ T < S_{n+1}< T+h$. With $ A$ as before we want

$\displaystyle P(A\vert T < S_{n+1}< T+h) = \frac{P(B,\, N(T,T+h] \ge 1)}{P(T < S_{n+1}< T+h)}$

Note that $ B$ is independent of $ \{N(T,T+h] \ge 1\}$ and that we have already found the limit

$\displaystyle \frac{P(B)}{\prod h_i} \to \lambda^n e^{-\lambda T}$

We are left to compute the limit of

$\displaystyle \frac{P(N(T,T+h] \ge 1)}{P(T < S_{n+1}< T+h)}$

The denominator is

$\displaystyle \sum_{k=0}^n P(N(0,T]=k,\, N(T,T+h]=n+1-k) + o(h) = P(N(0,T]=n)\,\lambda h+o(h)$

Thus

\begin{align*}
\frac{P(N(T,T+h] \ge 1)}{P(T < S_{n+1}< T+h)} &= \frac{\lambda h+o(h)}{\frac{(\lambda T)^n}{n!} e^{-\lambda T}\, \lambda h+o(h)}\\
&\to \frac{n!}{(\lambda T)^n e^{-\lambda T}}.
\end{align*}

Multiplying the two limits gives $n!/T^n$, so the conditional density of $S_1,\ldots,S_n$ given $S_{n+1}=T$ is the same as in 4).

Inhomogeneous Poisson Processes

The idea of hazard rate can be used to extend the notion of a Poisson process. Suppose $\lambda(t) \ge 0$ is a function of $t$. Suppose $N$ is a counting process such that

$\displaystyle P(N(t+h)=k+1\vert N(t)=k, H_t) = \lambda(t) h + o(h)$

and

$\displaystyle P(N(t+h) \ge k+2\vert N(t)=k, H_t) = o(h)$

Then $ N$ has independent increments and $ N(t+s)-N(t)$ has a Poisson distribution with mean

$\displaystyle \int_t^{t+s} \lambda(u)\, du$

If we put

$\displaystyle \Lambda(t) = \int_0^t \lambda(u)\, du$

then the mean of $N(t+s)-N(t)$ is $\Lambda(t+s)-\Lambda(t)$.

Jargon: $\lambda$ is the intensity or instantaneous intensity and $\Lambda$ the cumulative intensity.

One can use the model with $\Lambda$ any non-decreasing, right-continuous function, possibly without a derivative; jumps in $\Lambda$ then allow ties.
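For simulation, the standard thinning algorithm (an addition here, not from the notes) generates an inhomogeneous process when $\lambda(t)$ is bounded by some $\lambda_{\max}$: simulate a homogeneous rate-$\lambda_{\max}$ process and keep the point at time $t$ with probability $\lambda(t)/\lambda_{\max}$. A sketch assuming numpy, with an illustrative intensity:

import numpy as np

rng = np.random.default_rng(4)

def intensity(t):
    # Example intensity (an assumption for illustration); bounded above by 2.
    return 1.0 + np.sin(t) ** 2

def inhomogeneous_arrivals(intensity, lam_max, T, rng):
    # Thinning: candidate points from a rate-lam_max homogeneous process,
    # each kept with probability intensity(t) / lam_max.
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t > T:
            return np.array(times)
        if rng.uniform() < intensity(t) / lam_max:
            times.append(t)

pts = inhomogeneous_arrivals(intensity, 2.0, 10.0, rng)
# E N(10) = Lambda(10) = integral of 1 + sin(u)^2 over [0, 10], about 14.8.
print(len(pts))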

Compound Poisson Processes

Imagine that insurance claims arise at the times of a Poisson process $N(t)$ (an inhomogeneous process is more realistic).

Let $ Y_i$ be the value of the $ i$th claim associated with the point whose time is $ S_i$.

Assume that the $ Y$'s are independent of each other and of $ N$.

Let

$\displaystyle \mu = {\rm E}(Y_i) \quad {\rm and} \quad \sigma^2 = {\rm var}(Y_i)$

Let

$\displaystyle X(t) = \sum_{i=1}^{N(t)} Y_i$

be the total claim up to time $t$. We call $X$ a compound Poisson process.

Useful properties:

\begin{align*}
{\rm E}\left\{X(t)\vert N(t)\right\} &= N(t)\,\mu\\
{\rm Var}\left\{X(t)\vert N(t)\right\} &= N(t)\,\sigma^2\\
{\rm E}\left\{X(t)\right\} &= \mu\, {\rm E}\left\{N(t)\right\} = \mu \lambda t\\
{\rm Var}\left\{X(t)\right\} &= {\rm Var}\left[{\rm E}\left\{X(t)\vert N(t)\right\}\right] + {\rm E}\left[{\rm Var}\left\{X(t)\vert N(t)\right\}\right]\\
&= \lambda t \mu^2 + \lambda t \sigma^2
\end{align*}

(Does it all look familiar? See the homework.)
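A minimal Monte Carlo sketch of these moment formulas (assuming numpy; Normal claims are an arbitrary illustrative choice):

import numpy as np

rng = np.random.default_rng(5)
lam, t, mu, sigma, reps = 2.0, 4.0, 1.5, 0.5, 100_000

counts = rng.poisson(lam * t, size=reps)            # N(t) on each path
X = np.array([rng.normal(mu, sigma, size=n).sum()   # X(t) = sum of claims
              for n in counts])

print(X.mean(), mu * lam * t)                       # E X(t) = mu * lam * t
print(X.var(), lam * t * (mu**2 + sigma**2))        # Var X(t)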


Richard Lockhart
2002-02-07