Stochastic Process Cheat Sheet

By Z.H. Fu
切问录 www.fuzihao.org

Poisson Process

counting process with independent increments: the numbers of events that occur in disjoint time intervals are independent, i.e. \(P(N(s+t)-N(s)|N(s))=P(N(s+t)-N(s))\), \(P(X_i|X_j)=P(X_i)\).

counting process with stationary increments: the distribution of the number of events that occur in any time interval depends only on the length of the interval.

Definition 1

The counting process \(\{N(t), t\ge 0\}\) is said to be a Poisson process having rate \(\lambda, \lambda > 0\), if

  1. \(N(0)= 0\);
  2. The process has independent increments;
  3. The number of events in any interval of length \(t\) is Poisson distributed with mean \(\lambda t\). That is, for all \(s,t\ge0\), \(P\{N(t+s)- N(s)=n\}= e^{-\lambda t}\frac{(\lambda t)^n}{n!}, \ \ n=0,1,\cdots\) From (3), we have \(E[N(t)] = \lambda t\).
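
A minimal simulation sketch of property (3), with illustrative parameters \(\lambda=2\), \(s=1\), \(t=3\): increments built from i.i.d. exponential gaps should have mean and variance both \(\approx\lambda t\).

```python
import numpy as np

# Check of Definition 1(3): N(s+t) - N(s) ~ Poisson(lam*t), simulating
# arrivals as cumulative i.i.d. Exp(lam) gaps. lam, s, t and n_paths are
# illustrative choices.
rng = np.random.default_rng(0)
lam, s, t, n_paths = 2.0, 1.0, 3.0, 50_000

gaps = rng.exponential(1 / lam, size=(n_paths, 64))   # 64 gaps >> lam*(s+t)
arrivals = np.cumsum(gaps, axis=1)
increments = (arrivals <= s + t).sum(axis=1) - (arrivals <= s).sum(axis=1)

print(increments.mean(), increments.var())   # both ~ lam*t = 6 (Poisson)
```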

Definition 2

  1. \(N(0)= 0\)
  2. The process has stationary and independent increments.
  3. \(P\{N(h) = 1\} = \lambda h + o(h)\)
  4. \(P\{N(h) \ge 2\} = o(h)\)

Interarrival

Let \(X_n\) denote the time between the \((n- 1)\)st and the \(n\)th event. \(P\{X_1>t\}=P\{N(t) =0\}=e^{-\lambda t}\), \(P\{X_2>t|X_1=s\}=e^{-\lambda t}\). The \(X_i\) are i.i.d. exponential r.v.'s with mean \(1/\lambda\). \(S_n\), the arrival time of the \(n\)th event, satisfies \(S_n=\sum_{i=1}^nX_i,\ n\ge1\), and has a gamma distribution with parameters \(n\) and \(\lambda\): \(P\{S_n\le t \} =P\{ N( t ) \ge n\}=\sum_{j=n}^\infty e^{-\lambda t}\frac{(\lambda t)^j}{j!} \Rightarrow f(t)=\) \(\lambda e^{-\lambda t}\frac{(\lambda t)^{n-1}}{(n-1)!},\ t\ge0\)
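
A quick numerical check, with \(\lambda\) and \(n\) chosen arbitrarily: the sum of \(n\) i.i.d. exponential interarrival times should reproduce the gamma\((n,\lambda)\) moments.

```python
import numpy as np

# Sketch: S_n = X_1 + ... + X_n with X_i i.i.d. Exp(lam) matches the
# Gamma(n, lam) moments. lam, n and reps are illustrative.
rng = np.random.default_rng(1)
lam, n, reps = 1.5, 5, 200_000

s_n = rng.exponential(1 / lam, size=(reps, n)).sum(axis=1)
print(s_n.mean(), n / lam)      # Gamma mean: n/lam
print(s_n.var(), n / lam**2)    # Gamma variance: n/lam^2
```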

Conditional Distribution of the Arrival Times

Given that one event has occurred by time \(t\), the time at which it occurred is uniformly distributed on \((0,t)\): \(P\{X_1<s|N(t) = 1\}=s/t\). Generalization: given that \(N(t) = n\), the \(n\) arrival times \(S_1,\ldots,S_n\) have the same distribution as the order statistics of \(n\) independent random variables uniformly distributed on \((0,t)\): \(f(t_1,\ldots,t_n)=\lim_{h_i\to 0}P\{t_i\le S_i \le t_i+h_i,\ i=1,\ldots,n|N(t)=n\}/(h_1h_2\cdots h_n)=n!/t^n\)
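
A sketch of the order-statistics property under illustrative parameters: conditioning simulated paths on \(N(t)=n\), the first arrival should average \(t/(n+1)\), the mean of the minimum of \(n\) uniforms.

```python
import numpy as np

# Conditional on N(t) = n, the first arrival S_1 has the law of the
# minimum of n i.i.d. Uniform(0, t) draws, with mean t/(n+1).
# lam, t and n are illustrative.
rng = np.random.default_rng(2)
lam, t, n, paths = 1.0, 5.0, 5, 100_000

arrivals = np.cumsum(rng.exponential(1 / lam, size=(paths, 30)), axis=1)
cond = arrivals[(arrivals <= t).sum(axis=1) == n]   # keep paths with N(t)=n
print(cond[:, 0].mean(), t / (n + 1))               # ~ 5/6
```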

Nonhomogeneous

  1. \(N(0) = 0\)
  2. \(\{N(t),t \ge 0\}\) has independent increments
  3. \(P\{N(t +h) - N(t) \ge 2\}= o(h)\)
  4. \(P\{N(t+h)- N(t)=1\}=\lambda(t)h+o(h)\). Let \(m(t)=\int_0^t\lambda(s)ds\); then \(P\{N(t+s)-N(t)=n\}=\) \(\exp\{-(m(t+s)-m(t))\}[m(t+s)-m(t)]^n/n!\), i.e. the homogeneous form with \(\lambda t\) replaced by \(m(t+s)-m(t)\). If \(\lambda(t)\le\lambda\), we can think of the nonhomogeneous process as a random sample from a homogeneous Poisson process of rate \(\lambda\), keeping an event at time \(t\) with probability \(\frac{\lambda(t)}{\lambda}\) (thinning; see the sketch below).
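
A thinning sketch, assuming the illustrative intensity \(\lambda(t)=2+\sin t\) bounded by \(\lambda=3\):

```python
import numpy as np

# Thinning: simulate a nonhomogeneous Poisson process by keeping each
# event of a rate-3 homogeneous process with probability lambda(t)/3.
rng = np.random.default_rng(3)
lam, horizon = 3.0, 20.0
intensity = lambda t: 2.0 + np.sin(t)

t, kept = 0.0, []
while True:
    t += rng.exponential(1 / lam)            # homogeneous candidate event
    if t > horizon:
        break
    if rng.uniform() < intensity(t) / lam:
        kept.append(t)                       # accepted (thinned) event

# One realization of N(horizon); its mean is m(horizon) = 40 + 1 - cos(20).
print(len(kept), 2 * horizon + 1 - np.cos(horizon))
```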

Compound Poisson Random Variables

\(W=\sum_{i=1}^NX_i,\ N\sim \mathrm{Poisson}(\lambda)\). The moment generating function of \(W\) is \(E[e^{tW}]=\) \(\sum_{n=0}^\infty E[e^{tW}|N=n]P\{N=n\}=\) \(\exp\{\lambda(\phi_X(t)-1)\}\), where \(\phi_X(t)=E[e^{tX_i}]\). Hence \(E[W]=\lambda E[X],\ Var(W)=\lambda E[X^2]\). Property: \(E[Wh(W)]=\lambda E[Xh(W+X)]\), from which \(E[W]=\lambda E[X],\ E[W^2]=\) \(\lambda E[X^2]+\lambda^2(E[X])^2,\ E[W^3]=\) \(\lambda E[X^3]+3\lambda ^2E[X]E[X^2]+\lambda^3(E[X])^3\).
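
A moment check with \(X_i\sim\mathrm{Uniform}(0,1)\) as an illustrative jump distribution:

```python
import numpy as np

# Check of E[W] = lam*E[X] and Var(W) = lam*E[X^2] for W = sum_{i<=N} X_i.
rng = np.random.default_rng(4)
lam, reps = 3.0, 200_000

n = rng.poisson(lam, size=reps)
x = rng.uniform(size=(reps, n.max()))
# Keep only the first N draws in each row, zero out the rest, then sum.
w = np.where(np.arange(n.max()) < n[:, None], x, 0.0).sum(axis=1)

print(w.mean(), lam * 0.5)       # E[X] = 1/2
print(w.var(), lam / 3)          # E[X^2] = 1/3
```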

Markov Chains

\(P_{ij}\) is the transition probability from \(i\) to \(j\); \(\sum_{j=0}^\infty P_{ij}=1\).

Chapman-Kolmogorov Equations

\(P_{ij}^n=P\{X_{n+m}=j|X_m=i\}, n\ge 0, i,j \ge 0\). We have \(P_{ij}^{n+m}=\) \(\sum_{k=0}^\infty P_{ik}^nP_{kj}^m\), in matrix form \(P^{(n+m)}=P^{(n)}P^{(m)}\). Proof: \(P_{ij}^{n+m}=\) \(P\{X_{n+m}=j|X_0=i\}\) \(=\sum_{k=0}^\infty P\{X_{n+m}=j,X_n=k|X_0=i\}\) \(=\sum_{k=0}^\infty P\{X_{n+m}=j|X_n=k,X_0=i\}P\{X_{n}=k|X_0=i\}\) \(=\sum_{k=0}^\infty P_{kj}^mP_{ik}^n\).
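
The matrix identity is easy to verify numerically; the 3-state transition matrix below is an arbitrary illustrative example.

```python
import numpy as np

# Chapman-Kolmogorov in matrix form: P^(n+m) = P^(n) P^(m).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

n, m = 2, 3
lhs = np.linalg.matrix_power(P, n + m)
rhs = np.linalg.matrix_power(P, n) @ np.linalg.matrix_power(P, m)
print(np.allclose(lhs, rhs))   # True
```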

accessible: \(j\) is accessible from state \(i\) if \(\exists n\ge 0\) such that \(P_{ij}^n>0\).

communicate: states \(i\) and \(j\) are accessible to each other. (i) \(i \leftrightarrow i\); (ii) if \(i \leftrightarrow j\), then \(j \leftrightarrow i\); (iii) if \(i \leftrightarrow j\) and \(j \leftrightarrow k\), then \(i \leftrightarrow k\) (Proof: \(P_{ik}^{n+m}=\sum_{r=0}^\infty P_{ir}^nP_{rk}^m\) \(\ge P_{ij}^nP_{jk}^m>0\)).

class : Two states that communicate are said to be in the same class.

irreducible : if there is only one class in a markov chain.

period: State \(i\) is said to have period \(d\) if \(P_{ii}^n=0\) whenever \(n\) is not divisible by \(d\), and \(d\) is the greatest integer with this property. If \(i \leftrightarrow j\), then \(d(i)=d(j)\): period is a class property.

aperiodic: A state with period 1 is said to be aperiodic.

first passage time: \(f_{ij}^{(n)}\) is the probability that, starting in \(i\), the first transition into \(j\) occurs at time \(n\); \(f_{ij}=\sum_{n=1}^\infty f_{ij}^{(n)}\).

In recursive form: \(f_{ij}^{(n)}=\sum_{k\ne j}p_{ik}f_{kj}^{(n-1)}\).

If \(\sum_{n=1}^\infty f_{ij}^{(n)}<1\), there is positive probability of starting from \(i\) and never reaching \(j\). Denote the expected first passage time from \(i\) to \(j\) by \(\mu_{ij}\): \(\mu_{ij}=\infty\) if \(\sum_{n=1}^\infty f_{ij}^{(n)}<1\); \(\mu_{ij}=\sum_{n=1}^\infty nf_{ij}^{(n)}\) if \(\sum_{n=1}^\infty f_{ij}^{(n)}=1\).

recursive form: \(\mu_{ij}=1+\sum_{k\ne j}p_{ik}\mu_{kj}\)
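
For a fixed target \(j\), this recursion is a linear system: deleting row and column \(j\) from \(P\) gives \(Q\), and the vector of mean first passage times solves \((I-Q)\mu=\mathbf{1}\). A minimal sketch, assuming an illustrative 3-state chain:

```python
import numpy as np

# Mean first passage times into target state j from
# mu_ij = 1 + sum_{k != j} p_ik mu_kj.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
j = 0

keep = [s for s in range(len(P)) if s != j]
Q = P[np.ix_(keep, keep)]                  # P with row/column j deleted
mu = np.linalg.solve(np.eye(len(keep)) - Q, np.ones(len(keep)))
print(dict(zip(keep, mu)))                 # mu_{1,0} and mu_{2,0}
```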

recurrent: \(f_{jj}=1\); transient otherwise. \(j\) is recurrent iff \(\sum_{n=1}^\infty P_{jj}^n=\infty\). Recurrence means the expected number of visits is infinite: \(E[\text{number of visits to } j|X_0=j]=\infty\). In a finite-state Markov chain not all states can be transient. If \(i \leftrightarrow j\) and \(i\) is recurrent, then \(j\) is recurrent; the same holds for transience.

If \(i \leftrightarrow j\) and \(j\) is recurrent, then \(f_{ij}=1\).

transient: in a finite-state chain, state \(i\) is transient iff there exists a state \(j\ (j \ne i)\) that is accessible from state \(i\) but not vice versa, that is, state \(i\) is not accessible from state \(j\). Upon entering a transient state, the process might never return to it again. Transience is a class property.

recurrent: not transient; also a class property.

absorbing state: A state that is never left once the process enters that state.

Limit Theorems

Let \(\mu_{jj}\) denote the expected number of transitions needed to return to state \(j\): \(\mu_{jj}=\infty\) if \(j\) is transient; \(\mu_{jj}=\sum_{n=1}^\infty nf_{jj}^{(n)}\) if \(j\) is recurrent. Let \(\pi_j=\lim_{n\rightarrow \infty}P_{jj}^{nd(j)}\); for aperiodic \(i\), \(\mu_{ii}=1/\pi_i\). If \(i\) and \(j\) communicate, then:

  1. \(P\{\lim_{t\rightarrow \infty}N_j(t)/t=1/\mu_{jj}|X_0=i\}=1\)
  2. \(\lim_{n\rightarrow\infty}\sum_{k=1}^nP_{ij}^k/n=1/\mu_{jj}\)
  3. If \(j\) is aperiodic, then \(\lim_{n\rightarrow \infty}P_{ij}^n=1/\mu_{jj}\)
  4. If \(j\) has period \(d\), then \(\lim_{n\rightarrow \infty}P_{jj}^{nd}=d/\mu_{jj}\)

positive recurrent: \(\mu_{jj}<\infty\); equivalently \(\pi_j>0\). A class property.

null recurrent: \(\mu_{jj}=\infty\); equivalently \(\pi_j=0\). A class property.

ergodic: positive recurrent states that are aperiodic are called ergodic states.

stationary: A probability distribution \(\{P_j, j \ge 0\}\) is said to be stationary for the Markov chain if \(P_j=\sum_{i=0}^\infty P_iP_{ij},j\ge 0\). If the initial probability distribution is the stationary distribution, then \(X_n\) will have the same distribution for all \(n\).
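
A sketch of computing the stationary distribution by replacing one redundant balance equation with the normalization (the matrix is illustrative):

```python
import numpy as np

# Stationary distribution: solve pi = pi P together with sum(pi) = 1.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

A = np.vstack([(P.T - np.eye(3))[:-1], np.ones(3)])  # drop one redundant row
pi = np.linalg.solve(A, np.array([0.0, 0.0, 1.0]))
print(pi, pi @ P)   # pi and pi @ P agree
```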

Continuous Time Markov Chains

Markovian property

  1. Independence of earlier states: \(P\{X(t+s)=j|X(s)=i,X(r)=x(r)\}=P\{X(t+s)=j|X(s)=i\}\) \(\forall\, 0\le r<s,\ t>0\)

  2. Independence of the initial time: \(P\{X(t+s)=j|X(s)=i\}=P\{X(t)=j|X(0)=i\}\). If \(P\{X(t+s)=j|X(s)=i\}\) is independent of \(s\), the continuous-time Markov chain is said to have stationary or homogeneous transition probabilities.

transition probability function \(p_{ij}(t)=P\{X(t)=j|X(0)=i\}\); \(\lim_{t\rightarrow 0}p_{ij}(t)=1\) if \(i=j\), \(0\) if \(i\ne j\).

Time spent in a state: by the Markov property, \(P\{T_i>t+s|T_i>s\}=P\{T_i>t\}\) \(\Rightarrow P\{T_i\le t\}=1-e^{-q_it}\), so the holding time \(T_i\) is exponential. \(q_i=1/E[T_i]\) is the rate of leaving state \(i\) per unit time; denoting by \(q_{ij}\) the rate of leaving \(i\) to enter \(j\), we have \(q_i=\sum_{j\ne i}q_{ij}\).

Chapman-Kolmogorov \(p_{ij}(t)=\sum_{k=0}^M p_{ik}(s)p_{kj}(t-s)\)

steady state: \(\lim_{t\rightarrow\infty}p_{ij}(t)=\pi_j\), \(\pi_j=\sum_{i=0}^M\pi_ip_{ij}(t)\). In practice one uses the simpler balance equations: \(\pi_jq_j=\sum_{i\ne j}\pi_i q_{ij},\ \sum_{j=0}^M\pi_j=1\) (solved numerically below).
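
A sketch of solving the balance equations via the generator \(Q\) (off-diagonal entries \(q_{ij}\), diagonal \(-q_i\)); the rates form an illustrative 3-state birth-death chain:

```python
import numpy as np

# CTMC steady state: pi Q = 0 with sum(pi) = 1 encodes the balance
# equations pi_j q_j = sum_{i != j} pi_i q_{ij}.
Q = np.array([[-2.0,  2.0,  0.0],
              [ 1.0, -3.0,  2.0],
              [ 0.0,  1.0, -1.0]])

A = np.vstack([Q.T[:-1], np.ones(3)])    # drop one redundant equation
pi = np.linalg.solve(A, np.array([0.0, 0.0, 1.0]))
print(pi)   # (1/7, 2/7, 4/7) for these rates
```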

Martingales

Martingales: \(\{Z_n,n\ge1\}\) is a martingale if \(E[|Z_n|]<\infty,\forall n\) and \(E[Z_{n+1}|Z_1,Z_2,\cdots,Z_n]=Z_n\).

Taking expectations gives \(E[Z_{n+1}]=E[Z_n]\), hence \(E[Z_n]=E[Z_1],\forall n\).

Conditional Expectation: \(E(aY_1+bY_2|F_n)=aE(Y_1|F_n)+bE(Y_2|F_n)\). If \(Y\) is a function of \(X_1,\ldots,X_n\), then \(E[Y|F_n]=Y\); if \(Y\) is an arbitrary r.v. and \(Z\) is measurable with respect to \(X_1,\ldots,X_n\), then \(E[YZ|F_n]=ZE[Y|F_n]\).

Sums of independent zero-mean r.v.'s form a martingale: let \(X_i\) be independent with mean 0; then \(Z_n=\sum_{i=1}^nX_i\) is a martingale, since \(E[Z_{n+1}|Z_1,\cdots,Z_n]\) \(=E[Z_n+X_{n+1}|Z_1,\cdots,Z_n]\) \(=E[Z_n|Z_1,\cdots,Z_n]+E[X_{n+1}|Z_1,\cdots,Z_n]\) \(=Z_n+E[X_{n+1}]=Z_n\). Note \(Z_n=E[Z_n|Z_1,\cdots,Z_n]\) because \(Z_n\) is given.

Products of independent mean-one r.v.'s form a martingale: let \(X_i\) be independent with mean 1; then \(Z_n=\prod_{i=1}^nX_i\) is a martingale, since \(E[Z_{n+1}|Z_1,\cdots,Z_n]\) \(=E[Z_nX_{n+1}|Z_1,\cdots,Z_n]\) \(=Z_nE[X_{n+1}|Z_1,\cdots,Z_n]\) \(=Z_nE[X_{n+1}]=Z_n\). (Both constructions are checked numerically below.)
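
A simulation sketch of both constructions, with \(\pm1\) steps (mean 0) and \(\mathrm{Uniform}(0,2)\) factors (mean 1) as illustrative distributions:

```python
import numpy as np

# E[Z_n] = E[Z_1] for every n, for both the sum and product martingales.
rng = np.random.default_rng(5)
steps = rng.choice([-1.0, 1.0], size=(200_000, 10))   # mean-0 steps
factors = rng.uniform(0.0, 2.0, size=(200_000, 10))   # mean-1 factors

print(steps.cumsum(axis=1).mean(axis=0))     # each column ~ 0
print(factors.cumprod(axis=1).mean(axis=0))  # each column ~ 1
```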

Queueing Theory

A queue is described by: distribution of interarrival times / distribution of service times / number of servers. Examples: M/M/s, M/G/1. M = exponential distribution (Markovian); D = degenerate distribution (constant times); \(E_k\) = Erlang distribution (shape parameter \(k\)); G = general distribution (any arbitrary distribution allowed).

Stochastic Process Models

M/G/1 Queue

Interarrival times are exponential with rate \(\lambda\); service times have a general distribution \(G\). Let \(X_n\) denote the number of customers left behind by the \(n\)th departure, and let \(Y_n\) denote the number of customers arriving during the service period of the \((n + 1)\)st customer. We have \(X_{n+1}=X_n-1+Y_n\) if \(X_n > 0\); \(X_{n+1}=Y_n\) if \(X_n=0\). \(P\{Y_n=j\}=\int_0^\infty e^{-\lambda x}(\lambda x)^j/j!\,dG(x)\) \(\Rightarrow\) \(P_{0j}=\int_0^\infty e^{-\lambda x}(\lambda x)^j/j!\,dG(x)\), \(P_{ij}=\int_0^\infty e^{-\lambda x}(\lambda x)^{j-i+1}/(j-i+1)!\,dG(x)\) for \(j\ge i-1,\ i\ge 1\), and \(P_{ij}=0\) otherwise.
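
For deterministic service (the M/D/1 special case, an illustrative choice rather than the general \(G\)), the integral collapses to a Poisson pmf evaluated at the fixed service time \(d\):

```python
import numpy as np
from math import factorial

# M/D/1 sketch: with G concentrated at d, P_{0j} = e^{-lam*d}(lam*d)^j/j!.
# lam and d are illustrative.
lam, d = 1.0, 0.8

def p0j(j):
    return np.exp(-lam * d) * (lam * d) ** j / factorial(j)

print([round(p0j(j), 4) for j in range(5)])   # first transition probs
print(sum(p0j(j) for j in range(60)))         # ~ 1.0 (row sum)
```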

G/M/1 Queue

Let \(X_n\) denote the number of customers in the system as seen by the \(n\)th arrival; service rate is \(\mu\) and interarrival times have distribution \(G\). \(P_{i,i+1-j}=\int_0^\infty e^{-\mu t}(\mu t)^j/j!\,dG(t)\), \(j=0,\cdots,i\); \(P_{i0}=\int_0^\infty \sum_{k=i+1}^\infty e^{-\mu t}(\mu t)^k/k!\,dG(t),\ i\ge 0\).

Gambler’s Ruin Problem

\(p_{0,0}=p_{N,N}=1,\ p_{i,i+1}=p=1-p_{i,i-1}\). Solution: let \(f_i\) denote the probability that, starting with fortune \(i\), the fortune eventually reaches \(N\). We obtain \(f_i=pf_{i+1}+qf_{i-1}\) \(\Rightarrow f_{i+1}-f_i=\frac{q}{p}(f_i-f_{i-1}),\ldots, f_N-f_{N-1}=(q/p)^{N-1}f_1\) \(\Rightarrow f_i-f_1=f_1[q/p+(q/p)^2+\cdots+(q/p)^{i-1}]\) \(\Rightarrow f_i=\frac{1-(q/p)^i}{1-(q/p)^N}\) if \(p\ne 1/2\); \(f_i=i/N\) if \(p=1/2\). The probability of being absorbed at 0 is \(1-f_i\).
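
The closed form can be checked against a direct simulation; \(p\), \(N\) and the starting fortune are illustrative:

```python
import numpy as np

# Gambler's ruin: closed form f_i = (1-(q/p)^i)/(1-(q/p)^N) vs. simulation.
rng = np.random.default_rng(6)
p, N, i0, reps = 0.55, 10, 3, 20_000
q = 1 - p

wins = 0
for _ in range(reps):
    i = i0
    while 0 < i < N:                          # play until absorption
        i += 1 if rng.uniform() < p else -1
    wins += i == N

print(wins / reps, (1 - (q / p) ** i0) / (1 - (q / p) ** N))  # ~ 0.52
```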

Alarm Clock Problem

\(n\) clocks ring at exponentially distributed times with parameters \(b_1,\ldots,b_n\): \(P(T_i>t)=\exp(-b_i t)\). Let \(T=\min\{T_1,\ldots,T_n\}\); then \(P(T\ge t)=P(T_1\ge t)P(T_2\ge t)\cdots P(T_n\ge t)=\exp(-(b_1+\cdots+b_n)t)\).

Probability that \(T_1\) rings first: \(P(T_1=T)=\int_0^\infty P(T_2>t,\cdots,T_n>t)f_{T_1}(t)dt\) \(=\int_0^\infty \exp(-(b_2+\cdots+b_n)t)\,b_1\exp(-b_1 t)dt=b_1/(b_1+\cdots+b_n)\).
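
A one-line numerical check with illustrative rates:

```python
import numpy as np

# P(T_1 rings first) = b_1 / (b_1 + ... + b_n); numpy broadcasts the
# per-clock scales 1/b across the sample rows.
rng = np.random.default_rng(7)
b = np.array([1.0, 2.0, 3.0])

times = rng.exponential(1 / b, size=(200_000, 3))
print((times.argmin(axis=1) == 0).mean(), b[0] / b.sum())   # ~ 1/6
```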

Absolute Value of Simple Random Walk

\(S_n=\sum_{i=1}^nX_i\), \(P\{X_i =1\}=p\), \(P\{X_i =-1\}=1-p=q\). Given \(|S_n|=i,\ldots,|S_1|=i_1\), let \(j=\max\{k:0\le k\le n,\ i_k=0\}\) be the time of the last visit to 0. Then \(P\{S_n=i\,|\,|S_n|=i,\cdots,|S_1|=i_1\}=\) \(P\{S_n=i\,|\,|S_n|=i,\cdots,|S_j|=0\}\). Reaching \(+i\) from 0 in \(n-j\) steps takes \((n-j)/2+i/2\) up-steps and \((n-j)/2-i/2\) down-steps, so \(P(S_n=i|\cdot)\propto p^{(n-j)/2+i/2}q^{(n-j)/2-i/2}\) and \(P(S_n=-i|\cdot)\propto p^{(n-j)/2-i/2}q^{(n-j)/2+i/2}\), giving \(P(S_n=i|\cdot)=p^i/(p^i+q^i)\). Therefore \(P\{|S_{n+1}|=i+1\,|\,|S_n|=i,|S_{n-1}|,\cdots,|S_1|\}\) \(=P\{S_{n+1}=i+1|S_n=i\}p^i/(p^i+q^i)\) \(+P\{S_{n+1}=-(i+1)|S_n=-i\}q^i/(p^i+q^i)\) \(=(p^{i+1}+q^{i+1})/(p^i+q^i)\). Hence \(P_{i,i+1}=(p^{i+1}+q^{i+1})/(p^i+q^i)\) \(=1-P_{i,i-1},\ i>0\), and \(P_{01}=1\).

Simple Random Walk

\(P_{i,i+1}=p=1-P_{i,i-1}\). A return to 0 cannot occur in an odd number of steps, i.e. \(P_{00}^{2n+1}=0\), while \(P_{00}^{2n}=\binom{2n}{n}p^n(1-p)^n\) \(=\frac{(2n)!}{n!n!}(p(1-p))^n\). By Stirling's formula, \(P_{00}^{2n}\sim (4p(1-p))^n/\sqrt{\pi n}\). The state is recurrent iff \(\sum_{n=1}^\infty (4p(1-p))^n/\sqrt{\pi n}=\infty\), which holds only when \(p=1/2\); otherwise it is transient.
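
A numeric illustration of the criterion (truncating at \(n=10^6\), an arbitrary cutoff):

```python
import numpy as np

# Partial sums of the Stirling approximation (4p(1-p))^n / sqrt(pi*n)
# keep growing for p = 1/2 but converge for any other p.
n = np.arange(1, 1_000_001, dtype=float)
for p in (0.5, 0.6):
    terms = (4 * p * (1 - p)) ** n / np.sqrt(np.pi * n)
    print(p, terms.sum())   # p=0.5: ~ 2*sqrt(10^6/pi), still growing; p=0.6: finite
```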