Note 15 - Mar 07

Review

Positive recurrence is related to the existence of the stationary distribution.

Generating function:

$$ \psi(s)=\mathbb{E}(s^\xi)=\sum_{k=0}^\infty p_k s^k,\quad\text{where }\mathbb{P}(\xi=k)=p_k,\ k=0,1,\ldots,\ \text{for } 0\leq s\leq 1 $$
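As a quick numerical sanity check on this definition, the sketch below evaluates the truncated series for an assumed example distribution (Poisson(λ), not from the notes), whose generating function has the known closed form $e^{\lambda(s-1)}$:

```python
import math

# Assumed example distribution (not from the notes): xi ~ Poisson(lam),
# so p_k = e^{-lam} lam^k / k!, and the true g.f. is exp(lam*(s-1)).
def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def psi(s, lam, terms=50):
    # psi(s) = E(s^xi) = sum_{k>=0} p_k s^k, truncated after `terms` terms
    return sum(poisson_pmf(k, lam) * s ** k for k in range(terms))

lam = 2.0
print(psi(0.0, lam))  # psi(0) = p_0 = e^{-2}
print(psi(1.0, lam))  # psi(1) = sum_k p_k = 1
print(psi(0.5, lam), math.exp(lam * (0.5 - 1.0)))  # matches the closed form
```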

Properties

  1. $\psi(0)=p_0$, $\psi(1)=\sum_{k=0}^\infty p_k=1$
  2. The generating function determines the distribution: $$ p_k=\frac{1}{k!}\frac{d^k\psi(s)}{ds^k}\bigg|_{s=0}$$ since $$ \frac{d^k\psi(s)}{ds^k}=k!p_k+(\cdots)s+(\cdots)s^2+\cdots $$

Since $p_k\geq 0$ for all $k=0,1,\ldots$, we have $\frac{d^k\psi(s)}{ds^k}\geq 0$ for all $k=1,2,\ldots$ and $s\in[0,1]$.

In particular, $\psi(s)$ is increasing and convex on $[0,1]$.

4. Stochastic Processes (cont'd)

4.6 Generating function and branching processes

Properties of generating function

  1. $\psi(0)=p_0$, $\psi(1)=\sum_{k=0}^\infty p_k=1$

  2. The generating function determines the distribution: $$ p_k=\frac{1}{k!}\frac{d^k\psi(s)}{ds^k}\bigg|_{s=0}$$ Reason: $$ \psi(s)=p_0+p_1s+\cdots+p_{k-1}s^{k-1}+p_ks^k+p_{k+1}s^{k+1}+\cdots $$ $$ \frac{d^k\psi(s)}{ds^k}=k!p_k+(\cdots)s+(\cdots)s^2+\cdots $$ $$ \frac{d^k\psi(s)}{ds^k}\bigg|_{s=0}=k!p_k $$ In particular, since all $p_k\geq 0$, we have $\psi'(s)\geq 0$ and $\psi''(s)\geq 0$ on $[0,1]$, so $\psi(s)$ is increasing and convex.

  3. Let $\xi_1,\ldots,\xi_n$ be independent r.v.'s with generating functions $\psi_1,\ldots,\psi_n$, and $$ X=\xi_1+\cdots+\xi_n \Rightarrow \psi_X(s)=\psi_1(s)\psi_2(s)\cdots\psi_n(s)$$ Proof: $$ \begin{aligned} \psi_X(s) &= \mathbb{E}(s^X) \\ &= \mathbb{E}(s^{\xi_1}s^{\xi_2}\cdots s^{\xi_n}) \\ &= \mathbb{E}(s^{\xi_1})\cdots\mathbb{E}(s^{\xi_n}) \quad\text{(independence)} \\ &= \psi_1(s)\cdots\psi_n(s) \end{aligned} $$

  4. $$\frac{d^k\psi(s)}{ds^k}\bigg|_{s=1} = \frac{d^k\mathbb{E}(s^\xi)}{ds^k}\bigg|_{s=1} = \mathbb{E}\bigg(\frac{d^ks^\xi}{ds^k}\bigg)\bigg|_{s=1} = \mathbb{E}(\xi(\xi-1)(\xi-2)\cdots(\xi-k+1)s^{\xi-k})\bigg|_{s=1} = \mathbb{E}(\xi(\xi-1)\cdots(\xi-k+1))$$ In particular, $\mathbb{E}(\xi)=\psi'(1)$ and $Var(\xi)=\mathbb{E}(\xi^2)-(\mathbb{E}(\xi))^2=\mathbb{E}(\xi^2-\xi)+\mathbb{E}(\xi)-(\mathbb{E}(\xi))^2=\psi''(1)+\psi'(1)-(\psi'(1))^2$

    Graph of a g.f.:
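Properties 3 and 4 can be checked numerically for small finite-support distributions; a minimal sketch, where the particular pmfs `p` and `q` are assumed examples rather than anything from the notes:

```python
# A distribution on {0,1,2,...} is represented by its probability list [p_0, p_1, ...].
def gf(p, s):
    # psi(s) = sum_k p_k s^k
    return sum(pk * s ** k for k, pk in enumerate(p))

def convolve(p, q):
    # pmf of xi_1 + xi_2 for independent xi_1 ~ p and xi_2 ~ q
    r = [0.0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            r[i + j] += pi * qj
    return r

p = [0.2, 0.5, 0.3]   # example distribution on {0, 1, 2}
q = [0.4, 0.6]        # Bernoulli(0.6)
s = 0.7

# Property 3: the g.f. of a sum of independent r.v.'s is the product of the g.f.'s
print(gf(convolve(p, q), s), gf(p, s) * gf(q, s))  # the two numbers agree

# Property 4: E(xi) = psi'(1) and Var(xi) = psi''(1) + psi'(1) - psi'(1)^2
mean = sum(k * pk for k, pk in enumerate(p))              # psi'(1)
second = sum(k * (k - 1) * pk for k, pk in enumerate(p))  # psi''(1)
print(mean, second + mean - mean ** 2)  # E(xi) = 1.1, Var(xi) = 0.49
```

For a polynomial g.f., $\psi'(1)=\sum_k kp_k$ and $\psi''(1)=\sum_k k(k-1)p_k$ are exact, so no numerical differentiation is needed.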

4.6.1 Branching Process

Each organism, at the end of its life, produces a random number $Y$ of offspring.

$$ \mathbb{P}(Y=k)=p_k,\quad k=0,1,2,\ldots,\quad p_k\geq 0,\quad \sum_{k=0}^\infty p_k=1 $$

The numbers of offspring of different individuals are independent.

Start from one ancestor: $X_0=1$. Let $X_n$ denote the number of individuals (the population) in the $n$-th generation.

Then $X_{n+1}=Y_1^{(n)}+Y_2^{(n)}+\cdots+Y_{X_n}^{(n)}$, where $Y_1^{(n)},\ldots,Y_{X_n}^{(n)}$ are independent copies of $Y$, and $Y_i^{(n)}$ is the number of offspring of the $i$-th individual in the $n$-th generation.
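The recursion above translates directly into a simulation; a minimal sketch, where the offspring law is an assumed example (not from the notes):

```python
import random

# Assumed example offspring law (not from the notes): P(Y=0)=P(Y=2)=1/4, P(Y=1)=1/2
def offspring():
    return random.choices([0, 1, 2], weights=[0.25, 0.5, 0.25])[0]

def branching_path(n_gen, seed=0):
    # Simulate X_0, X_1, ..., X_{n_gen} via X_{n+1} = Y_1^{(n)} + ... + Y_{X_n}^{(n)}
    random.seed(seed)
    X = 1              # one ancestor: X_0 = 1
    path = [X]
    for _ in range(n_gen):
        # sum over an empty range is 0, so once X = 0 the population stays at 0
        X = sum(offspring() for _ in range(X))
        path.append(X)
    return path

print(branching_path(10))
```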

4.6.1.1 Mean and Variance

Mean: $\mathbb{E}(X_n)$ and variance: $Var(X_n)$.

Assume $\mathbb{E}(Y)=\mu$ and $Var(Y)=\sigma^2$.

$$ \mathbb{E}(X_{n+1})=\mathbb{E}(Y_1^{(n)}+\cdots+Y_{X_n}^{(n)})=\mathbb{E}\big(\mathbb{E}(Y_1^{(n)}+\cdots+Y_{X_n}^{(n)}\mid X_n)\big)=\mathbb{E}(X_n\mu)=\mu\mathbb{E}(X_n) $$ by Wald's identity (tutorial 3).

$$ \mathbb{E}(X_n)=\mu\mathbb{E}(X_{n-1})=\mu^2\mathbb{E}(X_{n-2})=\cdots=\mu^n\mathbb{E}(X_0)=\mu^n,\quad n=0,1,\ldots $$
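A Monte Carlo check of $\mathbb{E}(X_n)=\mu^n$, using an assumed example offspring law with $\mu=1.1$ (the pmf and trial count are illustration choices, not from the notes):

```python
import random

def step(x, pmf):
    # One generation: sum of x i.i.d. offspring counts drawn from pmf on {0,1,2,...}
    ks = list(range(len(pmf)))
    return sum(random.choices(ks, weights=pmf)[0] for _ in range(x))

def mc_mean(n, pmf, trials=20000, seed=1):
    # Monte Carlo estimate of E(X_n), starting each run from X_0 = 1
    random.seed(seed)
    total = 0
    for _ in range(trials):
        x = 1
        for _ in range(n):
            x = step(x, pmf)
        total += x
    return total / trials

pmf = [0.2, 0.5, 0.3]              # assumed example law: mu = E(Y) = 1.1
print(mc_mean(3, pmf), 1.1 ** 3)   # the estimate should be close to mu^3 = 1.331
```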

$$ Var(X_{n+1}) = \mathbb{E}(Var(X_{n+1}|X_n))+Var(\mathbb{E}(X_{n+1}|X_n)) $$

$$ \begin{aligned} \mathbb{E}(Var(X_{n+1}|X_n)) &= \mathbb{E}(Var(Y_1^{(n)}+\cdots+Y_{X_n}^{(n)}|X_n)) \\ &= \mathbb{E}(X_n\cdot\sigma^2) \\ &= \sigma^2\mu^n \end{aligned} \quad\quad \begin{aligned} Var(\mathbb{E}(X_{n+1}|X_n)) &= Var(\mu X_n) \\ &= \mu^2 Var(X_n) \end{aligned} $$

$$ \Rightarrow Var(X_{n+1}) = \sigma^2\mu^n+\mu^2 Var(X_n) $$

$$ \begin{aligned} Var(X_1)&=\sigma^2 \\ Var(X_2)&=\sigma^2\mu+\mu^2\sigma^2=\sigma^2(\mu+\mu^2) \\ Var(X_3)&=\sigma^2\mu^2+\mu^2\sigma^2(\mu+\mu^2)=\sigma^2(\mu^2+\mu^3+\mu^4) \\ &\;\;\vdots \end{aligned} $$

In general (can be proved by induction),

$$ Var(X_n)=\sigma^2(\mu^{n-1}+\cdots+\mu^{2n-2})=\begin{cases} \sigma^2\mu^{n-1}\dfrac{1-\mu^n}{1-\mu} & \mu\neq 1 \\ \sigma^2 n & \mu=1 \end{cases} $$
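The closed form can be checked against the recursion $Var(X_{n+1})=\sigma^2\mu^n+\mu^2 Var(X_n)$ deterministically; the parameter values below are assumed examples:

```python
def var_recursive(n, mu, sigma2):
    # Iterate Var(X_{k+1}) = sigma^2 * mu^k + mu^2 * Var(X_k), with Var(X_0) = 0
    v = 0.0
    for k in range(n):
        v = sigma2 * mu ** k + mu ** 2 * v
    return v

def var_closed(n, mu, sigma2):
    # Var(X_n) = sigma^2 mu^{n-1} (1 - mu^n)/(1 - mu), or sigma^2 * n when mu = 1
    if mu == 1:
        return sigma2 * n
    return sigma2 * mu ** (n - 1) * (1 - mu ** n) / (1 - mu)

for mu in (0.8, 1.0, 1.3):   # subcritical, critical, supercritical examples
    print(var_recursive(5, mu, 0.49), var_closed(5, mu, 0.49))  # each pair agrees
```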

4.6.1.2 Extinction Probability

Q: What is the probability that the population size is eventually reduced to 0?

Note that for a branching process, $X_n=0 \Rightarrow X_k=0$ for all $k\geq n$. Thus, state 0 is absorbing ($P_{00}=1$). Let $N$ be the time that extinction happens: $$ N=\min\{n: X_n=0\} $$ Define $$U_n=\mathbb{P}(\underbrace{N\leq n}_{\text{extinction happens}\atop\text{before or at }n})=\mathbb{P}(X_n=0)$$ Then $U_n$ is increasing in $n$, and $$ \begin{aligned} u=\lim_{n\rightarrow\infty}U_n &= \mathbb{P}(N<\infty) \\ &= \mathbb{P}(\text{the extinction eventually happens}) \\ &= \text{extinction probability} \end{aligned} $$ Our goal: find $u$.

We have the following relation between $U_n$ and $U_{n-1}$: $$ U_n=\sum_{k=0}^\infty p_k(U_{n-1})^k = \underbrace{\psi}_{\text{g.f. of }Y}(U_{n-1}) $$

Each subpopulation has the same distribution as the whole population.

The total population dies out in $n$ steps if and only if each subpopulation dies out in $n-1$ steps.

$$ \begin{aligned} U_n &= \mathbb{P}(N\leq n) \\ &= \sum_k \mathbb{P}(N\leq n|X_1 = k)\underbrace{\mathbb{P}(X_1=k)}_{=p_k} \\ &=\sum_k \mathbb{P}(\underbrace{N_1\leq n-1}_{\text{\# of steps for subpopulation 1 to die out}},\cdots, N_k\leq n-1|X_1=k)\cdot p_k \\ &= \sum_k p_k\cdot U_{n-1}^k \\ &= \mathbb{E}(U_{n-1}^Y) \\ &= \psi(U_{n-1}) \end{aligned} $$

Thus, the question is:

With initial value $U_0=0$ (since $X_0=1$) and the relation $U_n=\psi(U_{n-1})$, what is $\lim_{n\rightarrow\infty}U_n=u$?

Recall that we have

  1. $\psi(0)=p_0\geq 0$
  2. $\psi(1)=1$
  3. $\psi(s)$ is increasing
  4. $\psi(s)$ is convex

Drawing $\psi(s)$ and the function $f(s)=s$ between 0 and 1, we have two cases (depending on whether $\mu=\psi'(1)\leq 1$ or $\mu>1$):

The extinction probability $u$ is the smallest intersection of $\psi(s)$ and $f(s)$. Equivalently, it is the smallest solution of the equation $\psi(s)=s$ in $[0,1]$.
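The iteration $U_0=0$, $U_n=\psi(U_{n-1})$ also gives a practical way to compute $u$; a minimal sketch, where both offspring laws are assumed examples (not from the notes):

```python
def psi(s, pmf):
    # Generating function of the offspring distribution pmf = [p_0, p_1, ...]
    return sum(pk * s ** k for k, pk in enumerate(pmf))

def extinction_prob(pmf, tol=1e-12, max_iter=100000):
    # U_0 = 0, U_n = psi(U_{n-1}); the limit is the smallest root of psi(s) = s in [0, 1]
    u = 0.0
    for _ in range(max_iter):
        u_next = psi(u, pmf)
        if abs(u_next - u) < tol:
            return u_next
        u = u_next
    return u

# Supercritical example: psi(s) = 1/4 + s/4 + s^2/2, and psi(s) = s has roots 1/2 and 1
print(extinction_prob([0.25, 0.25, 0.5]))  # converges to the smallest root, 0.5
# Subcritical example (mu = 3/4 < 1): extinction is certain
print(extinction_prob([0.5, 0.25, 0.25]))  # converges to 1.0
```

Starting the iteration from $U_0=0$ is what selects the *smallest* root: the sequence $U_n$ is increasing and can never cross a fixed point of the increasing function $\psi$.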