Limit Theorem for Markov Chains

S_n = base at generation n
P_ij = P(S_{n+1} = j | S_n = i)

If P_ij > 0 for all i, j (and ∑_j P_ij = 1 for all i), then there is a unique vector r such that r = rP and lim_{n→∞} q P^n = r for any probability vector q.

r is called the "stationary" or "limiting" distribution of P.

See Ch. 4, Taylor & Karlin, An Introduction to Stochastic Modeling, 1984 for details.
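The theorem is easy to check numerically: pick any starting probability vector q, apply P repeatedly, and the result converges to the same r regardless of q. A minimal sketch with NumPy, using a hypothetical 4-base substitution matrix (the entries below are illustrative, not measured rates):

```python
import numpy as np

# Hypothetical substitution matrix over (A, C, G, T):
# every entry > 0 and every row sums to 1, so the theorem applies.
P = np.array([
    [0.90, 0.04, 0.03, 0.03],
    [0.04, 0.90, 0.03, 0.03],
    [0.03, 0.03, 0.90, 0.04],
    [0.03, 0.03, 0.04, 0.90],
])

# Start from a concentrated distribution; any other choice gives the same limit.
q = np.array([1.0, 0.0, 0.0, 0.0])
r = q @ np.linalg.matrix_power(P, 200)

print(np.round(r, 4))  # → [0.25 0.25 0.25 0.25] (P is doubly stochastic, so r is uniform)
print(np.allclose(r, r @ P))  # r is a fixed point: r = rP
```

Because this particular P is symmetric (hence doubly stochastic), its stationary distribution is uniform; an asymmetric P would converge to a skewed r instead.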
Stationary Distribution Examples

2-letter alphabet: R = purine, Y = pyrimidine

Stationary distributions for:

  I  = [1   0;  0   1]
  Q  = [0   1;  1   0]
  P  = [1-p p;  p 1-p],  0 < p < 1
  P' = [1-p p;  q 1-q],  0 < p < 1, 0 < q < 1
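The four matrices behave differently: for I every distribution is stationary (the P_ij > 0 hypothesis fails, so uniqueness is lost); Q has stationary distribution (1/2, 1/2) but its iterates oscillate rather than converge; P and P' satisfy the theorem. A quick numerical check of the last two, with illustrative values p = 0.1, q = 0.3 (assumed here, not given on the slide):

```python
import numpy as np

def stationary(P, steps=500):
    """Approximate the limiting distribution by iterating q -> q P."""
    q0 = np.array([1.0, 0.0])  # arbitrary starting distribution on {R, Y}
    return q0 @ np.linalg.matrix_power(P, steps)

p, q = 0.1, 0.3                               # illustrative rates
P  = np.array([[1 - p, p], [p, 1 - p]])       # symmetric chain
Pp = np.array([[1 - p, p], [q, 1 - q]])       # asymmetric chain P'

r_sym  = stationary(P)    # symmetry forces the uniform limit (1/2, 1/2)
r_asym = stationary(Pp)   # closed form: (q/(p+q), p/(p+q)) = (0.75, 0.25)
print(r_sym, r_asym)
```

The closed form for P' follows from solving r = rP' with r summing to 1: the chain spends more time in the state that is harder to leave.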
How are mutation rates measured?
How does entropy change when a Markov transition matrix is applied?

If the limiting distribution is uniform, then entropy increases (analogous to the 2nd Law of Thermodynamics).

However, this is not true in general (why not?)
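A counterexample makes the "why not?" concrete: if the limiting distribution is itself non-uniform, a chain started at maximum entropy must lose entropy on its way there. A sketch with two illustrative 2x2 matrices (values chosen for demonstration):

```python
import numpy as np

def entropy_bits(d):
    """Shannon entropy of a probability vector, in bits."""
    d = d[d > 0]
    return float(-(d * np.log2(d)).sum())

# Case 1: limiting distribution is uniform -> entropy rises (2nd-Law analogy).
P_uniform = np.array([[0.9, 0.1], [0.1, 0.9]])
h_before = entropy_bits(np.array([1.0, 0.0]))              # 0 bits
h_after  = entropy_bits(np.array([1.0, 0.0]) @ P_uniform)  # H(0.9, 0.1) ≈ 0.469 bits

# Case 2: limiting distribution is (0.75, 0.25) -> starting from the
# uniform (maximum-entropy) distribution, entropy FALLS.
P_skewed = np.array([[0.9, 0.1], [0.3, 0.7]])
g_before = entropy_bits(np.array([0.5, 0.5]))              # 1 bit (maximum)
g_after  = entropy_bits(np.array([0.5, 0.5]) @ P_skewed)   # H(0.6, 0.4) ≈ 0.971 bits

print(h_before, h_after, g_before, g_after)
```

So entropy flows toward the entropy of the stationary distribution, which only coincides with "entropy always increases" when that limit is uniform.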
How rapidly is the stationary distribution approached?
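One standard answer (not spelled out on the slide, so treat this as supplementary): convergence is geometric, with the per-step contraction given by the second-largest eigenvalue modulus of P. For the two-state chain P' the eigenvalues are 1 and 1 - p - q, and the distance to r shrinks by exactly that factor each step:

```python
import numpy as np

p, q = 0.1, 0.3                           # illustrative rates, as before
P = np.array([[1 - p, p], [q, 1 - q]])
r = np.array([q, p]) / (p + q)            # exact stationary distribution

lam2 = 1 - p - q                          # second eigenvalue of this chain

# Track the L1 distance from the current distribution to r.
d = np.array([1.0, 0.0])
dists = []
for _ in range(5):
    dists.append(np.abs(d - r).sum())
    d = d @ P

ratios = [dists[k + 1] / dists[k] for k in range(4)]
print(ratios)  # each ratio ≈ lam2 = 0.6: error shrinks geometrically
```

The closer |1 - p - q| is to 1, the slower the chain mixes; for larger alphabets the same role is played by the largest eigenvalue of P below 1.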