Stochastic Process

{X_t | t ∈ T}:  time t ∈ T,  state space Ω,  state X_t ∈ Ω

• discrete time: T is countable, T = {0, 1, 2, …}
• discrete space: Ω is finite or countably infinite

X_0, X_1, X_2, …
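As a concrete (hypothetical) illustration of these definitions, the Python sketch below simulates one sample path of a discrete-time, discrete-space stochastic process: X_t counts the heads seen in the first t fair coin flips, so T = {0, 1, 2, …} and Ω is the set of nonnegative integers. The process itself is an assumed example, not one taken from the slides.

```python
# Hypothetical example: X_t = number of heads in the first t fair coin flips.
# Discrete time (t = 0, 1, 2, ...) and discrete state space (nonnegative integers).
import random

def sample_path(T, seed=None):
    """Return one realization X_0, X_1, ..., X_T of the process."""
    rng = random.Random(seed)
    path = [0]                                     # X_0 = 0: no flips yet
    for _ in range(T):
        path.append(path[-1] + rng.randint(0, 1))  # +1 on heads, +0 on tails
    return path

print(sample_path(10, seed=0))                     # one sample path of length 11
```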
Markov Property

• dependency structure of X_0, X_1, X_2, …
• Markov property (memoryless): X_{t+1} depends only on X_t

Pr[X_{t+1} = y | X_0 = x_0, …, X_{t-1} = x_{t-1}, X_t = x] = Pr[X_{t+1} = y | X_t = x]
∀t = 0, 1, 2, …,  ∀x_0, x_1, …, x_{t-1}, x, y ∈ Ω

Markov chain: a discrete time, discrete space stochastic process with the Markov property
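A minimal sketch of the memoryless property, assuming a hypothetical random walk on the 6-cycle: the rule generating X_{t+1} takes only the current state X_t as input, so conditioning on the full history cannot change the next-state distribution.

```python
# Hypothetical chain: random walk on the cycle Omega = {0, ..., 5}.
# The next state is sampled from the current state alone (Markov property).
import random

def step(x, rng):
    """Sample X_{t+1} given X_t = x: move to a uniformly random neighbor."""
    return (x + rng.choice([-1, 1])) % 6

def run_chain(x0, T, seed=None):
    rng = random.Random(seed)
    xs = [x0]
    for _ in range(T):
        xs.append(step(xs[-1], rng))   # uses xs[-1] only, never the earlier history
    return xs

print(run_chain(0, 10, seed=1))
```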
Transition Matrix

Markov chain: X_0, X_1, X_2, …

Pr[X_{t+1} = y | X_0 = x_0, …, X_{t-1} = x_{t-1}, X_t = x] = Pr[X_{t+1} = y | X_t = x] = P^{(t)}_{xy} = P_{xy}  (time-homogeneous)

P = (P_{xy})_{x,y ∈ Ω}: rows indexed by the current state x ∈ Ω, columns by the next state y ∈ Ω

P is a stochastic matrix: its entries are nonnegative and every row sums to 1, i.e. P·1 = 1 (where 1 is the all-ones vector).
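A minimal sketch of a time-homogeneous transition matrix, assuming a hypothetical 3-state chain: the entry P[x, y] is Pr[X_{t+1} = y | X_t = x], and the stochastic-matrix condition P·1 = 1 says each row is a probability distribution over the next state.

```python
# Hypothetical 3-state transition matrix: P[x, y] = Pr[X_{t+1} = y | X_t = x].
import numpy as np

P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.0, 0.4, 0.6]])

# Stochastic matrix: nonnegative entries, and P @ 1 = 1 (rows sum to one).
ones = np.ones(3)
assert np.all(P >= 0)
assert np.allclose(P @ ones, ones)
```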
chain: X_0, X_1, X_2, …
distributions: π^{(0)}, π^{(1)}, π^{(2)}, … ∈ [0, 1]^Ω with ∑_{x∈Ω} π_x = 1, where π^{(t)}_x = Pr[X_t = x]

π^{(t+1)} = π^{(t)} P, since

π^{(t+1)}_y = Pr[X_{t+1} = y] = ∑_{x∈Ω} Pr[X_t = x] Pr[X_{t+1} = y | X_t = x] = ∑_{x∈Ω} π^{(t)}_x P_{xy} = (π^{(t)} P)_y
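The derivation above is just a vector-matrix product; here is a minimal sketch using the same assumed 3-state matrix as before.

```python
# One-step distribution update pi^(t+1) = pi^(t) P (assumed 3-state chain).
import numpy as np

P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.0, 0.4, 0.6]])

pi_t = np.array([1.0, 0.0, 0.0])        # X_t is in state 0 with probability 1
pi_next = pi_t @ P                      # (pi^(t) P)_y = sum_x pi^(t)_x P[x, y]

print(pi_next)                          # [0.5, 0.5, 0.0]
assert np.isclose(pi_next.sum(), 1.0)   # still a probability distribution
```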
π^{(0)} → π^{(1)} → π^{(2)} → ⋯ → π^{(t)} → π^{(t+1)} → ⋯  (each arrow is one application of P)

• initial distribution: π^{(0)}
• transition matrix: P

π^{(t)} = π^{(0)} P^t

Markov chain: M = (Ω, P)
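A minimal sketch of π^{(t)} = π^{(0)} P^t for the same assumed 3-state chain: iterating the one-step update t times agrees with multiplying by the matrix power P^t once.

```python
# t-step distribution pi^(t) = pi^(0) P^t (assumed 3-state chain).
import numpy as np

P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.0, 0.4, 0.6]])
pi0 = np.array([1.0, 0.0, 0.0])
t = 5

pi_iter = pi0.copy()
for _ in range(t):
    pi_iter = pi_iter @ P                         # t one-step updates

pi_power = pi0 @ np.linalg.matrix_power(P, t)     # pi^(0) P^t directly

assert np.allclose(pi_iter, pi_power)
print(pi_power)
```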