Random Walk
Markov Property
• dependency structure of $X_0, X_1, X_2, \dots$
• Markov property (memoryless): $X_{t+1}$ depends only on $X_t$:
  $\Pr[X_{t+1}=y \mid X_0=x_0,\dots,X_{t-1}=x_{t-1},X_t=x] = \Pr[X_{t+1}=y \mid X_t=x]$
  for all $t = 0,1,2,\dots$ and all $x_0,x_1,\dots,x_{t-1},x,y \in \Omega$
• Markov chain: a discrete-time, discrete-space stochastic process with the Markov property
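A minimal sketch (not from the slides) of the memoryless property, using a simple random walk on the integers: the next position is sampled from $X_t$ alone, so the history $X_0,\dots,X_{t-1}$ is never consulted.

```python
import random

def step(x):
    """Sample X_{t+1} given only X_t = x: move left or right with prob 1/2."""
    return x + random.choice([-1, +1])

x = 0                      # X_0
for t in range(100):
    x = step(x)            # only the current state x is used, never the history
print("X_100 =", x)
```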
Transition Matrix
• Markov chain: $X_0, X_1, X_2, \dots$ with
  $\Pr[X_{t+1}=y \mid X_0=x_0,\dots,X_{t-1}=x_{t-1},X_t=x] = \Pr[X_{t+1}=y \mid X_t=x] = P^{(t)}_{xy} = P_{xy}$ (time-homogeneous)
• transition matrix $P = (P_{xy})$: rows indexed by the current state $x \in \Omega$, columns by the next state $y \in \Omega$
• $P$ is a stochastic matrix: every row is a probability distribution, i.e. $P\mathbf{1} = \mathbf{1}$
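A small sketch of a time-homogeneous transition matrix, checking the stochastic-matrix condition $P\mathbf{1}=\mathbf{1}$ and sampling one transition; the matrix is the 3-state example from the last slide of this section, and the helper `step` is just illustrative.

```python
import numpy as np

# The 3-state example transition matrix (rows sum to 1).
P = np.array([[0.0, 1.0, 0.0],
              [1/3, 0.0, 2/3],
              [1/3, 1/3, 1/3]])

# P 1 = 1: every row is a probability distribution over the next state.
assert np.allclose(P.sum(axis=1), 1.0)

rng = np.random.default_rng(0)

def step(x, P, rng):
    """Sample the next state y with probability P[x, y]."""
    return rng.choice(len(P), p=P[x])

print(step(0, P, rng))   # from state 0 the chain always moves to state 1 here
```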
• chain: $X_0, X_1, X_2, \dots$
• distributions: $\pi^{(0)}, \pi^{(1)}, \pi^{(2)}, \dots \in [0,1]^{\Omega}$ with $\sum_{x\in\Omega} \pi_x = 1$, where $\pi^{(t)}_x = \Pr[X_t = x]$
• one-step update: $\pi^{(t+1)} = \pi^{(t)} P$, since
  $\pi^{(t+1)}_y = \Pr[X_{t+1}=y] = \sum_{x\in\Omega} \Pr[X_t=x]\,\Pr[X_{t+1}=y \mid X_t=x] = \sum_{x\in\Omega} \pi^{(t)}_x P_{xy} = (\pi^{(t)} P)_y$
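A sketch of the distribution update $\pi^{(t+1)} = \pi^{(t)}P$ as a vector-matrix product, again using the 3-state example matrix; the starting distribution $\pi^{(0)}$ (all mass on the first state) is chosen only for illustration.

```python
import numpy as np

P = np.array([[0.0, 1.0, 0.0],
              [1/3, 0.0, 2/3],
              [1/3, 1/3, 1/3]])

pi = np.array([1.0, 0.0, 0.0])    # pi^(0): start deterministically in the first state

for t in range(5):
    pi = pi @ P                   # one step: pi^(t+1) = pi^(t) P
    print(f"pi^({t+1}) =", pi)
```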
Example: a three-state chain on $\{1,2,3\}$ (transition diagram: state 1 moves to state 2 with probability 1; state 2 moves to state 1 with probability 1/3 and to state 3 with probability 2/3; state 3 moves to each state with probability 1/3), with transition matrix
$$P = \begin{pmatrix} 0 & 1 & 0 \\ 1/3 & 0 & 2/3 \\ 1/3 & 1/3 & 1/3 \end{pmatrix}$$
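A quick sanity check of the example, not from the slides: estimate $\Pr[X_t = x]$ by simulating the chain and compare with $\pi^{(t)} = \pi^{(0)} P^t$; the step count $t$, the number of trials, and the start state are arbitrary choices.

```python
import numpy as np

P = np.array([[0.0, 1.0, 0.0],
              [1/3, 0.0, 2/3],
              [1/3, 1/3, 1/3]])
rng = np.random.default_rng(1)

t, trials = 4, 20_000
counts = np.zeros(3)
for _ in range(trials):
    x = 0                               # start in state 1 (index 0)
    for _ in range(t):
        x = rng.choice(3, p=P[x])       # one transition of the chain
    counts[x] += 1

pi_t = np.array([1.0, 0.0, 0.0]) @ np.linalg.matrix_power(P, t)
print("empirical    :", counts / trials)
print("exact pi^(t) :", pi_t)
```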