The expectation of $Y_t$ is given by
\[
E(Y_t) = \lim_{T \to \infty} E(\mu + \varphi_0 \varepsilon_t + \varphi_1 \varepsilon_{t-1} + \varphi_2 \varepsilon_{t-2} + \cdots + \varphi_T \varepsilon_{t-T}) = \mu.
\]
The variance of $Y_t$ is
\begin{align*}
\gamma_0 = E(Y_t - \mu)^2 &= \lim_{T \to \infty} E(\varphi_0 \varepsilon_t + \varphi_1 \varepsilon_{t-1} + \varphi_2 \varepsilon_{t-2} + \cdots + \varphi_T \varepsilon_{t-T})^2 \\
&= \lim_{T \to \infty} (\varphi_0^2 + \varphi_1^2 + \varphi_2^2 + \cdots + \varphi_T^2)\,\sigma^2.
\end{align*}
For $j > 0$,
\begin{align*}
\gamma_j = E(Y_t - \mu)(Y_{t-j} - \mu) &= (\varphi_j \varphi_0 + \varphi_{j+1} \varphi_1 + \varphi_{j+2} \varphi_2 + \varphi_{j+3} \varphi_3 + \cdots)\,\sigma^2 \\
&= \sigma^2 \sum_{k=0}^{\infty} \varphi_{j+k} \varphi_k.
\end{align*}
Thus, $E(Y_t)$ and $\gamma_j$ are both finite and independent of $t$: the $MA(\infty)$ process with absolutely summable coefficients is weakly stationary.
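As a quick numerical sanity check (not part of the derivation itself), the following Python sketch simulates a truncated $MA(\infty)$ process with the illustrative weights $\varphi_k = 0.8^k$, which are absolutely summable, and compares sample moments against the formulas above; the truncation order, seed, and sample size are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(0)
    sigma2, mu = 1.0, 2.0
    K = 200                                 # truncation order for the MA weights
    w = 0.8 ** np.arange(K)                 # assumed weights: varphi_k = 0.8^k

    # Simulate Y_t = mu + sum_k varphi_k * eps_{t-k} on a long sample path
    T = 200_000
    eps = rng.normal(0.0, np.sqrt(sigma2), T + K)
    Y = mu + np.convolve(eps, w, mode="valid")

    # Theoretical autocovariance: gamma_j = sigma^2 * sum_k varphi_{j+k} varphi_k
    def gamma(j):
        return sigma2 * np.sum(w[j:] * w[:K - j])

    print(Y.mean(), "vs", mu)                                     # ~ mu
    print(Y.var(), "vs", gamma(0))                                # ~ gamma_0
    print(np.mean((Y[1:] - mu) * (Y[:-1] - mu)), "vs", gamma(1))  # ~ gamma_1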
1.3.3 Check Ergodicity

Proposition: The absolute summability of the moving average coefficients implies that the process is ergodic.

Proof: Recall that the autocovariance of an $MA(\infty)$ is
\[
\gamma_j = \sigma^2 \sum_{k=0}^{\infty} \varphi_{j+k} \varphi_k.
\]
Then
\[
|\gamma_j| = \left| \sigma^2 \sum_{k=0}^{\infty} \varphi_{j+k} \varphi_k \right| \leq \sigma^2 \sum_{k=0}^{\infty} |\varphi_{j+k} \varphi_k|,
\]
and
\[
\sum_{j=0}^{\infty} |\gamma_j| \leq \sigma^2 \sum_{j=0}^{\infty} \sum_{k=0}^{\infty} |\varphi_{j+k} \varphi_k| = \sigma^2 \sum_{j=0}^{\infty} \sum_{k=0}^{\infty} |\varphi_{j+k}| |\varphi_k| = \sigma^2 \sum_{k=0}^{\infty} |\varphi_k| \sum_{j=0}^{\infty} |\varphi_{j+k}|.
\]
But there exists an $M < \infty$ such that $\sum_{j=0}^{\infty} |\varphi_j| < M$, and therefore $\sum_{j=0}^{\infty} |\varphi_{j+k}| < M$ for $k = 0, 1, 2, \ldots$, meaning that
\[
\sum_{j=0}^{\infty} |\gamma_j| < \sigma^2 \sum_{k=0}^{\infty} |\varphi_k| \, M < \sigma^2 M^2 < \infty.
\]
Since absolute summability of the autocovariances is a sufficient condition for a covariance-stationary process to be ergodic for the mean, the $MA(\infty)$ process with absolutely summable coefficients is ergodic.
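Ergodicity here means that time averages computed from a single realization converge to the corresponding ensemble moments. A minimal sketch, using the same illustrative weights $\varphi_k = 0.8^k$ as above:

    import numpy as np

    rng = np.random.default_rng(1)
    K, T, mu = 200, 1_000_000, 2.0
    w = 0.8 ** np.arange(K)            # absolutely summable weights (illustrative)
    eps = rng.normal(size=T + K)
    Y = mu + np.convolve(eps, w, mode="valid")

    # The time average over one path approaches the ensemble mean mu as n grows
    for n in (10**3, 10**4, 10**6):
        print(n, Y[:n].mean())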
2 Autoregressive Process

2.1 The First-Order Autoregressive Process

A stochastic process $\{Y_t,\, t \in T\}$ is said to be a first-order autoregressive process (AR(1)) if it can be expressed in the form
\[
Y_t = c + \phi Y_{t-1} + \varepsilon_t,
\]
where $c$ and $\phi$ are constants and $\varepsilon_t$ is a white-noise process.

2.1.1 Check Stationarity and Ergodicity

Write the AR(1) process in lag-operator form:
\[
Y_t = c + \phi L Y_t + \varepsilon_t,
\]
then
\[
(1 - \phi L) Y_t = c + \varepsilon_t.
\]
In the case $|\phi| < 1$, we know from the properties of the lag operator in the last chapter that
\[
(1 - \phi L)^{-1} = 1 + \phi L + \phi^2 L^2 + \cdots,
\]
thus
\begin{align*}
Y_t &= (c + \varepsilon_t) \cdot (1 + \phi L + \phi^2 L^2 + \cdots) \\
&= (c + \phi L c + \phi^2 L^2 c + \cdots) + (\varepsilon_t + \phi L \varepsilon_t + \phi^2 L^2 \varepsilon_t + \cdots) \\
&= (c + \phi c + \phi^2 c + \cdots) + (\varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 \varepsilon_{t-2} + \cdots) \\
&= \frac{c}{1 - \phi} + \varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 \varepsilon_{t-2} + \cdots.
\end{align*}
This can be viewed as an $MA(\infty)$ process with $\varphi_j$ given by $\phi^j$. When $|\phi| < 1$, this AR(1) is an $MA(\infty)$ with absolutely summable coefficients:
\[
\sum_{j=0}^{\infty} |\varphi_j| = \sum_{j=0}^{\infty} |\phi|^j = \frac{1}{1 - |\phi|} < \infty.
\]
Therefore, the AR(1) process is stationary and ergodic provided that $|\phi| < 1$.
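The equivalence between the AR(1) recursion and its truncated $MA(\infty)$ representation can be checked directly in simulation. This is a sketch under the assumed values $c = 1$, $\phi = 0.9$; the two constructions should agree up to a truncation error of order $\phi^{200}$.

    import numpy as np

    rng = np.random.default_rng(2)
    c, phi, T, K = 1.0, 0.9, 500, 200       # illustrative parameters, |phi| < 1
    eps = rng.normal(size=T + K)

    # Recursive simulation of Y_t = c + phi * Y_{t-1} + eps_t, started at the mean
    Y = np.empty(T + K)
    Y[0] = c / (1 - phi)
    for t in range(1, T + K):
        Y[t] = c + phi * Y[t - 1] + eps[t]

    # Truncated MA(infinity) representation: c/(1 - phi) + sum_j phi^j eps_{t-j}
    Y_ma = c / (1 - phi) + np.convolve(eps, phi ** np.arange(K), mode="valid")
    print(np.max(np.abs(Y[K - 1:] - Y_ma)))  # ~ 0 up to the truncation error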
2.1.2 The Dependence Structure

The expectation of $Y_t$ is given by
\[
E(Y_t) = E\!\left( \frac{c}{1 - \phi} + \varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 \varepsilon_{t-2} + \cdots \right) = \frac{c}{1 - \phi} = \mu.
\]
The variance of $Y_t$ is
\[
\gamma_0 = E(Y_t - \mu)^2 = E(\varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 \varepsilon_{t-2} + \cdots)^2 = (1 + \phi^2 + \phi^4 + \cdots)\,\sigma^2 = \frac{\sigma^2}{1 - \phi^2}.
\]
For $j > 0$,
\begin{align*}
\gamma_j &= E(Y_t - \mu)(Y_{t-j} - \mu) \\
&= E\big[ (\varepsilon_t + \phi \varepsilon_{t-1} + \cdots + \phi^j \varepsilon_{t-j} + \phi^{j+1} \varepsilon_{t-j-1} + \phi^{j+2} \varepsilon_{t-j-2} + \cdots)(\varepsilon_{t-j} + \phi \varepsilon_{t-j-1} + \phi^2 \varepsilon_{t-j-2} + \cdots) \big] \\
&= (\phi^j + \phi^{j+2} + \phi^{j+4} + \cdots)\,\sigma^2 \\
&= \phi^j (1 + \phi^2 + \phi^4 + \cdots)\,\sigma^2 = \frac{\phi^j}{1 - \phi^2}\,\sigma^2.
\end{align*}
It follows that the autocorrelation function is
\[
r_j = \frac{\gamma_j}{\gamma_0} = \phi^j,
\]
which follows a pattern of geometric decay, as in the plot on p. 50.
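A short simulation makes the geometric decay of the autocorrelations visible; $\phi = 0.7$ and the sample size are illustrative choices, and the sample ACF $r_j$ should sit close to $\phi^j$:

    import numpy as np

    rng = np.random.default_rng(3)
    c, phi, T = 0.5, 0.7, 200_000          # illustrative values
    eps = rng.normal(size=T)
    Y = np.empty(T)
    Y[0] = c / (1 - phi)
    for t in range(1, T):
        Y[t] = c + phi * Y[t - 1] + eps[t]

    # Sample autocorrelations versus the theoretical phi^j
    Yc = Y - Y.mean()
    for j in range(5):
        r_j = np.mean(Yc[j:] * Yc[:T - j]) / Y.var()
        print(j, round(r_j, 3), round(phi ** j, 3))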
2.1.3 An Alternative Way to Calculate the Moments of a Stationary AR(1) Process

Assume that the AR(1) process under consideration is weakly stationary. Taking expectations on both sides, we have
\[
E(Y_t) = c + \phi E(Y_{t-1}) + E(\varepsilon_t).
\]
Since by assumption the process is stationary, $E(Y_t) = E(Y_{t-1}) = \mu$. Therefore,
\[
\mu = c + \phi \mu + 0 \quad \text{or} \quad \mu = \frac{c}{1 - \phi},
\]
reproducing the earlier result.

To find higher moments of $Y_t$ in an analogous manner, we rewrite this AR(1) as
\[
Y_t = \mu(1 - \phi) + \phi Y_{t-1} + \varepsilon_t
\]
or
\[
(Y_t - \mu) = \phi (Y_{t-1} - \mu) + \varepsilon_t. \tag{3}
\]
For $j \geq 0$, multiply both sides of (3) by $(Y_{t-j} - \mu)$ and take expectations:
\begin{align*}
\gamma_j &= E[(Y_t - \mu)(Y_{t-j} - \mu)] \\
&= \phi E[(Y_{t-1} - \mu)(Y_{t-j} - \mu)] + E(Y_{t-j} - \mu)\varepsilon_t \\
&= \phi \gamma_{j-1} + E(Y_{t-j} - \mu)\varepsilon_t.
\end{align*}
Next we consider the term $E(Y_{t-j} - \mu)\varepsilon_t$. When $j = 0$, multiply both sides of (3) by $\varepsilon_t$ and take expectations:
\[
E(Y_t - \mu)\varepsilon_t = E[\phi (Y_{t-1} - \mu)\varepsilon_t] + E(\varepsilon_t^2).
\]
Recall that $Y_{t-1} - \mu$ is a linear function of $\varepsilon_{t-1}, \varepsilon_{t-2}, \ldots$:
\[
Y_{t-1} - \mu = \varepsilon_{t-1} + \phi \varepsilon_{t-2} + \phi^2 \varepsilon_{t-3} + \cdots;
\]
since $\varepsilon_t$ is white noise and hence uncorrelated with these past shocks, we have
\[
E[\phi (Y_{t-1} - \mu)\varepsilon_t] = 0.
\]
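The orthogonality just used, and the value $E(Y_t - \mu)\varepsilon_t = E(\varepsilon_t^2) = \sigma^2$ that it implies in the displayed equation above, can both be confirmed on simulated data. A minimal sketch with the same illustrative parameters as before:

    import numpy as np

    rng = np.random.default_rng(4)
    c, phi, sigma, T = 0.5, 0.7, 1.0, 500_000   # illustrative values
    eps = rng.normal(0.0, sigma, T)
    Y = np.empty(T)
    Y[0] = c / (1 - phi)
    for t in range(1, T):
        Y[t] = c + phi * Y[t - 1] + eps[t]

    mu = c / (1 - phi)
    # eps_t is uncorrelated with Y_{t-1}, so E[(Y_{t-1} - mu) eps_t] = 0 ...
    print(np.mean((Y[:-1] - mu) * eps[1:]))              # ~ 0
    # ... while E[(Y_t - mu) eps_t] = sigma^2
    print(np.mean((Y[1:] - mu) * eps[1:]), sigma ** 2)   # ~ sigma^2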