Ch. 14 Stationary ARMA Process

A general linear stochastic model supposes a time series to be generated by a linear aggregation of random shocks. For practical representation it is desirable to employ models that use parameters parsimoniously. Parsimony may often be achieved by representing the linear process in terms of a small number of autoregressive and moving average terms. This chapter introduces univariate ARMA processes, which provide a very useful class of models for describing the dynamics of an individual time series. Throughout this chapter we assume the time index set $T$ to be $T = \{\ldots, -2, -1, 0, 1, 2, \ldots\}$.

1 Moving Average Process

1.1 The First-Order Moving Average Process

A stochastic process $\{Y_t, t \in T\}$ is said to be a first-order moving average process (MA(1)) if it can be expressed in the form
$$Y_t = \mu + \varepsilon_t + \theta\varepsilon_{t-1},$$
where $\mu$ and $\theta$ are constants and $\varepsilon_t$ is a white-noise process. Recall that a white-noise process $\{\varepsilon_t, t \in T\}$ satisfies $E(\varepsilon_t) = 0$ and
$$E(\varepsilon_t\varepsilon_s) = \begin{cases} \sigma^2 & \text{when } t = s \\ 0 & \text{when } t \neq s. \end{cases}$$

1.1.1 Check Stationarity

The expectation of $Y_t$ is given by
$$E(Y_t) = E(\mu + \varepsilon_t + \theta\varepsilon_{t-1}) = \mu + E(\varepsilon_t) + \theta E(\varepsilon_{t-1}) = \mu, \quad \text{for all } t \in T.$$
The variance of $Y_t$ is
$$E(Y_t - \mu)^2 = E(\varepsilon_t + \theta\varepsilon_{t-1})^2 = E(\varepsilon_t^2 + 2\theta\varepsilon_t\varepsilon_{t-1} + \theta^2\varepsilon_{t-1}^2) = \sigma^2 + 0 + \theta^2\sigma^2 = (1 + \theta^2)\sigma^2.$$
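As a quick numerical check (a minimal sketch, not part of the original notes), one can simulate an MA(1) series and compare the sample mean and variance with $\mu$ and $(1+\theta^2)\sigma^2$. The parameter values $\mu = 2$, $\theta = 0.6$, $\sigma = 1$ below are assumptions chosen purely for illustration.

```python
import numpy as np

# Minimal simulation check of the MA(1) moments; all parameter values are
# hypothetical choices for illustration, not taken from the notes.
rng = np.random.default_rng(0)
mu, theta, sigma = 2.0, 0.6, 1.0
n = 100_000

eps = rng.normal(0.0, sigma, size=n + 1)   # white noise: eps_0, ..., eps_n
y = mu + eps[1:] + theta * eps[:-1]        # Y_t = mu + eps_t + theta * eps_{t-1}

print(y.mean())                            # should be close to mu
print(y.var(), (1 + theta**2) * sigma**2)  # sample vs. theoretical variance
```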
The first autocovariance is
$$E(Y_t - \mu)(Y_{t-1} - \mu) = E(\varepsilon_t + \theta\varepsilon_{t-1})(\varepsilon_{t-1} + \theta\varepsilon_{t-2}) = E(\varepsilon_t\varepsilon_{t-1} + \theta\varepsilon_{t-1}^2 + \theta\varepsilon_t\varepsilon_{t-2} + \theta^2\varepsilon_{t-1}\varepsilon_{t-2}) = 0 + \theta\sigma^2 + 0 + 0 = \theta\sigma^2.$$
Higher autocovariances are all zero:
$$\gamma_j = E(Y_t - \mu)(Y_{t-j} - \mu) = E(\varepsilon_t + \theta\varepsilon_{t-1})(\varepsilon_{t-j} + \theta\varepsilon_{t-j-1}) = 0 \quad \text{for } j > 1.$$
Since the mean and the autocovariances are not functions of time, an MA(1) process is weakly stationary regardless of the value of $\theta$.

1.1.2 Check Ergodicity

The condition
$$\sum_{j=0}^{\infty} |\gamma_j| = (1 + \theta^2)\sigma^2 + |\theta|\sigma^2 < \infty$$
is clearly satisfied. Thus the MA(1) process is ergodic.

1.1.3 The Dependence Structure

The $j$th autocorrelation of a weakly stationary process is defined as its $j$th autocovariance divided by the variance,
$$r_j = \frac{\gamma_j}{\gamma_0}.$$
By the Cauchy–Schwarz inequality, $|r_j| \leq 1$ for all $j$. From the results above, the autocorrelation of an MA(1) process is
$$r_j = \begin{cases} 1 & \text{when } j = 0 \\ \dfrac{\theta\sigma^2}{(1+\theta^2)\sigma^2} = \dfrac{\theta}{1+\theta^2} & \text{when } j = 1 \\ 0 & \text{when } j > 1. \end{cases}$$
The autocorrelation $r_j$ can be plotted as a function of $j$; this plot is usually called a correlogram. See the plots on p. 50.
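The lag-1 autocorrelation $\theta/(1+\theta^2)$ and the cutoff after lag 1 can be verified on simulated data. The sketch below is an illustration only (not from the original notes); the values $\theta = 0.6$, $\sigma = 1$, the choice $\mu = 0$, and the helper name `sample_acf` are assumptions.

```python
import numpy as np

# Sample autocorrelations of a simulated MA(1); theta and sigma are assumed values.
rng = np.random.default_rng(1)
theta, sigma, n = 0.6, 1.0, 200_000

eps = rng.normal(0.0, sigma, size=n + 1)
y = eps[1:] + theta * eps[:-1]            # take mu = 0 for simplicity
y = y - y.mean()

def sample_acf(x, lag):
    """Sample autocorrelation: estimated gamma_lag divided by estimated gamma_0."""
    return np.dot(x[lag:], x[:len(x) - lag]) / np.dot(x, x)

print(sample_acf(y, 1), theta / (1 + theta**2))  # close to each other
print(sample_acf(y, 2), sample_acf(y, 3))        # both close to zero
```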
1.2 The q-th Order Moving Average Process

A stochastic process $\{Y_t, t \in T\}$ is said to be a moving average process of order $q$ (MA($q$)) if it can be expressed in the form
$$Y_t = \mu + \varepsilon_t + \theta_1\varepsilon_{t-1} + \theta_2\varepsilon_{t-2} + \cdots + \theta_q\varepsilon_{t-q},$$
where $\mu, \theta_1, \theta_2, \ldots, \theta_q$ are constants and $\varepsilon_t$ is a white-noise process.

1.2.1 Check Stationarity

The expectation of $Y_t$ is given by
$$E(Y_t) = E(\mu + \varepsilon_t + \theta_1\varepsilon_{t-1} + \cdots + \theta_q\varepsilon_{t-q}) = \mu + E(\varepsilon_t) + \theta_1 E(\varepsilon_{t-1}) + \cdots + \theta_q E(\varepsilon_{t-q}) = \mu, \quad \text{for all } t \in T.$$
The variance of $Y_t$ is
$$\gamma_0 = E(Y_t - \mu)^2 = E(\varepsilon_t + \theta_1\varepsilon_{t-1} + \theta_2\varepsilon_{t-2} + \cdots + \theta_q\varepsilon_{t-q})^2.$$
Since the $\varepsilon_t$'s are uncorrelated, the variance is
$$\gamma_0 = \sigma^2 + \theta_1^2\sigma^2 + \theta_2^2\sigma^2 + \cdots + \theta_q^2\sigma^2 = (1 + \theta_1^2 + \theta_2^2 + \cdots + \theta_q^2)\sigma^2.$$
For $j = 1, 2, \ldots, q$,
$$\begin{aligned}
\gamma_j &= E[(Y_t - \mu)(Y_{t-j} - \mu)] \\
&= E[(\varepsilon_t + \theta_1\varepsilon_{t-1} + \theta_2\varepsilon_{t-2} + \cdots + \theta_q\varepsilon_{t-q})(\varepsilon_{t-j} + \theta_1\varepsilon_{t-j-1} + \theta_2\varepsilon_{t-j-2} + \cdots + \theta_q\varepsilon_{t-j-q})] \\
&= E[\theta_j\varepsilon_{t-j}^2 + \theta_{j+1}\theta_1\varepsilon_{t-j-1}^2 + \theta_{j+2}\theta_2\varepsilon_{t-j-2}^2 + \cdots + \theta_q\theta_{q-j}\varepsilon_{t-q}^2].
\end{aligned}$$
Terms involving $\varepsilon$'s at different dates have been dropped because their products have expectation zero, and $\theta_0$ is defined to be unity. For $j > q$, there are no $\varepsilon$'s with common dates in the definition of $\gamma_j$, and so the expectation is zero. Thus,
$$\gamma_j = \begin{cases} [\theta_j + \theta_{j+1}\theta_1 + \theta_{j+2}\theta_2 + \cdots + \theta_q\theta_{q-j}]\sigma^2 & \text{for } j = 1, 2, \ldots, q \\ 0 & \text{for } j > q. \end{cases}$$
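The formula above can be written compactly as $\gamma_j = \sigma^2 \sum_{i=0}^{q-j} \theta_i\theta_{i+j}$ with $\theta_0 = 1$. The following sketch (illustrative only; the helper name `ma_autocovariance` and the MA(2) coefficient values are assumptions, not from the notes) computes these theoretical autocovariances from a coefficient vector.

```python
import numpy as np

def ma_autocovariance(theta, sigma2, j):
    """Theoretical autocovariance gamma_j of an MA(q) process.

    theta: MA coefficients [theta_1, ..., theta_q]; theta_0 = 1 is prepended.
    sigma2: white-noise variance sigma^2.
    """
    coef = np.concatenate(([1.0], np.asarray(theta, dtype=float)))  # theta_0 = 1
    q = len(coef) - 1
    if j > q:
        return 0.0                                   # ACF cuts off after q lags
    return sigma2 * np.dot(coef[j:], coef[:q + 1 - j])

# Hypothetical MA(2) coefficients chosen only for illustration:
theta = [0.5, -0.3]
print([ma_autocovariance(theta, 1.0, j) for j in range(4)])
# gamma_0 = 1 + 0.25 + 0.09, gamma_1 = 0.5 + (-0.3)(0.5), gamma_2 = -0.3, gamma_3 = 0
```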
For example, for an MA(2) process,
$$\begin{aligned}
\gamma_0 &= [1 + \theta_1^2 + \theta_2^2]\sigma^2 \\
\gamma_1 &= [\theta_1 + \theta_2\theta_1]\sigma^2 \\
\gamma_2 &= [\theta_2]\sigma^2 \\
\gamma_3 &= \gamma_4 = \cdots = 0.
\end{aligned}$$
For any value of $(\theta_1, \theta_2, \ldots, \theta_q)$, the MA($q$) process is thus weakly stationary.

1.2.2 Check Ergodicity

The condition
$$\sum_{j=0}^{\infty} |\gamma_j| < \infty$$
is clearly satisfied. Thus the MA($q$) process is ergodic.

1.2.3 The Dependence Structure

The autocorrelation function is zero after $q$ lags. See the plots on p. 50.

1.3 The Infinite-Order Moving Average Process

A stochastic process $\{Y_t, t \in T\}$ is said to be an infinite-order moving average process (MA($\infty$)) if it can be expressed in the form
$$Y_t = \mu + \sum_{j=0}^{\infty} \varphi_j\varepsilon_{t-j} = \mu + \varphi_0\varepsilon_t + \varphi_1\varepsilon_{t-1} + \varphi_2\varepsilon_{t-2} + \cdots,$$
where $\mu, \varphi_0, \varphi_1, \varphi_2, \ldots$ are constants with $\varphi_0 = 1$ and $\varepsilon_t$ is a white-noise process.

1.3.1 Is This a Well-Defined Random Sequence?

A sequence $\{\varphi_j\}_{j=0}^{\infty}$ is said to be square-summable if
$$\sum_{j=0}^{\infty} \varphi_j^2 < \infty,$$
whereas a sequence $\{\varphi_j\}_{j=0}^{\infty}$ is said to be absolutely summable if
$$\sum_{j=0}^{\infty} |\varphi_j| < \infty.$$
It is important to note that absolute summability implies square-summability, but the converse does not hold.

Proposition: If the coefficients of the MA($\infty$) are square-summable, then the partial sums $\sum_{j=0}^{T} \varphi_j\varepsilon_{t-j}$ converge in mean square to some random variable $Y_t$ as $T \to \infty$.

Proof: The Cauchy criterion states that $\sum_{j=0}^{T} \varphi_j\varepsilon_{t-j}$ converges in mean square to some random variable $Y_t$ as $T \to \infty$ if and only if, for any $\varsigma > 0$, there exists a suitably large $N$ such that for any integer $M > N$
$$E\left[\sum_{j=0}^{M} \varphi_j\varepsilon_{t-j} - \sum_{j=0}^{N} \varphi_j\varepsilon_{t-j}\right]^2 < \varsigma. \tag{1}$$
In words, once $N$ terms have been summed, the difference between that sum and the one obtained from summing to $M$ is a random variable whose mean and variance are both arbitrarily close to zero. Now the left-hand side of (1) is simply
$$E\left[\varphi_M\varepsilon_{t-M} + \varphi_{M-1}\varepsilon_{t-M+1} + \cdots + \varphi_{N+1}\varepsilon_{t-N-1}\right]^2 = (\varphi_M^2 + \varphi_{M-1}^2 + \cdots + \varphi_{N+1}^2)\sigma^2 = \left[\sum_{j=0}^{M} \varphi_j^2 - \sum_{j=0}^{N} \varphi_j^2\right]\sigma^2. \tag{2}$$
But if $\sum_{j=0}^{\infty} \varphi_j^2 < \infty$, then by the Cauchy criterion the right-hand side of (2) may be made as small as desired by a suitably large $N$. Thus the MA($\infty$) process is a well-defined sequence, since the infinite series $\sum_{j=0}^{\infty} \varphi_j\varepsilon_{t-j}$ converges in mean square.

1.3.2 Check Stationarity

Assume the MA($\infty$) process has absolutely summable coefficients.
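A small numerical sketch can illustrate the Cauchy argument above: for a square-summable coefficient sequence, the mean-square difference between two partial sums equals $\sigma^2 \sum_{j=N+1}^{M} \varphi_j^2$, which can be made arbitrarily small by taking $N$ large. The coefficients $\varphi_j = 0.8^j$ below are an assumption chosen only for illustration; this is not part of the original notes.

```python
import numpy as np

# Numerical illustration of the mean-square (Cauchy) argument, using the
# hypothetical square-summable coefficients phi_j = 0.8**j.
sigma2 = 1.0
phi = 0.8 ** np.arange(500)          # phi_0 = 1, phi_j = 0.8^j

def tail_variance(N, M):
    """E[(partial sum to M - partial sum to N)^2] = sigma^2 * sum_{j=N+1}^{M} phi_j^2."""
    return sigma2 * np.sum(phi[N + 1:M + 1] ** 2)

for N in (5, 20, 50, 100):
    print(N, tail_variance(N, 499))  # shrinks toward zero as N grows
```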