the number of parameters increases with the size of the subset $(t_1, \ldots, t_T)$, although the parameters do not depend on $t \in T$. This is because time-homogeneity does not restrict the 'memory' of the process. In the next section we consider 'memory' restrictions in an obvious attempt to 'solve' the problem of the parameters increasing with the size of the subset $(t_1, t_2, \ldots, t_T)$ of $T$.

1.3 Restricting the memory of a stochastic process

In the case of a typical economic time series, viewed as a particular realization of a stochastic process $\{X_t, t \in T\}$, one would expect that the dependence between $X_{t_i}$ and $X_{t_j}$ would tend to weaken as the distance $(t_j - t_i)$ increases. Formally, this dependence can be described in terms of the joint distribution $F(X_{t_1}, X_{t_2}, \ldots, X_{t_T})$ as follows:

Definition: asymptotically independent
Definition: asymptotically uncorrelated
Definition: strongly mixing
Definition: uniformly mixing
Definition: ergodic

1.4 Some special stochastic processes

We will briefly consider several special stochastic processes which play an important role in econometric modeling. These stochastic processes will be divided into parametric and non-parametric processes. The non-parametric processes are defined in terms of their joint distribution function or the first few joint moments.
On the other hand, parametric processes are defined in terms of a generating mechanism, which is commonly a functional form based on a non-parametric process.

1.4.1 Non-parametric processes

Definition: A stochastic process $\{X_t, t \in T\}$ is said to be a white-noise process if
(i) $E(X_t) = 0$;
(ii) $E(X_t X_\tau) = \begin{cases} \sigma^2 & \text{if } t = \tau, \\ 0 & \text{if } t \neq \tau. \end{cases}$

Hence, a white-noise process is both time-homogeneous, in view of the fact that it is a weakly stationary process, and has no memory. In the case where $\{X_t, t \in T\}$ is also assumed to be normal, the process is also strictly stationary.

Definition: A stochastic process $\{X_t, t \in T\}$ is said to be a martingale process if ...

Definition: A stochastic process $\{X_t, t \in T\}$ is said to be an innovation process if ...

Definition: A stochastic process $\{X_t, t \in T\}$ is said to be a Markov process if ...

Definition: A stochastic process $\{X_t, t \in T\}$ is said to be a Brownian motion process if ...

1.4.2 Parametric stochastic processes

Definition: A stochastic process $\{X_t, t \in T\}$ is said to be an autoregressive process of order one (AR(1)) if it satisfies the stochastic difference equation
$$X_t = \phi X_{t-1} + u_t,$$
where $\phi$ is a constant and $u_t$ is a white-noise process.
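Both the white-noise definition and the AR(1) difference equation translate directly into a short simulation. The sketch below is an illustration of my own rather than code from the text: the Gaussian draw used to realize the white-noise process, the value $\phi = 0.6$, the sample size, and the starting convention $X_0 = u_0$ are all assumptions made purely for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# White noise: zero mean, variance sigma^2, uncorrelated across time.
# A Gaussian draw is one convenient (assumed) way to realize such a process.
sigma, T = 1.0, 10_000
u = rng.normal(0.0, sigma, size=T)
print(u.mean())                    # sample analogue of E(X_t) = 0
print(np.dot(u[:-1], u[1:]) / T)   # sample analogue of E(X_t X_{t+1}) = 0
print(np.dot(u, u) / T)            # sample analogue of E(X_t^2) = sigma^2

# AR(1): iterate the stochastic difference equation X_t = phi * X_{t-1} + u_t.
phi = 0.6                          # assumed constant with |phi| < 1
x = np.empty(T)
x[0] = u[0]                        # illustrative starting convention
for t in range(1, T):
    x[t] = phi * x[t - 1] + u[t]
```

For a sample of this size the three printed moments should be close to $0$, $0$ and $\sigma^2 = 1$, in line with conditions (i) and (ii) of the white-noise definition.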
We first consider the index set $T^* = \{0, \pm 1, \pm 2, \ldots\}$ and assume that $X_{-T} \to 0$ as $T \to \infty$. Define a lag operator $L$ by $LX_t \equiv X_{t-1}$; then the AR(1) process can be rewritten as
$$(1 - \phi L)X_t = u_t,$$
or, when $|\phi| < 1$,
$$X_t = (1 - \phi L)^{-1} u_t = (1 + \phi L + \phi^2 L^2 + \cdots)u_t = u_t + \phi u_{t-1} + \phi^2 u_{t-2} + \cdots = \sum_{i=0}^{\infty} \phi^i u_{t-i},$$
from which we can deduce that
$$E(X_t) = 0,$$
and, since $u_t$ is a white-noise process (so that only the terms with $j = i + \tau$ have non-zero expectation),
$$E(X_t X_{t+\tau}) = E\left\{\left(\sum_{i=0}^{\infty} \phi^i u_{t-i}\right)\left(\sum_{j=0}^{\infty} \phi^j u_{t+\tau-j}\right)\right\} = \sigma_u^2 \left(\sum_{i=0}^{\infty} \phi^i \phi^{i+\tau}\right) = \sigma_u^2 \phi^\tau \left(\sum_{i=0}^{\infty} \phi^{2i}\right), \qquad \tau \geq 0.$$
Hence, for $|\phi| < 1$, the stochastic process $\{X_t, t \in T^*\}$ is both weakly stationary and asymptotically uncorrelated, since the autocovariance function
$$v(\tau) = \frac{\sigma_u^2}{1 - \phi^2}\, \phi^\tau \to 0 \quad \text{as } \tau \to \infty.$$
Therefore, for any finite subset $t_1, t_2, \ldots, t_T$ of $T^*$, the vector $\mathbf{X}_T' \equiv (X_{t_1}, X_{t_2}, \ldots, X_{t_T})$ from an AR(1) process has covariance matrix
$$E(\mathbf{X}_T \mathbf{X}_T') = \frac{\sigma_u^2}{1 - \phi^2}
\begin{pmatrix}
1 & \phi & \cdots & \phi^{T-1} \\
\phi & 1 & \cdots & \phi^{T-2} \\
\vdots & \vdots & \ddots & \vdots \\
\phi^{T-1} & \phi^{T-2} & \cdots & 1
\end{pmatrix}
= \sigma_u^2 \Omega,$$
where
$$\Omega = \frac{1}{1 - \phi^2}
\begin{pmatrix}
1 & \phi & \cdots & \phi^{T-1} \\
\phi & 1 & \cdots & \phi^{T-2} \\
\vdots & \vdots & \ddots & \vdots \\
\phi^{T-1} & \phi^{T-2} & \cdots & 1
\end{pmatrix}.$$
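As a quick numerical check (again a sketch of my own rather than anything from the text; the values $\phi = 0.6$, $\sigma_u = 1$, the sample size, the burn-in length and all function names are assumptions), one can simulate a long AR(1) path, compare sample autocovariances with $v(\tau) = \sigma_u^2 \phi^{\tau}/(1 - \phi^2)$, and build the Toeplitz matrix $\Omega$ directly:

```python
import numpy as np

def theoretical_autocov(phi, sigma_u, tau):
    """v(tau) = sigma_u^2 * phi^tau / (1 - phi^2), valid for |phi| < 1 and tau >= 0."""
    return sigma_u**2 * phi**tau / (1.0 - phi**2)

def sample_autocov(x, tau):
    """Sample autocovariance at lag tau (mean-corrected, divisor n)."""
    n = len(x)
    xc = x - x.mean()
    return np.dot(xc[: n - tau], xc[tau:]) / n

# Simulate a long AR(1) path; a burn-in is discarded so the starting value no longer matters.
phi, sigma_u, n, burn = 0.6, 1.0, 100_000, 1_000
rng = np.random.default_rng(1)
u = rng.normal(0.0, sigma_u, size=n + burn)
x = np.empty(n + burn)
x[0] = u[0]
for t in range(1, n + burn):
    x[t] = phi * x[t - 1] + u[t]
x = x[burn:]

# Compare sample and theoretical autocovariances at the first few lags.
for tau in range(5):
    print(tau, round(sample_autocov(x, tau), 3), round(theoretical_autocov(phi, sigma_u, tau), 3))

# The T x T covariance matrix sigma_u^2 * Omega is Toeplitz: its (i, j) entry is
# sigma_u^2 * phi^|i - j| / (1 - phi^2).
T = 5
Omega = (phi ** np.abs(np.subtract.outer(np.arange(T), np.arange(T)))) / (1.0 - phi**2)
print(sigma_u**2 * Omega)
```

For these parameter values $v(0) = 1.5625$ and $v(1) = 0.9375$, and the sample estimates should be close to the theoretical values for a path of this length, illustrating both the weak stationarity and the decay of $v(\tau)$ as $\tau$ grows.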