Ch. 20 Processes with Deterministic Trends

1 Traditional Asymptotic Results of OLS

Suppose a linear regression model with stochastic regressors is given by

$$Y_t = x_t'\beta + \varepsilon_t, \quad t = 1, 2, \ldots, T, \quad \beta \in \mathbb{R}^k, \tag{1}$$

or in matrix form,

$$y = X\beta + \varepsilon.$$

We are interested in the asymptotic properties, such as consistency and the limiting distribution, of the OLS estimator of $\beta$,

$$\hat{\beta} = (X'X)^{-1}X'y,$$

as $T \to \infty$, under simple traditional assumptions.

1.1 Independent Identically Distributed Observations

1.1.1 Consistency

To prove consistency of $\hat{\beta}$, we use Kolmogorov's law of large numbers of Ch. 4. Rewriting

$$\hat{\beta} - \beta = (X'X)^{-1}X'\varepsilon = \left(\frac{X'X}{T}\right)^{-1}\frac{X'\varepsilon}{T} = \left(\frac{\sum_{t=1}^T x_t x_t'}{T}\right)^{-1}\left(\frac{\sum_{t=1}^T x_t\varepsilon_t}{T}\right),$$

we have the following result.

Theorem:
In addition to (1), suppose that
(1). $\{(x_t', \varepsilon_t)'\}$, a sequence of $(k+1) \times 1$ vectors, is i.i.d.;
(2). (a) $E(x_t\varepsilon_t) = 0$;
(b) $E|X_{ti}\varepsilon_t| < \infty$, $i = 1, 2, \ldots, k$;
(3). (a) $E|X_{ti}|^2 < \infty$, $i = 1, 2, \ldots, k$;
(b) $M \equiv E(x_t x_t')$ is positive definite.
Then $\hat{\beta} \xrightarrow{a.s.} \beta$.

Remark:
1. Assumption (2a) specifies the mean of the i.i.d. sequence $(X_{ti}\varepsilon_t,\ i = 1, 2, \ldots, k)$ (see Proposition 3.3 of White, 2001, p. 32), and (2b) guarantees that its first moment exists.
2. Assumption (3a) guarantees that the first moment of $X_{ti}X_{tj}$ exists, by the Cauchy-Schwarz inequality, and (3b) specifies the mean of the i.i.d. sequence $(X_{ti}X_{tj},\ i = 1, 2, \ldots, k;\ j = 1, 2, \ldots, k)$. Existence of the first moment is what is needed for the LLN for i.i.d. sequences. See p. 15 of Ch. 4.

Proof:
From these assumptions we have

$$\frac{X'\varepsilon}{T} = \frac{\sum_{t=1}^T x_t\varepsilon_t}{T} \xrightarrow{a.s.} E(x_t\varepsilon_t) = 0$$

and

$$\frac{X'X}{T} = \frac{\sum_{t=1}^T x_t x_t'}{T} \xrightarrow{a.s.} E(x_t x_t') = M. \tag{2}$$

Therefore $\hat{\beta} - \beta \xrightarrow{a.s.} M^{-1} \cdot 0 = 0$, or $\hat{\beta} \xrightarrow{a.s.} \beta$.
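The consistency result can be illustrated with a minimal Monte Carlo sketch, assuming i.i.d. standard normal regressors and errors so that the theorem's moment conditions hold. All concrete choices here ($k = 2$, the value of $\beta$, the sample sizes) are illustrative, not from the text:

```python
# Monte Carlo sketch of OLS consistency under i.i.d. sampling.
# Illustrative choices: k = 2, beta = (1, -0.5), normal data.
import numpy as np

rng = np.random.default_rng(0)
beta = np.array([1.0, -0.5])          # true coefficient vector

def ols(T):
    """Draw an i.i.d. sample of size T and return the OLS estimate."""
    X = rng.normal(size=(T, 2))       # i.i.d. regressors, E(x_t x_t') = I
    eps = rng.normal(size=T)          # i.i.d. errors, E(x_t eps_t) = 0
    y = X @ beta + eps
    return np.linalg.solve(X.T @ X, X.T @ y)

errors = [np.max(np.abs(ols(T) - beta)) for T in (100, 10_000, 1_000_000)]
print(errors)  # the estimation error shrinks as T grows
```

Running this shows the largest coefficient error falling by roughly an order of magnitude for each hundredfold increase in $T$, consistent with $\hat{\beta} \xrightarrow{a.s.} \beta$.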
1.1.2 Asymptotic Normality

To prove asymptotic normality of $\hat{\beta}$, we use Kolmogorov's LLN and the Lindeberg-Lévy central limit theorem of Ch. 4. Rewriting

$$\sqrt{T}(\hat{\beta} - \beta) = \left(\frac{X'X}{T}\right)^{-1}\sqrt{T}\,\frac{X'\varepsilon}{T} = \left(\frac{\sum_{t=1}^T x_t x_t'}{T}\right)^{-1}\left(\sqrt{T}\,\frac{\sum_{t=1}^T x_t\varepsilon_t}{T}\right),$$

we have the following result.

Theorem:
In addition to (1), suppose
(i). $\{(x_t', \varepsilon_t)'\}$ is an i.i.d. sequence;
(ii). (a) $E(x_t\varepsilon_t) = 0$;
(b) $E|X_{ti}\varepsilon_t|^2 < \infty$, $i = 1, 2, \ldots, k$;
(c) $V_T \equiv Var(T^{-1/2}X'\varepsilon) = V$ is positive definite;
(iii). (a) $M \equiv E(x_t x_t')$ is positive definite;
(b) $E|X_{ti}|^2 < \infty$, $i = 1, 2, \ldots, k$.
Then $D^{-1/2}\sqrt{T}(\hat{\beta} - \beta) \xrightarrow{L} N(0, I)$, where $D \equiv M^{-1}VM^{-1}$.

Remark:
1. Assumption (ii.a) specifies the mean of the i.i.d. sequence $(X_{ti}\varepsilon_t,\ i = 1, 2, \ldots, k)$; (ii.b) guarantees that its second moment exists, which is needed for the application of the Lindeberg-Lévy central limit theorem (see p. 22 of Ch. 4); and (ii.c) standardizes the random vector $T^{-1/2}(X'\varepsilon)$ so that the asymptotic distribution is unit multivariate normal.
2. Assumption (iii.a) specifies the mean of the i.i.d. sequence $(X_{ti}X_{tj},\ i = 1, 2, \ldots, k;\ j = 1, 2, \ldots, k)$, and (iii.b) guarantees that its first moment exists, by the Cauchy-Schwarz inequality. Existence of the first moment is what is needed for the LLN for i.i.d. sequences. See p. 15 of Ch. 4.

Proof:
From these assumptions we have

$$\sqrt{T}\,\frac{X'\varepsilon}{T} = T^{-1/2}X'\varepsilon \xrightarrow{L} N\big(0, Var(T^{-1/2}X'\varepsilon)\big) \equiv N(0, V)$$

and

$$\frac{X'X}{T} = \frac{\sum_{t=1}^T x_t x_t'}{T} \xrightarrow{a.s.} E(x_t x_t') = M.$$

Therefore

$$\sqrt{T}(\hat{\beta} - \beta) \xrightarrow{L} M^{-1} \cdot N(0, V) \equiv N(0, M^{-1}VM^{-1}),$$

or

$$(M^{-1}VM^{-1})^{-1/2}\sqrt{T}(\hat{\beta} - \beta) \xrightarrow{L} N(0, I).$$

1.2 Independent Heterogeneously Distributed Observations

1.2.1 Consistency

To prove consistency of $\hat{\beta}$, we use the revised Markov law of large numbers of Ch. 4. Rewriting

$$\hat{\beta} - \beta = (X'X)^{-1}X'\varepsilon = \left(\frac{X'X}{T}\right)^{-1}\frac{X'\varepsilon}{T} = \left(\frac{\sum_{t=1}^T x_t x_t'}{T}\right)^{-1}\left(\frac{\sum_{t=1}^T x_t\varepsilon_t}{T}\right),$$

we have the following result.

Theorem:
In addition to (1), suppose
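The asymptotic normality result can also be checked by simulation. A sketch under illustrative assumptions: with $x_t \sim N(0, I_2)$ and $\varepsilon_t \sim N(0, 1)$ drawn independently, $M = I$ and $V = E(\varepsilon_t^2 x_t x_t') = I$, so $D = I$ and $\sqrt{T}(\hat{\beta} - \beta)$ should be approximately $N(0, I)$ across Monte Carlo replications (the sample size and replication count below are arbitrary choices):

```python
# Sketch: simulate the sampling distribution of sqrt(T)*(beta_hat - beta).
# Under the design below, M = I and V = I, so D = M^{-1} V M^{-1} = I
# and the statistic should be approximately N(0, I).
import numpy as np

rng = np.random.default_rng(1)
beta = np.array([1.0, -0.5])
T, R = 500, 2000                       # sample size, Monte Carlo replications

stats = np.empty((R, 2))
for r in range(R):
    X = rng.normal(size=(T, 2))
    eps = rng.normal(size=T)
    y = X @ beta + eps
    b = np.linalg.solve(X.T @ X, X.T @ y)
    stats[r] = np.sqrt(T) * (b - beta)

print(stats.mean(axis=0))              # should be close to (0, 0)
print(stats.var(axis=0))               # should be close to (1, 1)
```

The replication means are near zero and the replication variances near one, matching the unit multivariate normal limit predicted by the theorem for this design.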
(i). $\{(x_t', \varepsilon_t)'\}$ is an independent sequence;
(ii). (a) $E(x_t\varepsilon_t) = 0$;
(b) $E|X_{ti}\varepsilon_t|^{1+\delta} < \Delta < \infty$ for some $\delta > 0$, $i = 1, 2, \ldots, k$;
(iii). (a) $M_T \equiv E(X'X/T)$ is positive definite;
(b) $E|X_{ti}^2|^{1+\delta} < \Delta < \infty$ for some $\delta > 0$, $i = 1, 2, \ldots, k$.
Then $\hat{\beta} \xrightarrow{a.s.} \beta$.

Remark:
1. Assumption (ii.a) specifies the mean of the independent sequence $(X_{ti}\varepsilon_t,\ i = 1, 2, \ldots, k)$, and (ii.b) guarantees that its $(1+\delta)$ moment exists.
2. Assumption (iii.a) specifies the limit of the almost sure convergence of $\frac{X'X}{T}$, and (iii.b) guarantees that the $(1+\delta)$ moment of $(X_{ti}X_{tj},\ i = 1, 2, \ldots, k;\ j = 1, 2, \ldots, k)$ exists, by the Cauchy-Schwarz inequality. Existence of the $(1+\delta)$ moment is what is needed for the LLN for independent sequences. See p. 15 of Ch. 4.

Proof:
From these assumptions we have

$$\frac{X'\varepsilon}{T} = \frac{\sum_{t=1}^T x_t\varepsilon_t}{T} \xrightarrow{a.s.} E\left(\frac{\sum_{t=1}^T x_t\varepsilon_t}{T}\right) = 0$$

and

$$\frac{X'X}{T} = \frac{\sum_{t=1}^T x_t x_t'}{T} \xrightarrow{a.s.} E\left(\frac{\sum_{t=1}^T x_t x_t'}{T}\right) = M_T.$$

Therefore $\hat{\beta} - \beta \xrightarrow{a.s.} M_T^{-1} \cdot 0 = 0$, or $\hat{\beta} \xrightarrow{a.s.} \beta$.
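A simulation sketch of this heterogeneous case: the errors below are independent but not identically distributed (their variances cycle through a bounded set of values), so the bounded $(1+\delta)$-moment conditions of the theorem are satisfied while the i.i.d. assumption fails. The variance pattern, $\beta$, and sample sizes are all illustrative choices, not from the text:

```python
# Sketch: OLS consistency with independent, heterogeneously
# distributed (heteroskedastic) errors whose moments stay bounded.
import numpy as np

rng = np.random.default_rng(2)
beta = np.array([2.0, 0.5])

def ols_hetero(T):
    """OLS estimate from a sample with cycling error variances."""
    X = rng.normal(size=(T, 2))
    # error std differs across t but stays bounded: 0.5, 1.5, ..., 4.5
    sigma = 0.5 + (np.arange(T) % 5)
    eps = sigma * rng.normal(size=T)
    y = X @ beta + eps
    return np.linalg.solve(X.T @ X, X.T @ y)

errors = [np.max(np.abs(ols_hetero(T) - beta))
          for T in (200, 20_000, 2_000_000)]
print(errors)  # shrinks as T grows despite heteroskedasticity
```

The estimation error still vanishes as $T$ grows, as the theorem asserts; identical distributions are not needed, only independence and the bounded moment conditions.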