Introduction to Nonparametric Analysis in Time Series Econometrics
Yongmiao Hong, 2020
This is Chapter 6 of a manuscript entitled Modern Time Series Analysis: Theory and Applications, written by the author. We will introduce some popular nonparametric methods, particularly the kernel smoothing method and the local polynomial smoothing method, to estimate functions of interest in time series contexts, such as probability density functions, autoregression functions, spectral density functions, and generalized spectral density functions. Empirical applications of these functions crucially depend on their consistent estimation. We will discuss the large sample statistical properties of nonparametric estimators in various contexts.

Key words: Asymptotic normality, bias, boundary problem, consistency, curse of dimensionality, density function, generalized spectral density, global smoothing, integrated mean squared error, law of large numbers, local polynomial smoothing, local smoothing, locally stationary time series model, mean squared error, kernel method, regression function, series approximation, smoothing, spectral density function, Taylor series expansion, variance.

Reading Materials and References

This lecture note is self-contained. However, the following references will be useful for learning nonparametric analysis.

(1) Nonparametric Analysis in Time Domain

Silverman, B. (1986), Nonparametric Density Estimation and Data Analysis. Chapman and Hall: London.
Härdle, W. (1990), Applied Nonparametric Regression. Cambridge University Press: Cambridge.
Fan, J. and Q. Yao (2003), Nonlinear Time Series: Parametric and Nonparametric Methods. Springer: New York.

(2) Nonparametric Methods in Frequency Domain

Priestley, M. (1981), Spectral Analysis and Time Series. Academic Press: New York.
Hannan, E. (1970), Multiple Time Series. John Wiley: New York.
1 Motivation

Suppose {X_t} is a strictly stationary process with marginal probability density function g(x) and pairwise joint probability density function f_j(x, y), and a random sample {X_t}_{t=1}^T of size T is observed. Then,

- How to estimate the marginal pdf g(x) of {X_t}?
- How to estimate the pairwise joint pdf f_j(x, y) of (X_t, X_{t-j})?
- How to estimate the autoregression function r_j(x) = E(X_t | X_{t-j} = x)?
- How to estimate the spectral density h(ω) of {X_t}?
- How to estimate the generalized spectral density f(ω, u, v) of {X_t}?
- How to estimate the bispectral density b(ω₁, ω₂)?
- How to estimate a nonlinear autoregressive conditional heteroskedastic model

      X_t = μ(X_{t-1}, ..., X_{t-p}) + σ(X_{t-1}, ..., X_{t-q}) ε_t,   {ε_t} ~ i.i.d.(0, 1),

  where μ(·) and σ(·) are unknown functions of the past information? Under certain regularity conditions, μ(·) is the conditional mean of X_t given I_{t-1} = {X_{t-1}, X_{t-2}, ...}, and σ²(·) is the conditional variance of X_t given I_{t-1}.
- How to estimate a semi-nonparametric functional coefficient autoregressive process

      X_t = Σ_{j=1}^p α_j(X_{t-d}) X_{t-j} + ε_t,   E(ε_t | I_{t-1}) = 0 a.s.,

  where α_j(·) is unknown and d > 0 is a time lag parameter?
- How to estimate a nonparametric additive autoregressive process

      X_t = Σ_{j=1}^p μ_j(X_{t-j}) + ε_t,   E(ε_t | I_{t-1}) = 0 a.s.,

  where the μ_j(·) functions are unknown?
- How to estimate a locally linear time-varying regression model

      Y_t = X_t′ β(t/T) + ε_t,

  where β(·) is an unknown smooth deterministic function of time?
- How to use these estimators in economic and financial applications?

Nonparametric estimation is often called nonparametric smoothing, since a key parameter, called the smoothing parameter, is used to control the degree of smoothness of the estimated curve. Nonparametric smoothing first arose from spectral density estimation in time series analysis. In a discussion of the seminal paper by Bartlett (1946), Henry Daniels suggested that a possible improvement on spectral density estimation could be made by smoothing the periodogram (see Chapter 3), which is the squared modulus of the discrete Fourier transform of the random sample {X_t}_{t=1}^T. The theory and techniques were then systematically developed by Bartlett (1948, 1950). Thus, smoothing techniques were already prominently featured in time series analysis more than 70 years ago.

In the earlier stage of nonlinear time series analysis (see Tong (1990)), the focus was on various nonlinear parametric forms, such as threshold autoregressive models, smooth transition autoregressive models, and regime-switching Markov chain autoregressive models (see Chapter 8 for details). Recent interest has mainly been in nonparametric curve estimation, which does not require knowledge of the functional form beyond certain smoothness conditions on the underlying function of interest.

Question: Why is nonparametric smoothing popular in statistics and econometrics?

There are several reasons for the popularity of nonparametric analysis. In particular, three main reasons are:

- Demands for nonlinear approaches;
- Availability of large data sets;
- Advances in computer technology.

Indeed, as Granger (1999) points out, the speed of computing technology increases much faster than the speed at which data grow.

To obtain basic ideas about nonparametric smoothing methods, we now consider two examples: one is the estimation of a regression function, and the other is the estimation of a probability density function.
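Daniels' suggestion can be sketched numerically. The raw periodogram is an inconsistent estimator of the spectral density (its variance does not vanish as T grows), while averaging neighboring periodogram ordinates reduces that variance. The sketch below uses a simple Daniell-type (flat moving-average) smoother as one illustrative choice; the function names and the specific smoother are not from the text.

```python
import numpy as np

def periodogram(x):
    """Raw periodogram I(w_k) = |DFT(x)|^2 / (2*pi*T) at Fourier frequencies."""
    x = np.asarray(x, dtype=float)
    T = len(x)
    x = x - x.mean()                      # remove the sample mean first
    dft = np.fft.rfft(x)                  # DFT at nonnegative Fourier frequencies
    return np.abs(dft) ** 2 / (2 * np.pi * T)

def smoothed_periodogram(x, m=5):
    """Daniell-type smoothing: average over 2m+1 neighboring frequencies."""
    I = periodogram(x)
    kernel = np.ones(2 * m + 1) / (2 * m + 1)
    return np.convolve(I, kernel, mode="same")

# For i.i.d. noise the true spectral density is flat; the raw periodogram
# fluctuates wildly around it, while the smoothed version is far more stable.
rng = np.random.default_rng(0)
x = rng.standard_normal(512)
I_raw = periodogram(x)
I_smooth = smoothed_periodogram(x, m=5)
```

The variance reduction from averaging 2m+1 roughly independent ordinates is about a factor of 2m+1, at the cost of bias when the true spectral density is not flat over the averaging window — the bias-variance trade-off that recurs throughout this chapter.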
Example 1 [Regression Function]: Consider the first order autoregression function

    r₁(x) = E(X_t | X_{t-1} = x).

We can write

    X_t = r₁(X_{t-1}) + ε_t,

where E(ε_t | X_{t-1}) = 0 by construction. We assume E(X_t²) < ∞.

Suppose a sequence of bases {ψ_j(x)} constitutes a complete orthonormal basis for the space of square-integrable functions. Then we can always decompose the function

    r₁(x) = Σ_{j=0}^∞ α_j ψ_j(x),

where the Fourier coefficient

    α_j = ∫ r₁(x) ψ_j(x) dx,

which is the projection of r₁(x) on the basis function ψ_j(x).

Suppose r₁(x) is the quadratic function r₁(x) = x² for x ∈ [−π, π]. Then

    r₁(x) = π²/3 − 4[cos(x) − cos(2x)/2² + cos(3x)/3² − ···]
          = π²/3 − 4 Σ_{j=1}^∞ (−1)^{j−1} cos(jx)/j².

For another example, suppose the regression function is a step function, namely

    r₁(x) = −1 if −π < x < 0;  0 if x = 0;  1 if 0 < x < π.

Then we can still expand it as an infinite sum of periodic series,

    r₁(x) = (4/π)[sin(x) + sin(3x)/3 + sin(5x)/5 + ···]
          = (4/π) Σ_{j=0}^∞ sin[(2j+1)x]/(2j+1).
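Both expansions can be checked numerically by evaluating truncated partial sums; the helper names below are illustrative. The series for x² converges quickly (coefficients decay like 1/j²), while the step function's series converges slowly (coefficients decay like 1/j) and exhibits the Gibbs phenomenon near the jump at x = 0 — a preview of why the smoothness of the underlying function governs how many terms a series approximation needs.

```python
import math
import numpy as np

def quad_series(x, N):
    """Partial sum of the Fourier series of r1(x) = x^2 on [-pi, pi]."""
    j = np.arange(1, N + 1)
    return math.pi**2 / 3 - 4 * np.sum((-1.0) ** (j - 1) * np.cos(j * x) / j**2)

def step_series(x, N):
    """Partial sum of the Fourier series of the step (sign) function on [-pi, pi]."""
    j = np.arange(N)
    return (4 / math.pi) * np.sum(np.sin((2 * j + 1) * x) / (2 * j + 1))

# At x = pi/2 the quadratic series should approach (pi/2)^2 = pi^2/4;
# at x = 1 (interior to (0, pi)) the step series should approach 1.
approx_quad = quad_series(math.pi / 2, N=200)
approx_step = step_series(1.0, N=2000)
```

With 200 terms the quadratic partial sum is already accurate to roughly four decimal places, whereas the step series still needs thousands of terms for two-digit accuracy at points near the discontinuity.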