Time Series Basics
Fin250f: Lecture 8.1
Spring 2010
Reading: Brooks, chapter 5.1-5.7
Outline
- Linear stochastic processes
- Autoregressive process
- Moving average process
- Lag operator
- Model identification: PACF/ACF, information criteria
Stochastic Processes
$Y_t:\ (y_1, y_2, y_3, \ldots, y_T)$
$E(Y_t \mid y_{t-1}, y_{t-2}, \ldots) = E(Y_t \mid \Omega_{t-1})$
Time Series Definitions
- Strictly stationary
- Covariance stationary
- Uncorrelated
- White noise
Strictly Stationary
All distributional features are independent of time
$F(y_t, y_{t-1}, \ldots, y_{t-m})$ and $E(Y_t \mid y_{t-1}, \ldots, y_{t-m})$ independent of time
Weak or Covariance Stationary
Variances and covariances independent of time
$E(y_t) = \mu$
$E[(y_t - \mu)(y_t - \mu)] = \sigma^2 < \infty$
$E[(y_t - \mu)(y_{t+j} - \mu)] = \gamma_j$
Autocorrelation
$\tau_j = \frac{\gamma_j}{\gamma_0}$
White Noise
$E(y_t) = \mu$
$E[(y_t - \mu)(y_t - \mu)] = \sigma^2 < \infty$
$E[(y_t - \mu)(y_{t+j} - \mu)] = 0, \quad j > 0$
White Noise in Words
- Weakly stationary
- All autocovariances are zero
- Not necessarily independent
Time Series Estimates
$\hat{\gamma}_j = \frac{1}{T-j} \sum_{t=1}^{T-j} (y_t - \mu)(y_{t+j} - \mu)$
$\hat{\tau}_j = \frac{\hat{\gamma}_j}{\hat{\gamma}_0}$
White noise: $\hat{\tau}_j \sim N(0,\ 1/T)$
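These estimators translate directly into code. A minimal numpy sketch (the function names and seed are illustrative, not from the text):

```python
import numpy as np

def sample_autocov(y, j):
    """gamma_hat_j = 1/(T-j) * sum_{t=1}^{T-j} (y_t - mu)(y_{t+j} - mu)."""
    T = len(y)
    mu = y.mean()
    return ((y[:T - j] - mu) * (y[j:] - mu)).sum() / (T - j)

def sample_autocorr(y, j):
    """tau_hat_j = gamma_hat_j / gamma_hat_0."""
    return sample_autocov(y, j) / sample_autocov(y, 0)

# Under white noise, tau_hat_j ~ N(0, 1/T)
y = np.random.default_rng(0).standard_normal(500)
print(sample_autocorr(y, 1), "vs. 2 s.e. =", 1.96 / np.sqrt(len(y)))
```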
Ljung-Box Statistic
$Q^* = T(T+2) \sum_{k=1}^{m} \frac{\hat{\tau}_k^2}{T-k} \sim \chi^2_m$
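A direct implementation of $Q^*$, assuming the conventional $1/T$ divisor for $\hat{\gamma}_k$ (illustrative; statsmodels' acorr_ljungbox runs the same test):

```python
import numpy as np
from scipy.stats import chi2

def ljung_box(y, m):
    """Q* = T(T+2) * sum_{k=1}^m tau_hat_k^2/(T-k); ~ chi^2_m under white noise."""
    T = len(y)
    z = y - y.mean()
    gamma0 = (z ** 2).sum() / T
    tau = np.array([(z[:T - k] * z[k:]).sum() / T / gamma0
                    for k in range(1, m + 1)])
    q = T * (T + 2) * (tau ** 2 / (T - np.arange(1, m + 1))).sum()
    return q, chi2.sf(q, df=m)   # statistic and p-value

y = np.random.default_rng(1).standard_normal(500)
print(ljung_box(y, m=10))        # p-value should be large for white noise
```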
Linear Stochastic Processes
- Linear models
- Time series dependence
- Common econometric frameworks
- Engineering background
Autoregressive Process, Order 1: AR(1)
$y_t = \mu + \phi y_{t-1} + u_t$
AR(1) Properties
$E(y_t) = \mu + \phi E(y_{t-1}) = \mu + \phi E(y_t)$
$E(y_t) = \frac{\mu}{1-\phi}$
$E_t(y_{t+1}) = \mu + \phi y_t$
More AR(1) Properties ($\mu = 0$)
$y_t = \phi y_{t-1} + u_t$
$E(y_t^2) = E[(\phi y_{t-1} + u_t)(\phi y_{t-1} + u_t)]$
$E(y_t^2) = E(\phi^2 y_{t-1}^2) + 2E(u_t \phi y_{t-1}) + \sigma_u^2$
$E(y_t^2) = \phi^2 E(y_t^2) + \sigma_u^2$
$E(y_t^2) = \frac{\sigma_u^2}{1-\phi^2} = \mathrm{var}(y_t)$
More AR(1) Properties ($\mu = 0$)
$y_t = \phi y_{t-1} + u_t$
$E(y_t y_{t-1}) = E[(\phi y_{t-1} + u_t)\, y_{t-1}]$
$\gamma_1 = E(y_t y_{t-1}) = \phi \sigma_y^2$
$\tau_1 = \frac{E(y_t y_{t-1})}{\sigma_y^2} = \phi$
$\tau_j = \phi^j$
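These moments are easy to check by simulation. A sketch with arbitrary parameter values ($\phi = 0.8$, $\sigma_u = 1$):

```python
import numpy as np

rng = np.random.default_rng(42)
phi, sigma_u, T = 0.8, 1.0, 100_000

# Zero-mean AR(1): y_t = phi*y_{t-1} + u_t
u = rng.normal(0.0, sigma_u, T)
y = np.empty(T)
y[0] = u[0]
for t in range(1, T):
    y[t] = phi * y[t - 1] + u[t]

print(y.var(), "vs.", sigma_u**2 / (1 - phi**2))        # var(y_t)
print(np.corrcoef(y[:-1], y[1:])[0, 1], "vs.", phi)     # tau_1 = phi
print(np.corrcoef(y[:-3], y[3:])[0, 1], "vs.", phi**3)  # tau_3 = phi^3
```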
AR(1): Zero mean form
$y_{t+1} = \mu + \phi y_t + u_{t+1}$
$E(y_{t+1}) = \frac{\mu}{1-\phi} = \bar{y}$
$(y_{t+1} - \bar{y}) = \phi (y_t - \bar{y}) + u_{t+1}$
AR(m) (Order m)
$y_t = \mu + \sum_{j=1}^{m} \phi_j y_{t-j} + u_t$
Moving Average Process of Order 1, MA(1)
$y_t = \mu + \theta u_{t-1} + u_t$
MA(1) Properties
$y_t = \mu + \theta u_{t-1} + u_t$
$E(y_t) = \mu$
$E_t(y_{t+1}) = \mu + \theta u_t$
$E(y_t - \mu)^2 = E[(\theta u_{t-1} + u_t)(\theta u_{t-1} + u_t)]$
$\mathrm{var}(y_t) = (1 + \theta^2)\sigma_u^2$
$\gamma_1 = E[(y_t - \mu)(y_{t-1} - \mu)] = E[(\theta u_{t-1} + u_t)(\theta u_{t-2} + u_{t-1})] = \theta \sigma_u^2$
$\tau_1 = \mathrm{cor}(y_t, y_{t-1}) = \frac{\theta}{1+\theta^2}$
$\tau_j = 0, \quad j \ge 2$
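The same simulation check works for the MA(1) moments above (arbitrary $\theta = 0.9$, $\sigma_u = 1$):

```python
import numpy as np

rng = np.random.default_rng(7)
theta, sigma_u, T = 0.9, 1.0, 100_000

# MA(1) with mu = 0: y_t = theta*u_{t-1} + u_t
u = rng.normal(0.0, sigma_u, T + 1)
y = theta * u[:-1] + u[1:]

print(y.var(), "vs.", (1 + theta**2) * sigma_u**2)                      # var(y_t)
print(np.corrcoef(y[:-1], y[1:])[0, 1], "vs.", theta / (1 + theta**2))  # tau_1
print(np.corrcoef(y[:-2], y[2:])[0, 1], "vs. 0")                        # tau_2
```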
MA(q)
$y_t = \mu + \sum_{j=1}^{q} \theta_j u_{t-j} + u_t$
$\mathrm{var}(y_t) = (1 + \theta_1^2 + \ldots + \theta_q^2)\sigma_u^2$
$\mathrm{cov}(y_t, y_{t-j}) = (\theta_j + \theta_{j+1}\theta_1 + \ldots + \theta_q \theta_{q-j})\sigma_u^2$
$\mathrm{cov}(y_t, y_{t-j}) = 0, \quad j > q$
Stationarity
- Process not exploding
- For AR(1): $|\phi| < 1$
- All finite MA's are stationary
- More complex beyond AR(1)
AR(1) → MA(∞)
$y_t = \phi y_{t-1} + u_t$
$y_{t-1} = \phi y_{t-2} + u_{t-1}$
$y_t = \phi(\phi y_{t-2} + u_{t-1}) + u_t$
$y_t = \phi^2 y_{t-2} + \phi u_{t-1} + u_t$
Repeating the substitution $m$ times:
$y_t = \phi^m y_{t-m} + \sum_{j=0}^{m-1} \phi^j u_{t-j}$
With $|\phi| < 1$, the first term vanishes as $m \to \infty$:
$y_t = \sum_{j=0}^{\infty} \phi^j u_{t-j}$
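A quick numerical check of the substitution argument: the truncated MA sum reproduces the AR(1) recursion up to an error of order $\phi^m$. A sketch (parameter values are arbitrary):

```python
import numpy as np

phi, T, m = 0.8, 5_000, 50
rng = np.random.default_rng(3)
u = rng.standard_normal(T)

# Exact AR(1) recursion (started at y_0 = u_0)
y = np.empty(T)
y[0] = u[0]
for t in range(1, T):
    y[t] = phi * y[t - 1] + u[t]

# Truncated MA representation: y_t ~ sum_{j=0}^{m-1} phi^j u_{t-j}
w = phi ** np.arange(m)
y_ma = np.empty(T)
for t in range(T):
    k = min(m, t + 1)
    y_ma[t] = w[:k] @ u[t - k + 1 : t + 1][::-1]

print(np.abs(y - y_ma).max())  # tiny: truncation error is O(phi^m)
```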
Lag Operator (L)
$L y_t = y_{t-1}$
$L^k y_t = y_{t-k}$
$L^k \mu = \mu$
Using the Lag Operator (Mean adjusted form)
$y_t - \bar{y} = \phi (y_{t-1} - \bar{y}) + u_t$
$y_t - \bar{y} = \phi L (y_t - \bar{y}) + u_t$
$(1 - \phi L)(y_t - \bar{y}) = u_t$
An important feature of L
$y_t = \phi y_{t-1} + u_t$
$(1 - \phi L) y_t = u_t$
$y_t = \frac{1}{1 - \phi L} u_t$
$y_t = \sum_{j=0}^{\infty} \phi^j L^j u_t = \sum_{j=0}^{\infty} (\phi L)^j u_t$
since
$\frac{1}{1 - \phi L} = \sum_{j=0}^{\infty} (\phi L)^j$
MA(1) → AR(∞)
$y_t = \mu + \theta u_{t-1} + u_t$
$y_t - \mu = (1 + \theta L) u_t$
$\frac{1}{1 + \theta L} (y_t - \mu) = u_t$
$\sum_{j=0}^{\infty} (-\theta L)^j (y_t - \mu) = u_t$
MA → AR
$y_t - \mu = -\sum_{j=1}^{\infty} (-\theta)^j (y_{t-j} - \mu) + u_t$
$y_t - \mu = \sum_{j=1}^{\infty} (-1)^{j-1} \theta^j (y_{t-j} - \mu) + u_t$
$|\theta| < 1$: "invertibility"
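The inversion can be checked numerically as well: a truncated version of this AR(∞) recovers the MA(1) shocks when $|\theta| < 1$. A sketch (arbitrary $\theta$, truncation length m):

```python
import numpy as np

theta, T, m = 0.9, 5_000, 150
rng = np.random.default_rng(11)
u = rng.standard_normal(T)

# MA(1) with mu = 0, started at y_0 = u_0
y = np.empty(T)
y[0] = u[0]
y[1:] = theta * u[:-1] + u[1:]

# Truncated AR(inf): u_t ~ y_t - sum_{j=1}^m (-1)^(j-1) theta^j y_{t-j}
w = (-1.0) ** np.arange(m) * theta ** np.arange(1, m + 1)  # theta, -theta^2, ...
u_hat = np.array([y[t] - w @ y[t - m : t][::-1] for t in range(m, T)])

print(np.abs(u_hat - u[m:]).max())  # tiny when |theta| < 1
```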
AR's and MA's
- Can convert any stationary AR to an infinite MA with exponentially declining weights
- Can only convert "invertible" MA's to AR's
- Stationarity and invertibility: easy to check for AR(1) and MA(1); more difficult for larger models
Combining AR and MA: ARMA(p,q) (more later)
$y_t = \mu + \sum_{i=1}^{p} \phi_i y_{t-i} + \sum_{j=1}^{q} \theta_j u_{t-j} + u_t$
Modeling Procedures: Box/Jenkins
- Identification: determine structure. How many lags? AR, MA, or ARMA? Tricky.
- Estimation: estimate the parameters
- Residual diagnostics
- Next section: forecast performance and evaluation
Identification Tools
- Diagnostics: ACF, partial ACF
- Information criteria
- Forecasts
Autocorrelation
$\hat{\gamma}_j = \frac{1}{T-j} \sum_{t=1}^{T-j} (y_t - \mu)(y_{t+j} - \mu)$
$\hat{\tau}_j = \frac{\hat{\gamma}_j}{\hat{\gamma}_0}$
White noise: $\hat{\tau}_j \sim N(0,\ 1/T)$
95% bands: $[-1.96\sqrt{1/T},\ 1.96\sqrt{1/T}]$
Partial Autocorrelation
- Correlation between $y_t$ and $y_{t-k}$ after removing all smaller (< k) correlations
- Marginal forecast impact of $y_{t-k}$ given all earlier information
Partial Autocorrelation
$y_t = \mu + \beta_{1,1} y_{t-1} + u_t$
$y_t = \mu + \beta_{1,2} y_{t-1} + \beta_{2,2} y_{t-2} + u_t$
$y_t = \mu + \beta_{1,3} y_{t-1} + \beta_{2,3} y_{t-2} + \beta_{3,3} y_{t-3} + u_t$
$\mathrm{pacf} = [\beta_{1,1}, \beta_{2,2}, \beta_{3,3}, \ldots]$
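This regression definition translates directly into code. A minimal OLS sketch with numpy (illustrative; statsmodels also ships a pacf function with more options):

```python
import numpy as np

def pacf_by_regression(y, max_lag):
    """PACF(k) = last coefficient (beta_{k,k}) in an OLS regression of
    y_t on a constant and y_{t-1}, ..., y_{t-k}."""
    out = []
    T = len(y)
    for k in range(1, max_lag + 1):
        X = np.column_stack([np.ones(T - k)] +
                            [y[k - i : T - i] for i in range(1, k + 1)])
        beta, *_ = np.linalg.lstsq(X, y[k:], rcond=None)
        out.append(beta[-1])
    return np.array(out)

# For an AR(1) with phi = 0.9, the PACF should be ~ (0.9, 0, 0)
rng = np.random.default_rng(5)
u = rng.standard_normal(10_000)
y = np.empty(10_000)
y[0] = u[0]
for t in range(1, 10_000):
    y[t] = 0.9 * y[t - 1] + u[t]
print(pacf_by_regression(y, 3))
```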
For an AR(1)
$y_t = \mu + \phi y_{t-1} + u_t$
$\mathrm{ACF}(j) = \tau_j = \phi^j$
$\mathrm{PACF}(1) = \phi, \quad \mathrm{PACF}(>1) = 0$
[Figure: sample ACF/PACF for an AR(1) with $\phi = 0.9$]
For an MA(1)
$y_t = \mu + \theta u_{t-1} + u_t$
$\mathrm{ACF}(1) = \tau_1 = \frac{\theta}{1+\theta^2}$
$\mathrm{ACF}(>1) = 0$
$\mathrm{PACF} = \mathrm{AR}(\infty)$:
$y_t = \sum_{j=1}^{\infty} (-1)^{j-1} \theta^j y_{t-j} + u_t$
$\mathrm{PACF} = (\theta, -\theta^2, \theta^3, -\theta^4, \ldots)$
[Figure: sample ACF/PACF for an MA(1) with $\theta = 0.9$]
General Features
- Autoregressive: decaying ACF; PACF drops to zero beyond model order (p)
- Moving average: decaying PACF; ACF drops to zero beyond model order (q)
- Don't count on things looking so good in practice (see the sketch below)
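In practice these patterns are read off sample ACF/PACF plots. A sketch using statsmodels' plotting helpers on a simulated AR(1) (the series and seed are illustrative):

```python
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

rng = np.random.default_rng(8)
u = rng.standard_normal(1_000)
y = np.empty(1_000)
y[0] = u[0]
for t in range(1, 1_000):
    y[t] = 0.9 * y[t - 1] + u[t]   # AR(1), phi = 0.9

fig, axes = plt.subplots(2, 1)
plot_acf(y, lags=20, ax=axes[0])   # should decay like 0.9^j
plot_pacf(y, lags=20, ax=axes[1])  # should cut off after lag 1
plt.show()
```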
Information Criteria
- Akaike (AIC)
- Schwarz Bayesian criterion (SBIC)
- Hannan-Quinn (HQIC)
Objective:
- Penalize model errors
- Penalize model complexity
- Favor simple, accurate models
Information Criteria
$k$ = number of parameters
$\mathrm{AIC} = \log(\hat{\sigma}^2) + \frac{2k}{T}$
$\mathrm{SBIC} = \log(\hat{\sigma}^2) + \frac{k}{T}\log(T)$
$\mathrm{HQIC} = \log(\hat{\sigma}^2) + \frac{2k}{T}\log(\log(T))$
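A sketch computing all three criteria from a model's residuals, following the formulas above (the function name is illustrative):

```python
import numpy as np

def info_criteria(resid, k):
    """AIC/SBIC/HQIC as defined above; resid = model residuals,
    k = number of estimated parameters."""
    T = len(resid)
    log_s2 = np.log(np.mean(np.asarray(resid) ** 2))
    return {"AIC":  log_s2 + 2 * k / T,
            "SBIC": log_s2 + k * np.log(T) / T,
            "HQIC": log_s2 + 2 * k * np.log(np.log(T)) / T}
```

Pick the specification that minimizes the chosen criterion; note that for any reasonable sample size SBIC penalizes extra parameters hardest and AIC most lightly, with HQIC in between.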
Estimation
- Autoregressive (AR): OLS; biased (downward), but consistent, and approaches a normal distribution for large T
- Moving average (MA) and ARMA: numerical estimation procedures, built into many packages (e.g., the Matlab econometrics toolbox); see the sketch below
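A sketch of numerical ARMA estimation in Python with statsmodels, as an alternative to the Matlab toolbox (the ARIMA class with d = 0 fits an ARMA by maximum likelihood; the simulated series and coefficients are illustrative):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Simulated ARMA(1,1): y_t = 0.7*y_{t-1} + 0.3*u_{t-1} + u_t
rng = np.random.default_rng(9)
u = rng.standard_normal(2_000)
y = np.empty(2_000)
y[0] = u[0]
for t in range(1, 2_000):
    y[t] = 0.7 * y[t - 1] + 0.3 * u[t - 1] + u[t]

res = ARIMA(y, order=(1, 0, 1)).fit()  # (p, d, q); d = 0 gives an ARMA(1,1)
print(res.params)                      # const, AR(1), MA(1), sigma^2
print(res.aic, res.bic, res.hqic)      # information criteria
```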
Residual Diagnostics
- Get model residuals (forecast errors)
- Run this time series through various diagnostics: ACF, PACF, Ljung-Box, plots
- Should be white noise (no structure); see the sketch below
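Continuing the estimation sketch above: pull the residuals from the fitted model and test them for remaining structure (acorr_ljungbox is statsmodels' Ljung-Box test; the simulated series is the same illustrative one):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

# Re-create the simulated ARMA(1,1) series from the estimation sketch
rng = np.random.default_rng(9)
u = rng.standard_normal(2_000)
y = np.empty(2_000)
y[0] = u[0]
for t in range(1, 2_000):
    y[t] = 0.7 * y[t - 1] + 0.3 * u[t - 1] + u[t]

resid = ARIMA(y, order=(1, 0, 1)).fit().resid
# Large p-values => no remaining autocorrelation (residuals ~ white noise)
print(acorr_ljungbox(resid, lags=[10], return_df=True))
```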