Introduction Moving Average (MA) Models Autoregressive (AR) Models ARMA Models Estimations Takeaways
Week 6: Quantitative Analysis of Financial Markets
Modeling Cycles: MA, AR, and ARMA Models
Christopher Ting
Christopher Ting
http://www.mysmu.edu/faculty/christophert/
Email: [email protected] · Tel: 6828 0364 · Office: LKCSB 5036
November 15, 2017
Christopher Ting QF 603 November 15, 2017 1/28
Lesson Plan
1 Introduction
2 Moving Average (MA) Models
3 Autoregressive (AR) Models
4 ARMA Models
5 Estimations
6 Takeaways
Introduction
• Be aware that we're approximating a more complex reality by models.
• The key to successful time series modeling and forecasting is parsimonious, yet accurate, approximation of the Wold representation.
• Three approximations of linear time series: moving average (MA) models, autoregressive (AR) models, and autoregressive moving average (ARMA) models.
• Each of these models is characterized by its autocorrelation function and related quantities, under the assumption that the model is "true."
• We use sample autocorrelations and partial autocorrelations, in conjunction with the AIC and the SIC, to suggest candidate forecasting models.
QA-13 Modeling Cycles: MA, AR, and ARMA Models
Chapter 8, Francis X. Diebold, Elements of Forecasting, 4th Edition (Mason, Ohio: Cengage Learning, 2006).
• Describe the properties of the first-order moving average (MA(1)) process, and distinguish between autoregressive representation and moving average representation.
• Describe the properties of a general finite-order moving average process of order q (MA(q)).
• Describe the properties of the first-order autoregressive (AR(1)) process, and define and explain the Yule-Walker equation.
QA-13 Modeling Cycles: MA, AR, and ARMA Models (cont'd)
• Describe the properties of a general p-th order autoregressive (AR(p)) process.
• Define and describe the properties of the autoregressive moving average (ARMA) process.
• Describe the application of AR and ARMA processes.
The MA(1) Process
• The first-order moving average, or MA(1), process is
  yt = εt + θεt−1 = (1 + θL)εt, where εt ∼ WN(0, σ²).
• The current value of the observed series is expressed as a function of the current and lagged unobservable shocks.
• Think of it as a regression model with nothing but current and lagged disturbances on the right-hand side.
• The structure of the MA(1) process, in which only the first lag of the shock appears on the right, forces it to have a very short memory, and hence weak dynamics, regardless of the parameter value.
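The short memory can be seen in a quick numerical illustration. Below is a minimal Python sketch (the values θ = 0.5, σ = 1, and T = 100,000 are arbitrary illustrative choices, not from the slides) that simulates an MA(1) path and computes its sample mean and variance:

```python
import random

def simulate_ma1(theta, sigma, T, seed=42):
    """Draw y_t = eps_t + theta * eps_{t-1} with eps_t ~ N(0, sigma^2)."""
    rng = random.Random(seed)
    eps = [rng.gauss(0.0, sigma) for _ in range(T + 1)]
    # Each y_t involves only the current shock and one lagged shock.
    return [eps[t] + theta * eps[t - 1] for t in range(1, T + 1)]

y = simulate_ma1(theta=0.5, sigma=1.0, T=100_000)
mean = sum(y) / len(y)
var = sum((v - mean) ** 2 for v in y) / len(y)
# For theta = 0.5, sigma = 1: E(y_t) = 0 and V(y_t) = sigma^2 (1 + theta^2) = 1.25
```

With a long sample, the estimates land close to the unconditional moments derived on the next slide.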
Mean and Variance of MA(1)
• Unconditional mean:
  E(yt) = E(εt) + θ E(εt−1) = 0.
• Conditional mean, given the information set Ωt−1:
  E(yt | Ωt−1) = E(εt + θεt−1 | Ωt−1) = θεt−1.
• Unconditional variance:
  V(yt) = V(εt) + θ² V(εt−1) = σ² + θ²σ² = σ²(1 + θ²).
• Conditional variance:
  V(yt | Ωt−1) = E((yt − E(yt | Ωt−1))² | Ωt−1) = E(εt² | Ωt−1) = σ².
Autocorrelation function
• Autocovariance:
  γ(τ) = E(yt yt−τ) = E((εt + θεt−1)(εt−τ + θεt−τ−1))
       = θσ², if τ = 1; 0, otherwise.
• Autocorrelation function:
  ρ(τ) = γ(τ)/γ(0) = θ/(1 + θ²), if τ = 1; 0, otherwise.
• The key feature here is the sharp cutoff in the autocorrelations: all autocorrelations are zero beyond displacement 1.
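The cutoff shows up clearly in simulated data. A minimal sketch (θ = 0.5 is an arbitrary illustrative value): the sample autocorrelation at displacement 1 should be near θ/(1 + θ²) = 0.4, and near zero at every displacement beyond 1.

```python
import random

def sample_acf(y, max_lag):
    """Sample autocorrelations rho(1), ..., rho(max_lag) of a series."""
    T = len(y)
    mean = sum(y) / T
    d = [v - mean for v in y]
    g0 = sum(v * v for v in d) / T
    return [sum(d[t] * d[t - tau] for t in range(tau, T)) / T / g0
            for tau in range(1, max_lag + 1)]

# Simulate an MA(1): y_t = eps_t + theta * eps_{t-1}
theta = 0.5
rng = random.Random(0)
eps = [rng.gauss(0.0, 1.0) for _ in range(100_001)]
y = [eps[t] + theta * eps[t - 1] for t in range(1, 100_001)]

acf = sample_acf(y, 4)   # acf[0] near 0.4; acf[1:] near zero
```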
Autoregressive Representation
• Write the MA(1) shock as an innovation:
  εt = yt − θεt−1.
• Lagging by successively more periods gives expressions for the innovations at various dates:
  εt−1 = yt−1 − θεt−2,
  εt−2 = yt−2 − θεt−3,
  and so on.
• Making use of these expressions for lagged innovations, we can substitute backward in the MA(1) process, yielding
  yt = εt + θyt−1 − θ²yt−2 + θ³yt−3 − ⋯.
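The back substitution can be verified numerically. In this sketch (θ = 0.6 and T = 500 are arbitrary choices), running the recursion εt = yt − θεt−1 forward, started from a zero pre-sample shock, recovers the original shocks exactly:

```python
import random

theta = 0.6
rng = random.Random(1)
eps = [rng.gauss(0.0, 1.0) for _ in range(500)]

# Build the MA(1) series, taking the pre-sample shock eps_{-1} = 0
y = [eps[0]] + [eps[t] + theta * eps[t - 1] for t in range(1, 500)]

# Invert it recursively: eps_t = y_t - theta * eps_{t-1}
recovered = []
prev = 0.0
for v in y:
    prev = v - theta * prev
    recovered.append(prev)
# recovered matches eps to floating-point precision
```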
Infinite Autoregressive Representation
• A convergent autoregressive representation exists only if |θ| < 1, because in the back substitution we raise θ to progressively higher powers.
• Invertibility condition: the inverse of the root of the moving average lag operator polynomial must be less than one in absolute value.
• The MA(1) lag operator "polynomial"
  1 + θL = 0
  has one root, L = −1/θ.
• MA(1) invertible representation:
  1/(1 + θL) yt = εt.
Remarks
• The requirements of covariance stationarity (constant unconditional mean, constant and finite unconditional variance, autocorrelation that depends only on displacement) are met for any MA(1) process, regardless of the values of its parameters.
• After all, an MA process is a linear combination of current and past noise terms.
• For MA(1), if |θ| < 1, then we say that the MA(1) process is invertible.
• Autoregressive representation: a current shock and lagged observable values of the series on the right.
• Moving average representation: a current shock and lagged unobservable shocks on the right.
Partial Autocorrelation Function of MA(1)
• Because of the autoregressive representation
  yt = εt + θyt−1 − θ²yt−2 + θ³yt−3 − ⋯,
  the partial autocorrelation function decays gradually, described as a damped oscillation.
• The larger |θ| is, the slower the decay.
MA(q) Process
• The general finite-order moving average process of order q, MA(q) for short, is
  yt = εt + θ1εt−1 + ⋯ + θqεt−q =: Θ(L)εt.
• Θ(L) = 1 + θ1L + ⋯ + θqL^q is a q-th-order lag operator polynomial.
• By allowing for more lags of the shock on the right side of the equation, the MA(q) process can capture richer dynamic patterns.
• The condition for invertibility of the MA(q) process is that the inverses of all of the roots of Θ(L) must be inside the unit circle, in which case we have the autoregressive representation
  1/Θ(L) yt = εt.
Properties of MA(q)
• The conditional mean of the MA(q) process evolves with the information set, in contrast to the unconditional moments, which are fixed.
• In fact, the conditional mean depends on q lags of the innovation ⟹ potential for longer memory.
• All autocorrelations beyond displacement q are zero. This autocorrelation cutoff is a distinctive property of moving average processes.
• The partial autocorrelation function of the MA(q) process, in contrast, decays gradually, in accord with the infinite autoregressive representation, in either an oscillating or a one-sided fashion, depending on the parameters of the process.
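The cutoff at displacement q follows from the standard MA(q) autocovariance formula γ(τ) = σ² Σᵢ θi θ(i+τ) (with θ0 = 1 and the sum running over the overlapping coefficients), which is identically zero for τ > q. A sketch for an illustrative MA(2) with hypothetical coefficients θ1 = 0.5, θ2 = 0.3:

```python
def ma_acf(thetas, max_lag):
    """Theoretical autocorrelations of an MA(q) with coefficients
    theta_1..theta_q; sigma^2 cancels out of the ratio gamma(tau)/gamma(0)."""
    c = [1.0] + list(thetas)          # theta_0 = 1
    q = len(c) - 1

    def gamma(tau):
        if tau > q:
            return 0.0                # the sharp cutoff beyond displacement q
        return sum(c[i] * c[i + tau] for i in range(q - tau + 1))

    g0 = gamma(0)
    return [gamma(tau) / g0 for tau in range(1, max_lag + 1)]

# An MA(2): rho(1), rho(2) nonzero; rho(3), rho(4), rho(5) exactly zero
acf = ma_acf([0.5, 0.3], 5)
```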
MA(q) and Wold Representation
• Recall the Wold representation yt = B(L)εt, where B(L) is of infinite order.
• The MA(q) process approximates the infinite moving average with a finite-order moving average,
  yt = Θ(L)εt.
• By contrast, the MA(1) process approximates the infinite moving average with only a first-order moving average, which can be very restrictive.
AR(1) Process
• The first-order autoregressive process, AR(1) for short, is
  yt = φyt−1 + εt, εt ∼ WN(0, σ²).
• In lag operator form, we write (1 − φL)yt = εt.
• Substituting backward for lagged y's on the right side, we obtain
  yt = εt + φεt−1 + φ²εt−2 + ⋯ = 1/(1 − φL) εt.
• This moving average representation for y is convergent if and only if |φ| < 1.
• Equivalently, the condition for covariance stationarity is that the inverse of the root of the autoregressive lag operator polynomial be less than one in absolute value.
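Convergence of the moving average representation can be checked by truncation. A sketch with the arbitrary illustrative values φ = 0.8 and σ² = 1: the variance implied by the truncated representation, σ²(1 + φ² + φ⁴ + ⋯), approaches the closed form σ²/(1 − φ²) anticipated on the next slide.

```python
phi, sigma2 = 0.8, 1.0

# Truncate the moving average representation y_t = sum_i phi^i eps_{t-i}
coeffs = [phi ** i for i in range(200)]

# Variance of a sum of independent shocks: sum of squared coefficients
var_trunc = sigma2 * sum(c * c for c in coeffs)
var_closed = sigma2 / (1 - phi ** 2)
# For |phi| < 1 the truncation error vanishes geometrically
```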
Mean-Variance Analysis of AR(1) Process
• Unconditional mean:
  E(yt) = E(εt) + φ E(εt−1) + φ² E(εt−2) + ⋯ = 0.
• Conditional mean:
  E(yt | yt−1) = E(φyt−1 + εt | yt−1) = φyt−1 + 0 = φyt−1.
• Unconditional variance:
  V(yt) = V(εt + φεt−1 + φ²εt−2 + ⋯) = σ² + φ²σ² + φ⁴σ² + ⋯ = σ²/(1 − φ²).
• Conditional variance:
  V(yt | yt−1) = V(φyt−1 + εt | yt−1) = φ² V(yt−1 | yt−1) + V(εt | yt−1) = σ².
Autocovariance and Autocorrelation of AR(1)
• Multiplying both sides of yt = φyt−1 + εt by yt−τ, we obtain
  yt yt−τ = φ yt−1 yt−τ + εt yt−τ.
• For τ ≥ 1, taking expectations of both sides gives the Yule-Walker equation:
  γ(τ) = φ γ(τ − 1).
• It is a recursive equation: starting from
  γ(0) = σ²/(1 − φ²),
  we obtain γ(1) = φγ(0), γ(2) = φ²γ(0), and in general γ(τ) = φ^τ γ(0).
• Dividing through by γ(0) gives the autocorrelations:
  ρ(τ) = φ^τ, τ = 0, 1, 2, 3, ….
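A quick simulation check (φ = 0.7 and T = 100,000 are arbitrary choices): sample autocorrelations of a simulated AR(1) path should track the geometric decay φ^τ predicted by the Yule-Walker recursion.

```python
import random

phi = 0.7
rng = random.Random(7)
y, prev = [], 0.0
for _ in range(100_000):
    prev = phi * prev + rng.gauss(0.0, 1.0)   # y_t = phi y_{t-1} + eps_t
    y.append(prev)

mean = sum(y) / len(y)
d = [v - mean for v in y]
g0 = sum(v * v for v in d) / len(d)

def rho_hat(tau):
    """Sample autocorrelation at displacement tau."""
    return sum(d[t] * d[t - tau] for t in range(tau, len(d))) / len(d) / g0

estimates = [rho_hat(tau) for tau in (1, 2, 3)]
theory = [phi ** tau for tau in (1, 2, 3)]   # 0.7, 0.49, 0.343
```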
Partial Autocorrelation Function of AR(1)
• The partial autocorrelation function of the AR(1) process is
  p(τ) = φ, if τ = 1; 0, if τ > 1.
• The partial autocorrelation function of the AR(1) process cuts off abruptly.
The AR(p) Process
• The general p-th order autoregressive process, or AR(p) for short, is
  yt = φ1yt−1 + φ2yt−2 + ⋯ + φpyt−p + εt, εt ∼ WN(0, σ²).
• In lag operator form, we write
  Φ(L)yt = (1 − φ1L − φ2L² − ⋯ − φpL^p) yt = εt.
• An AR(p) process is covariance stationary if and only if the inverses of all roots of the autoregressive lag operator polynomial Φ(L) are inside the unit circle.
• A necessary (but not sufficient) condition for covariance stationarity is
  φ1 + φ2 + ⋯ + φp < 1.
AR(2)
• An AR(2) example:
  yt = 1.5yt−1 − 0.9yt−2 + εt.
• The corresponding lag operator polynomial is 1 − 1.5L + 0.9L², with two complex conjugate roots, 0.83 ± 0.65i.
• Class Activity: Prove that ρ(1) = φ1/(1 − φ2).
  (Hint: yt yt−1 = φ1 y²t−1 + φ2 yt−2 yt−1 + εt yt−1.)
• In general, for τ = 2, 3, …,
  ρ(τ) = φ1 ρ(τ − 1) + φ2 ρ(τ − 2).
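The quoted roots can be reproduced with the quadratic formula; a sketch using Python's cmath, which also confirms that the inverses of the roots lie inside the unit circle, so the example is covariance stationary:

```python
import cmath

# Roots of the AR(2) lag polynomial 1 - 1.5 L + 0.9 L^2 = 0
a, b, c = 0.9, -1.5, 1.0
disc = cmath.sqrt(b * b - 4 * a * c)          # discriminant is negative here
roots = [(-b + disc) / (2 * a), (-b - disc) / (2 * a)]
# roots are approximately 0.83 +/- 0.65i, as stated on the slide

# Stationarity check: inverse moduli of the roots must be less than one
inv_moduli = [1 / abs(r) for r in roots]
```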
AR(p) and the Wold Representation
• The moving average representation associated with the AR(1) process is
  yt = 1/(1 − φL) εt.
• This representation is of infinite order, as is the Wold representation, but it does not have infinitely many free coefficients. In fact, only one parameter, φ, underlies it.
• The moving average representation associated with the AR(p) process is
  yt = 1/Φ(L) εt.
ARMA(1,1)
• The autoregressive moving average process ARMA(1,1) is defined as
  yt = φyt−1 + εt + θεt−1, εt ∼ WN(0, σ²).
• In lag operator form,
  (1 − φL)yt = (1 + θL)εt,
  where |φ| < 1 is required for stationarity and |θ| < 1 is required for invertibility.
Representations of ARMA(1,1) Process
• If the covariance stationarity condition is satisfied, then we have the moving average representation
  yt = (1 + θL)/(1 − φL) εt.
• If the invertibility condition is satisfied, then we have the infinite autoregressive representation
  (1 − φL)/(1 + θL) yt = εt.
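The moving average representation can be expanded explicitly. In this sketch (φ = 0.6 and θ = 0.3 are arbitrary illustrative values), multiplying (1 + θL) by the geometric expansion of 1/(1 − φL) gives coefficients c0 = 1 and cj = φ^(j−1)(φ + θ) for j ≥ 1:

```python
phi, theta = 0.6, 0.3
n = 10

# Expand (1 + theta L) / (1 - phi L) as (1 + theta L) * sum_j phi^j L^j
geo = [phi ** j for j in range(n)]
expanded = [geo[0]] + [geo[j] + theta * geo[j - 1] for j in range(1, n)]

# Closed form: c_0 = 1, c_j = phi^(j-1) * (phi + theta) for j >= 1
closed = [1.0] + [phi ** (j - 1) * (phi + theta) for j in range(1, n)]
```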
ARMA(p, q) Process
• The autoregressive moving average process ARMA(p, q) is defined as
  yt = φ1yt−1 + ⋯ + φpyt−p + εt + θ1εt−1 + ⋯ + θqεt−q, εt ∼ WN(0, σ²).
• In lag operator form,
  Φ(L)yt = Θ(L)εt,
  where
  Φ(L) = 1 − φ1L − φ2L² − ⋯ − φpL^p;
  Θ(L) = 1 + θ1L + θ2L² + ⋯ + θqL^q.
Estimating MA(1) Model
• Proposition: an invertible moving average can be approximated as a finite-order autoregression.
• Proof: substitute yt = μ + εt + θεt−1 backward m times to obtain the autoregressive approximation
  yt ≈ μ/(1 + θ) + θyt−1 − θ²yt−2 + ⋯ + (−1)^(m+1) θ^m yt−m + εt.
• Estimation model:
  (μ̂, θ̂) = argmin over μ, θ of Σ_{t=1}^T [ yt − ( μ/(1 + θ) + θyt−1 − θ²yt−2 + ⋯ + (−1)^(m+1) θ^m yt−m ) ]².
• The parameter estimates are to be found using numerical optimization methods.
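A sketch of this estimation (assumptions: μ = 0, a true θ of 0.5, truncation at m = 15 lags, and a crude grid search standing in for a proper numerical optimizer; all of these choices are illustrative, not from the slides):

```python
import random

# Simulate an MA(1) with mu = 0: y_t = eps_t + theta * eps_{t-1}
theta_true = 0.5
rng = random.Random(3)
eps = [rng.gauss(0.0, 1.0) for _ in range(4001)]
y = [eps[t] + theta_true * eps[t - 1] for t in range(1, 4001)]

def sse(theta, m=15):
    """Sum of squared one-step errors of the truncated autoregressive
    approximation y_t ~ theta y_{t-1} - theta^2 y_{t-2} + ... (mu = 0)."""
    w = [((-1) ** (i + 1)) * theta ** i for i in range(1, m + 1)]
    total = 0.0
    for t in range(m, len(y)):
        pred = sum(w[i] * y[t - 1 - i] for i in range(m))
        total += (y[t] - pred) ** 2
    return total

# Grid search over the invertible region (-0.9, 0.9) in steps of 0.02
grid = [i / 50 for i in range(-45, 46)]
theta_hat = min(grid, key=sse)
```

A real application would hand `sse` to a numerical optimizer instead of a grid, but the objective function is the same.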
Estimating AR(1) Model
• Autoregressions can be conveniently estimated by ordinary least squares regression of
  yt = c + φyt−1 + εt,
  where c = μ(1 − φ) and μ is the mean of yt.
• Least squares:
  (ĉ, φ̂) = argmin over c, φ of Σ_{t=1}^T [ yt − c − φyt−1 ]².
• The implied estimate of μ is μ̂ = ĉ/(1 − φ̂).
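A sketch of the least squares estimation in closed form (c = 0.5, φ = 0.8, and T = 50,000 are arbitrary illustrative values; the true mean is μ = c/(1 − φ) = 2.5):

```python
import random

# Simulate y_t = c + phi * y_{t-1} + eps_t, started at its mean
c_true, phi_true = 0.5, 0.8
rng = random.Random(11)
y, prev = [], c_true / (1 - phi_true)
for _ in range(50_000):
    prev = c_true + phi_true * prev + rng.gauss(0.0, 1.0)
    y.append(prev)

# Closed-form OLS of y_t on a constant and y_{t-1}
x, z = y[:-1], y[1:]
n = len(x)
mx, mz = sum(x) / n, sum(z) / n
sxx = sum((v - mx) ** 2 for v in x)
sxz = sum((x[i] - mx) * (z[i] - mz) for i in range(n))
phi_hat = sxz / sxx
c_hat = mz - phi_hat * mx
mu_hat = c_hat / (1 - phi_hat)   # implied estimate of the mean
```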
Takeaways
• Duality between MA and AR
• Unconditional mean and variance are constant.
• Conditional mean is changing; conditional variance is constant.
• Condition for covariance stationarity for AR
• Condition for invertibility
• Yule-Walker equation
• Autocorrelation function vs. partial autocorrelation function