
    Time Series Solutions MTH 6139

1. (a) The lag 1 differencing operator $\nabla$ is defined by

$$\nabla X_t = X_t - X_{t-1} = (1 - B)X_t$$

and the lag $d$ differencing operator $\nabla_d$ by

$$\nabla_d X_t = X_t - X_{t-d} = (1 - B^d)X_t.$$

Let $m_t = \beta_0 + \beta_1 t + \cdots + \beta_k t^k$. Then, since $\nabla^k m_t = k!\,\beta_k$, the differenced series $\nabla^k X_t$ will have no trend. Similarly, since $\nabla_d s_t = s_t - s_{t-d} = 0$, the differenced series $\nabla_d X_t$ will have no seasonality. By combining these two operations, both the trend and seasonality can be removed from the time series $\{X_t\}$. [6]
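As a quick numerical check of this, the following sketch (plain NumPy; the quadratic trend, period-12 seasonal term and sample size are made-up illustrative choices, not taken from the question) applies $\nabla_d$ and then $\nabla^k$ to a synthetic series:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 240, 12, 2
t = np.arange(n)

# Hypothetical series: quadratic trend (k = 2), period-12 seasonality, noise.
m = 1.0 + 0.05 * t + 0.002 * t**2
s = np.sin(2 * np.pi * t / d)          # s_t = s_{t-d}, so nabla_d s_t = 0
x = m + s + rng.normal(0.0, 1.0, n)

# nabla_d: X_t - X_{t-d} removes the seasonality and reduces the
# polynomial trend from degree 2 to degree 1.
x1 = x[d:] - x[:-d]

# nabla applied k = 2 times: (1 - B)^2 removes the remaining trend.
y = np.diff(x1, n=k)

print(round(y.mean(), 3))              # close to 0: no trend or seasonality left
```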

(b) When $k = 1$, we may write

$$\nabla_d X_t = \nabla_d m_t + \nabla_d s_t + \nabla_d Y_t = \beta_0 + \beta_1 t - \beta_0 - \beta_1(t - d) + \nabla_d Y_t = \beta_1 d + \nabla_d Y_t.$$

For stationarity, we need to check that the expectation and variance are constant, and that the covariances do not depend on $t$. Clearly, $E(\nabla_d X_t) = \beta_1 d$, which does not depend on $t$. We also have

$$\operatorname{cov}(\nabla_d X_t, \nabla_d X_{t+\tau}) = \operatorname{cov}(\nabla_d Y_t, \nabla_d Y_{t+\tau}) = 2\gamma_Y(\tau) - \gamma_Y(\tau - d) - \gamma_Y(\tau + d).$$

Since this does not depend on $t$, the differenced series is stationary. [5]

(c) The first step of the classical decomposition method involves estimating the trend using a moving average filter of period length $d$. The seasonal effects are estimated in the next step by computing the averages of the detrended values and adjusting them so that the seasonal effects meet the model assumptions. Using the estimated seasonal effects, the time series $\{X_t\}$ is then deseasonalised and the trend re-estimated using the deseasonalised variables. In the final step, the residuals are calculated as the detrended and deseasonalised variables. [6]
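A minimal sketch of these steps in Python, assuming an additive model with even period $d$; the function name, the centred moving average with half weights at the ends, and the least-squares line used to re-estimate the trend are illustrative choices, not prescribed by the solution:

```python
import numpy as np

def classical_decomposition(x, d=12):
    """Sketch of the classical decomposition for even period d."""
    n = len(x)
    # Step 1: estimate the trend with a centred moving average of length d
    # (half weights on the two end points since d is even).
    w = np.r_[0.5, np.ones(d - 1), 0.5] / d
    trend = np.convolve(x, w, mode="valid")        # length n - d
    pad = d // 2
    idx = np.arange(pad, n - pad)                  # times where trend exists
    detrended = x[idx] - trend
    # Step 2: average the detrended values for each season and centre them
    # so that the seasonal effects sum to zero over a full period.
    s = np.array([detrended[(idx % d) == j].mean() for j in range(d)])
    s -= s.mean()
    seasonal = s[np.arange(n) % d]
    # Step 3: deseasonalise and re-estimate the trend (here by a simple
    # least-squares line; any smoother could be used instead).
    t = np.arange(n)
    b1, b0 = np.polyfit(t, x - seasonal, 1)
    trend2 = b0 + b1 * t
    # Step 4: residuals are the detrended and deseasonalised values.
    resid = x - trend2 - seasonal
    return trend2, seasonal, resid
```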


2. (a) First note that $E(Z_t) = 0$ for all $t$, and so $E(X_t) = 0$ for all $t$. Thus, we may write

$$\operatorname{cov}(X_t, X_{t+\tau}) = E\{(Z_t + \theta_1 Z_{t-1} + \theta_2 Z_{t-2})(Z_{t+\tau} + \theta_1 Z_{t+\tau-1} + \theta_2 Z_{t+\tau-2})\}$$
$$= E(Z_t Z_{t+\tau} + \theta_1 Z_t Z_{t+\tau-1} + \theta_2 Z_t Z_{t+\tau-2} + \theta_1 Z_{t-1} Z_{t+\tau} + \theta_1^2 Z_{t-1} Z_{t+\tau-1} + \theta_1\theta_2 Z_{t-1} Z_{t+\tau-2}$$
$$\quad + \theta_2 Z_{t-2} Z_{t+\tau} + \theta_1\theta_2 Z_{t-2} Z_{t+\tau-1} + \theta_2^2 Z_{t-2} Z_{t+\tau-2}).$$

Since the $Z_t$ are uncorrelated random variables with $\operatorname{var}(Z_t) = \sigma^2$, we obtain

$$\gamma(\tau) = \begin{cases} (1 + \theta_1^2 + \theta_2^2)\sigma^2 & \text{if } \tau = 0, \\ \theta_1(1 + \theta_2)\sigma^2 & \text{if } \tau = \pm 1, \\ \theta_2\sigma^2 & \text{if } \tau = \pm 2, \\ 0 & \text{if } |\tau| > 2. \end{cases}$$

    It follows that

$$\rho(\tau) = \begin{cases} 1 & \text{if } \tau = 0, \\ \dfrac{\theta_1(1 + \theta_2)}{1 + \theta_1^2 + \theta_2^2} & \text{if } \tau = \pm 1, \\ \dfrac{\theta_2}{1 + \theta_1^2 + \theta_2^2} & \text{if } \tau = \pm 2, \\ 0 & \text{if } |\tau| > 2. \end{cases}$$

For an MA($q$) process, $\rho(\tau)$ is not necessarily zero when $|\tau| \le q$ and $\rho(\tau) = 0$ when $|\tau| > q$. [12]
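A short numerical check of these formulas (plain NumPy; the parameter values $\theta_1 = 0.6$, $\theta_2 = 0.3$ and the sample size are hypothetical):

```python
import numpy as np

def ma2_acf(theta1, theta2, max_lag=5):
    """Theoretical ACF of X_t = Z_t + theta1*Z_{t-1} + theta2*Z_{t-2}."""
    denom = 1 + theta1**2 + theta2**2
    rho = np.zeros(max_lag + 1)
    rho[0] = 1.0
    rho[1] = theta1 * (1 + theta2) / denom
    rho[2] = theta2 / denom
    return rho                             # rho[tau] = 0 for tau > 2

# Compare with the sample ACF of a simulated MA(2) series.
rng = np.random.default_rng(1)
theta1, theta2 = 0.6, 0.3                  # hypothetical parameter values
z = rng.normal(0.0, 1.0, 10_000)
x = z[2:] + theta1 * z[1:-1] + theta2 * z[:-2]
x = x - x.mean()
sample = [np.dot(x[: len(x) - k], x[k:]) / np.dot(x, x) for k in range(6)]
print(ma2_acf(theta1, theta2))
print(np.round(sample, 3))                 # near zero for all lags > 2
```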

(b) The above MA(2) process is invertible if and only if

$$\theta(z) = 1 + \theta_1 z + \theta_2 z^2 = 0$$

only for $|z| > 1$. Solving this quadratic equation yields

$$z = \frac{-\theta_1 \pm \sqrt{\theta_1^2 - 4\theta_2}}{2\theta_2}.$$

The process is invertible for those values of $\theta_1$ and $\theta_2$ which satisfy $|z| > 1$. [6]

(c) The seasonal MA(2)$_h$ process is defined as

$$X_t = Z_t + \Theta_1 Z_{t-h} + \Theta_2 Z_{t-2h}.$$

It is invertible if and only if

$$\Theta(z^h) = 1 + \Theta_1 z^h + \Theta_2 z^{2h} = 0$$

only for $|z| > 1$. [4]


3. (a) In operator form, the process is

$$\phi(B)X_t = \theta(B)Z_t,$$

where $\phi(z) = 1 - \phi_1 z - \phi_2 z^2$ and $\theta(z) = 1 + \theta z$. It would be an ARMA(2, 1) process if the polynomials $\phi(z)$ and $\theta(z)$ have no common factors. Since the only root of $\theta(z)$ is $z = -1/\theta$, the condition for this is that $\phi(-1/\theta) = 1 + \phi_1/\theta - \phi_2/\theta^2 \ne 0$. [4]
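This check is a one-liner numerically; a sketch (the function name and the second set of test values are hypothetical):

```python
def is_arma_2_1(phi1, phi2, theta, tol=1e-12):
    """True if phi(z) = 1 - phi1*z - phi2*z**2 and theta(z) = 1 + theta*z
    share no common factor, i.e. the root z = -1/theta of theta(z) is
    not also a root of phi(z)."""
    z = -1.0 / theta
    return abs(1.0 - phi1 * z - phi2 * z**2) > tol

print(is_arma_2_1(0.3, 0.4, 0.9))    # parameters from part (b): True
print(is_arma_2_1(0.3, 0.1, -0.5))   # common factor (1 - z/2): False
```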

(b) The linear process form of the time series is

$$X_t = \sum_{j=0}^{\infty} \psi_j Z_{t-j} = \psi(B)Z_t, \qquad \text{where } \psi(B) = \sum_{j=0}^{\infty} \psi_j B^j.$$

Thus, in terms of polynomials in $z$, we may write

$$(1 - \phi_1 z - \phi_2 z^2)(\psi_0 + \psi_1 z + \psi_2 z^2 + \cdots) = 1 + \theta z.$$

Equating the coefficients of $z^j$, $j = 0, 1, \ldots$, we obtain $\psi_0 = 1$, $\psi_1 = \phi_1 + \theta$ and

$$\psi_j = \phi_1 \psi_{j-1} + \phi_2 \psi_{j-2}$$

for $j \ge 2$. When $\phi_1 = 0.3$, $\phi_2 = 0.4$ and $\theta = 0.9$, the roots of $\phi(z)$ are $z_1 = -2$ and $z_2 = 5/4$. So the general solution to this second-order difference equation is

$$\psi_j = c_1 z_1^{-j} + c_2 z_2^{-j},$$

where $c_1$ and $c_2$ can be obtained from the initial conditions. In this case, the initial conditions $\psi_0 = 1$ and $\psi_1 = 1.2$ yield the equations

$$c_1 + c_2 = 1 \qquad \text{and} \qquad -\frac{1}{2}c_1 + \frac{4}{5}c_2 = 1.2.$$

These give $c_1 = -4/13$ and $c_2 = 17/13$. Thus, we have

$$\psi_j = -\frac{4}{13}(-2)^{-j} + \frac{17}{13}\left(\frac{5}{4}\right)^{-j}$$

for $j \ge 2$. [15]
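The closed form can be verified against the recursion directly (plain NumPy; the number of weights computed is arbitrary):

```python
import numpy as np

phi1, phi2, theta = 0.3, 0.4, 0.9

# psi-weights by the recursion psi_0 = 1, psi_1 = phi1 + theta,
# psi_j = phi1*psi_{j-1} + phi2*psi_{j-2} for j >= 2.
psi = np.empty(10)
psi[0], psi[1] = 1.0, phi1 + theta
for j in range(2, 10):
    psi[j] = phi1 * psi[j - 1] + phi2 * psi[j - 2]

# Closed form from the roots z1 = -2, z2 = 5/4 of phi(z).
j = np.arange(10)
closed = -(4 / 13) * (-2.0) ** (-j) + (17 / 13) * (5 / 4) ** (-j)
print(np.allclose(psi, closed))   # True
```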

(c) The difference equations in terms of the autocorrelation function for an ARMA(2, 1) process are given by

$$\rho(\tau) - \phi_1 \rho(\tau - 1) - \phi_2 \rho(\tau - 2) = 0$$

for $\tau \ge 2$, with initial conditions

$$\rho(\tau) - \phi_1 \rho(\tau - 1) - \phi_2 \rho(\tau - 2) = \frac{\sigma^2}{\gamma(0)}\,(\theta_\tau \psi_0 + \theta_{\tau+1} \psi_1)$$

for $0 \le \tau < 2$, where $\theta_0 = 1$, $\theta_1 = \theta$ and $\theta_j = 0$ if $j > 1$. The autocorrelation function tails off for this process. [5]


4. (a) There are three cases to consider: (i) $|\phi| < 1$; (ii) $|\phi| = 1$; and (iii) $|\phi| > 1$. In cases (i) and (iii), an AR(1) process is stationary, since $X_t$ can be expressed as a linear combination of the $Z_t$. It is only causal in case (i), because $X_t$ depends on future values of $Z_t$ in case (iii). In case (ii), an AR(1) process reduces to a random walk, which is neither stationary nor causal. However, the first difference of a random walk is stationary, since it is just white noise. [6]

(b) The difference equation in terms of the autocorrelation function for a causal AR(1) process is

$$\rho(\tau) - \phi\rho(\tau - 1) = 0$$

for $\tau \ge 1$, with initial condition $\rho(0) = 1$. Thus, we can write

$$\rho(\tau) = \phi\rho(\tau - 1) = \phi^2\rho(\tau - 2) = \cdots = \phi^\tau\rho(0) = \phi^\tau$$

for $\tau = 0, 1, \ldots$. Since $\rho(-\tau) = \rho(\tau)$ for all $\tau$, the autocorrelation function is

$$\rho(\tau) = \phi^{|\tau|}$$

for $\tau = 0, \pm 1, \pm 2, \ldots$. The partial autocorrelation function is $\phi_{11} = \phi$ and $\phi_{\tau\tau} = 0$ for $\tau > 1$. [8]
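A small simulation makes the geometric decay visible (plain NumPy; the value $\phi = 0.7$ and the sample size are hypothetical, and the start-up transient is ignored for simplicity):

```python
import numpy as np

rng = np.random.default_rng(2)
phi, n = 0.7, 20_000

# Simulate a causal AR(1): X_t = phi*X_{t-1} + Z_t.
x = np.empty(n)
x[0] = rng.normal()
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

x = x - x.mean()
for tau in range(5):
    sample = np.dot(x[: n - tau], x[tau:]) / np.dot(x, x)
    print(tau, round(phi**tau, 3), round(float(sample), 3))  # rho(tau) = phi**tau
```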

(c) The best linear predictor of $X_{n+1}$ based on $X_1, \ldots, X_n$ is

$$\hat{X}^{(n)}_{n+1} = \phi X_n.$$

The Yule-Walker estimators are

$$\hat{\phi} = \hat{\rho}(1) \qquad \text{and} \qquad \hat{\sigma}^2 = \hat{\gamma}(0)\left(1 - \hat{\rho}^2(1)\right),$$

where $\hat{\rho}(1)$ is the sample autocorrelation at lag 1 and $\hat{\gamma}(0)$ is the sample variance. So the estimated predictor is

$$\hat{X}^{(n)}_{n+1} = \hat{\rho}(1) X_n$$

and

$$\hat{X}^{(n)}_{n+1} \pm 1.96\,\hat{\sigma}$$

is an approximate 95% prediction interval. [6]
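A minimal sketch of these formulas in code, assuming a zero-mean series (the function name and the simulated input are hypothetical):

```python
import numpy as np

def ar1_one_step(x, z=1.96):
    """Yule-Walker fit of a zero-mean causal AR(1) and an approximate
    95% one-step prediction interval, following the formulas above."""
    x = np.asarray(x, dtype=float)
    gamma0 = np.dot(x, x) / len(x)                # sample variance
    rho1 = np.dot(x[:-1], x[1:]) / np.dot(x, x)   # lag-1 sample autocorrelation
    sigma2_hat = gamma0 * (1 - rho1**2)           # Yule-Walker noise variance
    pred = rho1 * x[-1]                           # estimated predictor rho-hat(1)*X_n
    half = z * np.sqrt(sigma2_hat)
    return pred, (pred - half, pred + half)

# Usage (hypothetical data):
rng = np.random.default_rng(3)
print(ar1_one_step(rng.normal(size=500)))
```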


5. (a) The sample ACF of $y_t = \nabla x_t$ cuts off after lag 1 and the sample PACF tails off with the same sign. This suggests that $y_t$ can be modelled as an MA(1) process with a negative value of the moving average parameter. Hence, we have $p = 0$, $q = 1$, and, since the $x_t$ were differenced once, $d = 1$. This means that the time series belongs to the ARIMA(0, 1, 1) class. [8]

(b) The suggested model for $X_t$ is

$$\nabla X_t = Z_t + \theta Z_{t-1},$$

which can be written as

$$(1 - B)X_t = (1 + \theta B)Z_t,$$

where $\{Z_t\} \sim WN(0, \sigma^2)$ and $\theta < 0$. [3]
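Such a model is straightforward to simulate, since the first differences form an MA(1); a sketch (the value $\theta = -0.6$ and the sample size are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(4)
theta, n = -0.6, 5_000                 # hypothetical value with theta < 0

# (1 - B)X_t = (1 + theta*B)Z_t: simulate the MA(1) differences and
# cumulate them to obtain X_t.
z = rng.normal(0.0, 1.0, n + 1)
dx = z[1:] + theta * z[:-1]
x = np.cumsum(dx)

# Sample ACF of the differences: cuts off after lag 1 with a negative
# lag-1 value, as in part (a); theory gives rho(1) = theta/(1 + theta**2).
d = dx - dx.mean()
print([round(float(np.dot(d[: n - k], d[k:]) / np.dot(d, d)), 2) for k in range(4)])
```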

(c) The ACF, PACF, normal plot and histogram of the residuals can be used to assess whether the residuals behave like a Gaussian white noise process. If this is the case, the sample autocorrelations at lags $\tau \ne 0$ will be approximately zero and lie within the approximate 95% confidence bounds, the normal plot will show a straight line and the histogram will be approximately bell-shaped. The Ljung-Box-Pierce Q statistic can also be used to test whether groups of autocorrelations are zero. If the null hypothesis is rejected, then there are correlations in the data that the model has not accounted for. [6]
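A sketch of the Q statistic, $Q = n(n+2)\sum_{k=1}^{h} \hat{\rho}^2(k)/(n-k)$, compared against a $\chi^2$ distribution with $h - (p+q)$ degrees of freedom (the function name, the choice $h = 20$ and the simulated residuals are hypothetical; SciPy is assumed for the chi-square tail probability):

```python
import numpy as np
from scipy.stats import chi2

def ljung_box(resid, h=20, fitted_params=1):
    """Ljung-Box-Pierce Q statistic for the first h residual
    autocorrelations, with a chi-square(h - fitted_params) p-value."""
    r = np.asarray(resid, dtype=float)
    r = r - r.mean()
    n = len(r)
    rho = np.array([np.dot(r[: n - k], r[k:]) / np.dot(r, r)
                    for k in range(1, h + 1)])
    q = n * (n + 2) * np.sum(rho**2 / (n - np.arange(1, h + 1)))
    pval = chi2.sf(q, df=h - fitted_params)
    return q, pval

# Usage (hypothetical residuals): white noise should give a large p-value,
# so the null of zero autocorrelations is not rejected.
rng = np.random.default_rng(5)
print(ljung_box(rng.normal(size=1_000)))
```

For the ARIMA(0, 1, 1) model above, `fitted_params=1` matches $p + q = 1$.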
