Periodic Seasonal Time Series Models with applications to U.S. macroeconomic data


ISBN 978 90 3610 246 9

Cover design: Crasborn Graphic Designers bno, Valkenburg a.d. Geul

This book is no. 503 of the Tinbergen Institute Research Series, established through

cooperation between Thela Thesis and the Tinbergen Institute. A list of books which

already appeared in the series can be found in the back.


VRIJE UNIVERSITEIT

Periodic Seasonal Time Series Models with applications to U.S. macroeconomic data

DOCTORAL DISSERTATION

to obtain the degree of Doctor from

the Vrije Universiteit Amsterdam,

by authority of the rector magnificus

prof.dr. L.M. Bouter,

to be defended in public

before the doctoral committee

of the Faculty of Economics and Business Administration

on Wednesday 7 September 2011 at 11.45

in the aula of the university,

De Boelelaan 1105

by

Anastasia Irma Widyanti Hindrayanto

born in Surabaya, Indonesia


supervisor: prof.dr. S.J. Koopman

co-supervisor: dr. M. Ooms


To my parents


Contents

1 Introduction

2 Periodic unobserved components (PUC) models
   2.1 Introduction
   2.2 Unobserved components models
   2.3 Periodic unobserved components models
       2.3.1 Periodic basic structural time series model
       2.3.2 Convenient state space representation of periodic BSM
       2.3.3 Periodic stochastic cycle model
       2.3.4 Simulation study on the periodic cycle component
       2.3.5 Periodic stochastic cycles and seasonal adjustment
       2.3.6 Parameter estimation and signal extraction
   2.4 Application to U.S. unemployment data
       2.4.1 Periodic UC model for U.S. unemployment series
       2.4.2 Cycle variance moderation in U.S. unemployment series
       2.4.3 Forecasting weights and forecasting performance
   2.5 Summary and conclusion
   2.6 Appendices

3 Multivariate PUC models
   3.1 Introduction
   3.2 Multivariate periodic unobserved components models
       3.2.1 Periodic specification
       3.2.2 Estimation, testing and signal extraction
   3.3 Data description: U.S. employment series
   3.4 Empirical results for U.S. employment sectors
       3.4.1 Model specification within the class of MPUC models
       3.4.2 UC decomposition in the final model
       3.4.3 Residual diagnostics
   3.5 Comparison with the Krane and Wascher study
   3.6 Summary and conclusion
   3.7 Appendix

4 Periodic SARIMA models
   4.1 Introduction
   4.2 Periodic SARIMA models in state space representation
   4.3 Initialisation of periodic SARIMA models
   4.4 Empirical illustration: U.S. unemployment
       4.4.1 Data analysis
       4.4.2 Periodic SARIMA model for U.S. unemployment series
       4.4.3 Periodic UC model for U.S. unemployment series
       4.4.4 Estimation and forecast results
   4.5 Summary and conclusion
   4.6 Appendix

5 Frequency-specific trigonometric seasonality
   5.1 Introduction
   5.2 Frequency specific basic structural model
   5.3 Properties of FS-BSM and measuring distance to FS-AM
       5.3.1 Stationary form of FS-BSM
       5.3.2 Dynamic properties of a FS-BSM
       5.3.3 Distance between FS models using MA coefficients
       5.3.4 Monte-Carlo studies for frequency specific models
   5.4 Empirical results
       5.4.1 Two examples comparing BSM and FS-BSM
       5.4.2 BSM vs FS-BSM in 75 time series
       5.4.3 How close are these (FS-) models to each other?
       5.4.4 Model fit comparison
   5.5 Summary and conclusions
   5.6 Appendices

6 Conclusions

Bibliography

Samenvatting (Summary in Dutch)

Acknowledgements

Chapter 1

Introduction

Many time series encountered in econometrics and in natural sciences display regular seasonal fluctuations. In the analysis of such series, the seasonal variation can be directly incorporated in the model, or removed from the series by seasonal adjustment methods which can implicitly or explicitly rely on seasonal models. This dissertation describes and analyses several types of periodic seasonal time series models and develops a new formulation which facilitates parameter estimation in the state space representation of the model.

In periodic time series analysis, a set of (usually annual) time series is simultaneously analysed, and each individual series is exclusively related to a particular season. This approach contrasts with the more widely adopted approach of treating a series as a single stochastic process with seasonal fluctuations (modelled, for example, with seasonal dummy variables). Concentrating on a time series that is associated with a specific season, such that it does not possess seasonal dynamics, may circumvent some of the intricate aspects of modelling seasonal variations.

Periodic approaches have been investigated in the context of autoregressive moving average (ARMA) models and dynamic econometric models; see for example the books of Ghysels and Osborn (2001) and Franses and Paap (2004). Other researchers have investigated the periodic approach with various state space time series model formulations, e.g., Krane and Wascher (1999), Koopman and Ooms (2002, 2006), Proietti (2004), Bell (2004) and Penzer and Tripodis (2007).

In this thesis we further explore the periodic analysis of both autoregressive moving average (ARMA) models and unobserved components (UC) time series models. The ARMA model is well known and is covered in many textbooks on time series. The UC model decomposes a time series into components of interest, including trend, seasonal, cycle and irregular; see Harvey (1989), Durbin and Koopman (2001) and Commandeur and Koopman (2007). We mainly focus on the analysis of seasonal macroeconomic time series, where we extend the UC models by having periodic coefficients that are associated with the different components. Our contribution for periodic ARMA models lies in a convenient state space formulation of these models that does not require (seasonal) differencing of the time series beforehand.

Throughout the thesis, we pay special attention to identification of the parameters before we start the estimation procedure. Further, we show that exact maximum likelihood estimation is feasible despite the large number of parameters that are typically encountered in this class of periodic models. We also link periodic and non-periodic model formulations by means of likelihood-ratio tests, as these models are nested. Empirical illustrations are given using monthly U.S. unemployment time series for the univariate periodic models (Chapters 2 and 4) and quarterly U.S. employment time series for the multivariate periodic models (Chapter 3).

A distinctive characteristic of periodic analysis is that we need either to repeat a univariate analysis for each season or to take a multivariate approach by modelling the S time series simultaneously, where S is the seasonal length. In this thesis, we advocate the use of a univariate representation with time-varying parameters for periodic models. We demonstrate in Chapters 2 and 3 that this approach is identical to the multivariate representation of periodic models with fixed parameters. The advantage of using the univariate representation lies in the possibility of expanding the model for several time series together, so that we obtain multivariate periodic models as explained in Chapter 3.

Chapter 5 of this thesis describes a special case of the periodic UC model in which the basic structural model (BSM) is generalised by allowing the time-varying trigonometric terms associated with different seasonal frequencies to have different variances. We term the resulting class of models the frequency specific basic structural model (FS-BSM). The extended set of parameters in FS-BSM models can be estimated by standard maximum likelihood procedures based on the Kalman filter. We explore the dynamic properties of the FS-BSM and relate them to those of the frequency specific Airline model (FS-AM) as described in Aston et al. (2007).

The periodicity in the FS-BSM is imposed solely on the seasonal component, so that each seasonal frequency has its own variance. This particular model has been discussed in detail by Harvey (1989) for the case of quarterly series. We extend the analysis of this model to monthly series and compare the estimation performance of the FS-BSMs with their FS-AM counterparts. The relationships between coefficients in FS-AMs and FS-BSMs are highly non-linear and we rely on numerical techniques to investigate the connection between these models. We find empirically that both generalised models have properties that are very close to each other and that FS-BSMs lead to similar time series decompositions for a specific range of models in the FS-AM class. For this investigation, we employ a U.S. Census Bureau database of seasonal time series that has been analysed previously with the FS-AMs.


The remainder of this thesis contains four chapters. The highlights of each chapter are as follows.

Chapter 2: Periodic unobserved components (PUC) models

This chapter introduces a general class of periodic unobserved components (PUC) time series models with stochastic trend and seasonal components and with a novel periodic stochastic cycle component. The general state space formulation of the periodic model allows for exact maximum likelihood estimation, signal extraction and forecasting. The consequences for model-based seasonal adjustment are also discussed. The new periodic model is applied to post-war monthly U.S. unemployment series, from which we identify a significant periodic stochastic cycle. A detailed empirical periodic analysis is presented, including a comparison between the performances of periodic unobserved components models and their non-periodic counterparts.

Chapter 3: Multivariate PUC models

This chapter analyses quarterly post-war U.S. non-farm employment for seven industrial sectors based on a new multivariate periodic unobserved components time series model. It includes sectoral trend, seasonal and irregular components with periodic features and parameter adjustments for the period of the Great Moderation. The main feature of our multivariate model is the common cycle component that enables the detection of possibly periodic cyclical behaviour in the U.S. employment rate. The investigation of periodicity in the cycle component of the time series is relevant since seasonal adjustment procedures need to take account of periodic features when they are present. We find that periodicity does exist in the sectoral trend and seasonal components but we do not detect it in the common cycle component. We compare our findings with an earlier empirical study on this subject. We investigate the robustness of our findings in detail by considering univariate and multivariate variations of our model and by presenting estimation results for sub-samples of our dataset.

Chapter 4: Periodic SARIMA models

In this chapter, state space formulations for periodic seasonal autoregressive integrated moving average (periodic SARIMA) models are considered. Convenient state space representations of the periodic SARIMA models are proposed to facilitate model identification, specification and exact maximum likelihood estimation of the periodic parameters. These formulations do not require a priori (seasonal) differencing of the time series. The time-varying state space representation is an attractive alternative to the time-invariant vector representation of periodic models, which typically leads to a high-dimensional state vector in monthly periodic time series models. A key development is our method of computing the variance-covariance matrix of the initial set of observations, which is required for exact maximum likelihood estimation. We illustrate the use of periodic SARIMA models to fit and forecast a monthly post-war U.S. unemployment time series.

Chapter 5: Frequency-specific trigonometric seasonality

The basic structural time series model has been designed for the modelling and forecasting of seasonal time series. In this chapter we explore a generalisation of this model in which the time-varying trigonometric terms associated with different seasonal frequencies have different variances in their disturbances. The extended set of parameters is estimated by maximum likelihood procedures based on the Kalman filter. The contribution of this chapter is two-fold. The first is an elaborate description of the dynamic properties of this frequency specific basic structural model. The second is the relationship of this model to a comparable generalised version of the Airline model developed at the U.S. Census Bureau. By adopting a quadratic distance metric based on the restricted reduced form moving-average representation of the models, we conclude that the generalised models have properties that are very close to each other compared to their default counterparts. In some settings, the distance between the models is virtually zero, so that the models can be regarded as observationally equivalent. An extensive empirical study on disaggregated monthly shipment and foreign trade series illustrates both the relations between the two classes of models and the improvements obtained by adopting the frequency-specific extensions.


Chapter 2

Periodic unobserved components (PUC) models

This chapter is based on Koopman, Ooms, and Hindrayanto (2009).

2.1 Introduction

In this chapter we focus on periodic extensions of the univariate unobserved components (UC) time series models that are reviewed in Harvey (1989). This set of linear models describes a time series process as a sum of components that can be interpreted as trend, season and cycle. Each component is specified as an independent linear dynamic process. For example, the trend component can be associated with a random walk process while the cycle component can be modelled as a stationary ARMA process. In addition to lag polynomial coefficients, the parameters consist of variances associated with the disturbances that drive the random walk and ARMA processes. When these parameters are allowed to be deterministic functions of the season index, the resulting UC models are referred to as periodic unobserved components (PUC) models. Estimation procedures for the PUC models are based on exact maximum likelihood using computationally efficient state space methods.
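As an illustration of such state space methods (a minimal Python sketch of our own, not the implementation used in this thesis), the exact log-likelihood of the simplest component model, a random walk observed with noise, follows from the prediction error decomposition computed by a scalar Kalman filter:

```python
import numpy as np

def local_level_loglik(y, sigma2_eps, sigma2_eta, a1=0.0, p1=1e7):
    """Kalman-filter log-likelihood of y_t = mu_t + eps_t, mu_{t+1} = mu_t + eta_t,
    with an approximately diffuse initial state (large p1)."""
    a, p = a1, p1
    loglik = 0.0
    for yt in y:
        f = p + sigma2_eps                 # prediction-error variance
        v = yt - a                         # one-step-ahead prediction error
        loglik += -0.5 * (np.log(2 * np.pi * f) + v * v / f)
        k = p / f                          # Kalman gain
        a = a + k * v                      # updated/predicted state
        p = p * (1 - k) + sigma2_eta       # predicted state variance
    return loglik

rng = np.random.default_rng(0)
mu = np.cumsum(rng.normal(0, 0.5, 200))    # random walk level, var(eta) = 0.25
y = mu + rng.normal(0, 1.0, 200)           # irregular with var(eps) = 1
# the likelihood at the true variances beats a badly misspecified pair
assert local_level_loglik(y, 1.0, 0.25) > local_level_loglik(y, 0.01, 25.0)
```

Maximising this function over the variance parameters is what "exact maximum likelihood using state space methods" amounts to in the scalar case; the models in this chapter use the same recursion with matrix-valued states.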

Given the general concepts of both UC and periodic models, there are many ways to specify a PUC model. If the PUC model is represented as a vector of independent time series where each element represents a particular season, the seasonal component is effectively eliminated. Therefore, the seasonal process cannot be identified from an observed time series when the remaining components are periodically independent for all seasons. Since we are particularly concerned with the decomposition of a time series into trend, seasonal and cycle, we propose a convenient periodic formulation of the UC model that preserves the ability to extract a seasonal component from a time series. These PUC models can still be represented in the vector form of Gladysev (1961) and Tiao and Grupe (1980), but the components do not consist of periodically independent processes since each component remains linearly dependent on a common underlying stochastic process for all periods. We argue that the seasonal component can still be identified in a PUC model and we show how to estimate it. The specification of the periodic model remains linear and no modifications of the Kalman filter equations are needed. Hence our approach facilitates the de-trending, seasonal adjustment and trend-cycle decomposition of a time series based on a periodic model.

In the empirical illustration, we apply the periodic model to a long monthly time series of postwar U.S. unemployment (January 1948 until December 2006). The unemployment series is chosen as it is a key variable in economics. More importantly, it is also subject to seasonal variation, since labour supply and labour demand in important sectors of the economy depend on seasonal factors such as school calendars, summer work, Easter/Christmas shopping, weather and winter breaks. In the economic literature, unemployment dynamics are often found to be periodic, both theoretically and empirically; see for example Osborn and Smith (1989), Ooms and Franses (1997), Krane and Wascher (1999), Matas-Mir and Osborn (2004), and Van Dijk et al. (2003). Furthermore, it is well known that unemployment data are highly dependent on business cycle dynamics and therefore cycle components also need to be considered in the analysis. Most importantly for this chapter, it is argued that unemployment is also subject to significant periodic serial correlation, in particular in the cyclical component. We present a comprehensive treatment of a general class of PUC models that include trend, seasonal and stationary cyclical components, and apply it to model the U.S. unemployment dataset. We discuss and implement many aspects of time series analysis for this class of PUC models, including seasonal adjustment, trend-cycle decomposition, diagnostic checking of prediction errors and forecasting.

Periodic time series models were introduced as early as 1955 in the article of Hannan (1955) and found widespread interest in geophysical and environmental empirical studies. Periodic dynamic regression models for economic time series have been applied since the 1930s; Mendershausen (1937) gave an early overview. Osborn and Smith (1989) introduced the periodic time series framework in dynamic macroeconomic models, while Ghysels and Osborn (2001) and Franses and Paap (2004) discussed a wider spectrum of periodic models and applications in econometrics. In the context of autoregressive models, Boswijk and Franses (1996) derived tests for periodic stochastic non-stationarity and Burridge and Taylor (2004) developed simulation-based seasonal unit root tests in the presence of periodic heteroskedasticity. Exact maximum likelihood estimation methods for periodic ARMA models have been discussed by Jimenez et al. (1989) and Lund and Basawa (2000). Anderson and Meerschaert (2005) provided asymptotic theory for efficient moment-based estimation. Most of these studies have explored periodic versions of ARMA models.

Earlier periodic extensions of the UC models have been explored by Krane and Wascher (1999), Koopman and Ooms (2002, 2006), Proietti (2004), Bell (2004) and Penzer and Tripodis (2007). These authors considered straightforward applications of the vector representation of PUC models, or they considered only specific parameters to be periodic. More specifically, Proietti (2004) considered a UC model with the trend component modelled as a weighted average of separate independent random walks for each season, Penzer and Tripodis (2007) considered a seasonal component with a periodic variance, while Koopman and Ooms (2002, 2006) explored different periodic specifications of the UC model using standard available software. In the context of ARMA-based UC models, Krane and Wascher (1999) developed a multivariate periodic UC model involving a common cyclical component with a seasonally varying effect on quarterly U.S. employment growth. In the context of RegComponent UC models, Bell (2004) considered the effect of seasonal heteroskedasticity in the irregular component on seasonal adjustment; see also Findley (2005). Different types of periodicity in a UC model imply different optimal seasonal adjustment filters. It is therefore interesting to investigate how we can identify different types of periodicity, both in theory and in practice, also for seasonal adjustment.

The remaining part of the chapter is organised as follows. The next section reviews the UC model. The third section introduces PUC models and in particular the novel periodic version of the stochastic cycle model. We also discuss possible implications for seasonal adjustment. The fourth section presents an analysis of monthly U.S. unemployment series using both periodic and non-periodic UC models, where we reveal significant periodicity in the cycle component. We also take account of the cycle variance moderation in postwar U.S. unemployment in the different models. The fifth section concludes.

2.2 Unobserved components models

In this section we introduce UC time series models. The established notation following Harvey (1989) and Durbin and Koopman (2001) will be used throughout the chapter. The univariate UC time series model that is particularly suited for many economic data sets is given by

$$y_t = \mu_t + \gamma_t + \psi_t + \varepsilon_t, \qquad \varepsilon_t \sim \mathrm{NID}(0, \sigma_\varepsilon^2), \qquad t = 1, \ldots, n, \tag{2.1}$$

where $\mu_t$, $\gamma_t$, $\psi_t$ and $\varepsilon_t$ represent trend, seasonal, cyclical and irregular components, respectively. The trend and seasonal components are modelled by linear dynamic stochastic processes driven by random disturbances. The cycle is based on a stochastic trigonometric function that relies on a damping factor, frequency and random disturbances.


The simplest form of a UC model, the so-called local level model, is obtained from equation (2.1) where $\gamma_t$ and $\psi_t$ are zero for all $t$. The trend can be specified as the random walk process

$$\mu_{t+1} = \mu_t + \eta_t, \qquad \eta_t \sim \mathrm{NID}(0, \sigma_\eta^2), \tag{2.2}$$

for $t = 1, \ldots, n$. By adding a slope term $\beta_t$, which is also generated by a random walk, to equation (2.2), we obtain the so-called local linear trend model,

$$\mu_{t+1} = \mu_t + \beta_t + \eta_t, \qquad \eta_t \sim \mathrm{NID}(0, \sigma_\eta^2), \qquad \beta_{t+1} = \beta_t + \zeta_t, \qquad \zeta_t \sim \mathrm{NID}(0, \sigma_\zeta^2), \tag{2.3}$$

for $t = 1, \ldots, n$, where the trend and slope disturbances, $\eta_t$ and $\zeta_t$, respectively, are mutually uncorrelated sequences from a Gaussian density with zero mean and variance $\sigma_\eta^2$ for $\eta_t$ and $\sigma_\zeta^2$ for $\zeta_t$. If $\sigma_\zeta^2$ is zero, we have $\zeta_t = 0$ and $\beta_{t+1} = \beta_t = \beta$ for all $t$. This implies a random walk plus drift process for the trend $\mu_t$. When $\sigma_\eta^2$ is zero as well, a deterministic linear trend is obtained for $\mu_t$. If only $\sigma_\eta^2$ is zero, then we have a so-called smooth trend model or an integrated random walk process. This implies that $\Delta\mu_t$ follows a random walk process, where $\Delta = 1 - L$ is the difference operator and $L$ is the lag operator with $L^p y_t = y_{t-p}$.
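These special cases can be checked by direct simulation. The following sketch (an illustrative example of our own, with a hypothetical function name) simulates the local linear trend (2.3) and verifies that $\sigma_\zeta^2 = 0$ yields a constant slope, while $\sigma_\eta^2 = 0$ yields the smooth trend whose first difference equals the random-walk slope:

```python
import numpy as np

def simulate_llt(n, sigma_eta, sigma_zeta, rng):
    """Simulate the local linear trend (2.3):
    mu_{t+1} = mu_t + beta_t + eta_t,  beta_{t+1} = beta_t + zeta_t."""
    mu = np.zeros(n + 1)
    beta = np.zeros(n + 1)
    for t in range(n):
        mu[t + 1] = mu[t] + beta[t] + sigma_eta * rng.standard_normal()
        beta[t + 1] = beta[t] + sigma_zeta * rng.standard_normal()
    return mu, beta

rng = np.random.default_rng(1)

# sigma_zeta = 0: the slope is constant, so mu is a random walk with drift
mu, beta = simulate_llt(100, sigma_eta=1.0, sigma_zeta=0.0, rng=rng)
assert np.allclose(beta, beta[0])

# sigma_eta = 0: smooth trend, so Delta mu_t = beta_{t-1} is itself a random walk
mu, beta = simulate_llt(100, sigma_eta=0.0, sigma_zeta=1.0, rng=rng)
assert np.allclose(np.diff(mu), beta[:-1])
```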

To take account of the seasonal variation in the time series $y_t$, the seasonal component $\gamma_t$ is included. A deterministic seasonal component should have the property that it sums to zero over the previous year to ensure that it is not confounded with the trend. Flexibility of the seasonal component is achieved when it is allowed to change over time. This can be established by adding a disturbance term (with mean zero) to the sum of the $S$ seasonal effects over the past year. In this way we obtain the stochastic dummy variable form of the seasonal component as given by

$$S_S(L)\gamma_{t+1} = \omega_t, \qquad \omega_t \sim \mathrm{NID}(0, \sigma_\omega^2), \tag{2.4}$$

where $S_S(L)$ is the summation operator defined as $S_S(L) = 1 + L + \cdots + L^{S-1}$. Other (trigonometric) models for $\gamma_t$ can be considered in this set-up, but they all share the assumption that $S_S(L)\gamma_t$ has a zero conditional expectation; see Section 5.2. Since economic time series are often subject to cyclical dynamics, a stochastic cycle may be included in the model and be specified as

$$\begin{pmatrix} \psi_{t+1} \\ \psi^*_{t+1} \end{pmatrix} = \rho \begin{pmatrix} \cos\lambda & \sin\lambda \\ -\sin\lambda & \cos\lambda \end{pmatrix} \begin{pmatrix} \psi_t \\ \psi^*_t \end{pmatrix} + \begin{pmatrix} \kappa_t \\ \kappa^*_t \end{pmatrix}, \qquad 0 < \rho < 1, \quad 0 < \lambda < 2\pi/S, \tag{2.5}$$

where the period of the cycle is given by $2\pi/\lambda$, with $\lambda < 2\pi/S$ to avoid confounding with the seasonal component. This dynamic process can be written as an ARMA(2,1) process with complex roots in the autoregressive polynomial. It therefore generates a cyclical pattern in the theoretical autocorrelation function of process (2.5); Harvey (1985) provides more details. Although the average length of the cycle is fixed, individual realisations of the cycle can show considerable variation in length and amplitude.
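The ARMA(2,1) reduced form can be made concrete with a short simulation (an illustrative sketch of our own, not from the thesis): applying the autoregressive polynomial $1 - 2\rho\cos\lambda\, L + \rho^2 L^2$ to a simulated $\psi_t$ should leave an MA(1) residual, whose sample autocorrelations beyond lag 1 are close to zero:

```python
import numpy as np

def simulate_cycle(n, rho, lam, sigma_kappa, rng):
    """Simulate the bivariate stochastic cycle (2.5); return psi_t."""
    c, s = np.cos(lam), np.sin(lam)
    T = rho * np.array([[c, s], [-s, c]])
    x = np.zeros(2)
    psi = np.empty(n)
    for t in range(n):
        psi[t] = x[0]
        x = T @ x + sigma_kappa * rng.standard_normal(2)
    return psi

rng = np.random.default_rng(2)
rho, lam = 0.95, 2 * np.pi / 60          # e.g. a 60-period (5-year, monthly) cycle
psi = simulate_cycle(50_000, rho, lam, 1.0, rng)

# apply the AR(2) polynomial: the result should be MA(1), i.e. its sample
# autocorrelations at lags 2 and 3 should be negligible
u = psi[2:] - 2 * rho * np.cos(lam) * psi[1:-1] + rho**2 * psi[:-2]
u = u - u.mean()
acf = [np.dot(u[:-k], u[k:]) / np.dot(u, u) for k in (1, 2, 3)]
assert abs(acf[1]) < 0.05 and abs(acf[2]) < 0.05
```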

2.3 Periodic unobserved components models

Univariate seasonal time series $y_t$ are considered with seasonal length $S$ ($S = 12$ for monthly data). Seasonal time series are often analysed by seasonal autoregressive moving average (SARMA) models and by the UC models of the previous section. The standard formulations of these models assume that all parameters are constant through time. In case the models are periodic, the parameters are allowed to vary with the season. As a result, the number of parameters increases by a multiple of $S$. This section develops a statistical periodic time series approach for univariate UC models with trend, season, cycle and irregular.

We start our analysis with the simplest periodic basic structural model (BSM), which contains only three components: trend, season and irregular. Considering the moments of this model, we show that the univariate time-varying-parameter BSM does not correspond to a multivariate constant-parameter BSM. We also introduce a novel periodic stochastic cycle component. Further, we derive the moments and show that exact ML estimation can be implemented without additional identifying restrictions.

2.3.1 Periodic basic structural time series model

Consider a univariate basic structural time series model (BSM) with periodic variances for the disturbances associated with the trend, seasonal and irregular components. This model can be expressed by

$$y_t = \mu_t + \gamma_t + \varepsilon_t, \qquad \varepsilon_t \sim \mathrm{NID}(0, \sigma_{\varepsilon,s}^2),$$
$$\mu_{t+1} = \mu_t + \eta_t, \qquad \eta_t \sim \mathrm{NID}(0, \sigma_{\eta,s}^2), \tag{2.6}$$
$$\gamma_{t+1} = -\sum_{j=0}^{S-2} \gamma_{t-j} + \omega_t, \qquad \omega_t \sim \mathrm{NID}(0, \sigma_{\omega,s}^2),$$

for $t = 1, \ldots, n$, $n = n^*S$, where $\sigma_{\varepsilon,s}^2$, $\sigma_{\eta,s}^2$ and $\sigma_{\omega,s}^2$ are the variances of $\varepsilon_t$, $\eta_t$ and $\omega_t$, respectively, for season $s = 1, \ldots, S$. To simplify notation for the multivariate representations of the model, we assume we have $n^*$ complete years of data, but this assumption is not essential for the subsequent statistical analysis.
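Model (2.6) is straightforward to simulate. The sketch below (our own illustration, with an assumed zero-based season index s = t mod S) shows how seasonal heteroskedasticity in the irregular disturbance propagates to the seasonal differences of $y_t$:

```python
import numpy as np

def simulate_periodic_bsm(n_years, S, sig_eps, sig_eta, sig_omega, rng):
    """Simulate the periodic BSM (2.6); sig_* are length-S arrays of
    seasonally varying disturbance standard deviations."""
    n = n_years * S
    mu = 0.0
    gam = np.zeros(S - 1)            # [gamma_t, gamma_{t-1}, ..., gamma_{t-S+2}]
    y = np.empty(n)
    for t in range(n):
        s = t % S                    # season of period t (0-based)
        y[t] = mu + gam[0] + sig_eps[s] * rng.standard_normal()
        gam_next = -gam.sum() + sig_omega[s] * rng.standard_normal()
        gam = np.concatenate(([gam_next], gam[:-1]))
        mu += sig_eta[s] * rng.standard_normal()
    return y

rng = np.random.default_rng(3)
S = 4
y = simulate_periodic_bsm(200, S,
                          sig_eps=np.array([0.2, 0.2, 0.2, 2.0]),  # noisy 4th season
                          sig_eta=np.full(S, 0.1),
                          sig_omega=np.full(S, 0.05), rng=rng)
# seasonal differences ending in the noisy season have much larger variance
d = y[S:] - y[:-S]
assert d[3::S].var() > d[0::S].var()
```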

Once model (2.6) is written as a multivariate process (each equation is for a particular season), it is shown below that the periodic BSM (2.6) does not reduce to a standard multivariate local level model for yearly observations. Denote $y_{s,t^*}$ as the observation for period $s$ and year $t^*$ such that $y_t \equiv y_{s,t^*}$, where $t = S(t^* - 1) + s$ for $t = 1, \ldots, n^*S$, $t^* = 1, \ldots, n^*$ and $s = 1, 2, \ldots, S$. For the case of $S = 2$ we have

$$y_{1,t^*} = \mu_{1,t^*} + \varepsilon_{1,t^*}, \qquad y_{2,t^*} = \mu_{2,t^*} + \varepsilon_{2,t^*}, \tag{2.7}$$

where

$$\mu_{1,t^*} = \mu_t + \gamma_t, \quad \mu_{2,t^*} = \mu_{t+1} + \gamma_{t+1}, \quad \varepsilon_{1,t^*} = \varepsilon_t, \quad \varepsilon_{2,t^*} = \varepsilon_{t+1}, \tag{2.8}$$

so that the trend becomes

$$\mu_{1,t^*+1} = \mu_{1,t^*} + \eta_{1,t^*}, \qquad \mu_{2,t^*+1} = \mu_{2,t^*} + \eta_{2,t^*}. \tag{2.9}$$

The trend disturbances in (2.9) include the seasonal disturbances by construction, since it follows that

$$\eta_{1,t^*} = \eta_t + \eta_{t+1} - \omega_t + \omega_{t+1}, \qquad \eta_{2,t^*} = \eta_{t+1} + \eta_{t+2} - \omega_{t+1} + \omega_{t+2}. \tag{2.10}$$

In matrix form, the above model can be written as

$$y^*_{t^*} = \mu^*_{t^*} + \varepsilon^*_{t^*}, \tag{2.11}$$
$$\mu^*_{t^*+1} = \mu^*_{t^*} + \eta^*_{t^*}, \tag{2.12}$$

where we denote the vectors as $y^*_{t^*} = (y_{1,t^*}, y_{2,t^*})'$, $\mu^*_{t^*} = (\mu_{1,t^*}, \mu_{2,t^*})'$, $\varepsilon^*_{t^*} = (\varepsilon_{1,t^*}, \varepsilon_{2,t^*})'$ and $\eta^*_{t^*} = (\eta_{1,t^*}, \eta_{2,t^*})'$. The variance matrix of the total disturbance vector $(\varepsilon_{1,t^*}, \varepsilon_{2,t^*}, \eta_{1,t^*}, \eta_{2,t^*})'$ is given by

$$
\begin{pmatrix}
\sigma^2_{\varepsilon,1} & 0 & 0 & 0 \\
0 & \sigma^2_{\varepsilon,2} & 0 & 0 \\
0 & 0 & \sigma^2_{\eta,1} + \sigma^2_{\eta,2} + \sigma^2_{\omega,1} + \sigma^2_{\omega,2} & \sigma^2_{\eta,2} - \sigma^2_{\omega,2} \\
0 & 0 & \sigma^2_{\eta,2} - \sigma^2_{\omega,2} & \sigma^2_{\eta,1} + \sigma^2_{\eta,2} + \sigma^2_{\omega,1} + \sigma^2_{\omega,2}
\end{pmatrix}. \tag{2.13}
$$

Further,

$$
\mathrm{E}\left[\begin{pmatrix} \eta_{1,t^*+1} \\ \eta_{2,t^*+1} \end{pmatrix}\begin{pmatrix} \eta_{1,t^*} \\ \eta_{2,t^*} \end{pmatrix}'\right] = \begin{pmatrix} 0 & \sigma^2_{\eta,1} - \sigma^2_{\omega,1} \\ 0 & 0 \end{pmatrix} \tag{2.14}
$$

and

$$
\mathrm{E}\left[\begin{pmatrix} \eta_{1,t^*+j} \\ \eta_{2,t^*+j} \end{pmatrix}\begin{pmatrix} \eta_{1,t^*} \\ \eta_{2,t^*} \end{pmatrix}'\right] = 0_{2\times 2} \quad \text{for } j > 1. \tag{2.15}
$$

Since (2.14) is not a zero matrix, the level disturbance vector η∗t∗ follows a moving

average process. Therefore, we do not obtain a standard multivariate version of the

local level model with a serially independent sequence of $\eta^*_{t^*}$. Unfortunately, this means we cannot estimate this simple model using standard software for multivariate basic

structural models. This is a disadvantage compared to other PUC models, see the

discussion in Koopman and Ooms (2006).
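This moving average structure is easy to confirm numerically. The following sketch (Python, with hypothetical parameter values) simulates the yearly level disturbances directly from (2.10) and compares their sample moments with (2.13) and (2.14):

```python
import numpy as np

rng = np.random.default_rng(42)
n_years = 500_000
s_eta, s_omg = (0.3, 0.5), (0.2, 0.4)   # hypothetical (sigma_eta,1, sigma_eta,2), (sigma_omega,1, sigma_omega,2)

# season-1 and season-2 disturbances, one pair per year (plus one extra year)
e1 = rng.normal(0, s_eta[0], n_years + 1)   # eta_t for t in season 1
e2 = rng.normal(0, s_eta[1], n_years + 1)   # eta_{t+1}, season 2
w1 = rng.normal(0, s_omg[0], n_years + 1)   # omega_t, season 1
w2 = rng.normal(0, s_omg[1], n_years + 1)   # omega_{t+1}, season 2

# yearly level disturbances from (2.10); eta_{t+2}, omega_{t+2} are next year's season-1 draws
h1 = e1[:-1] + e2[:-1] - w1[:-1] + w2[:-1]
h2 = e2[:-1] + e1[1:] - w2[:-1] + w1[1:]

var_th = s_eta[0]**2 + s_eta[1]**2 + s_omg[0]**2 + s_omg[1]**2
print(np.var(h1) - var_th)                             # diagonal of (2.13), ~0
print(np.mean(h1 * h2) - (s_eta[1]**2 - s_omg[1]**2))  # off-diagonal of (2.13), ~0
print(np.mean(h1[1:] * h2[:-1]) - (s_eta[0]**2 - s_omg[0]**2))  # (1,2) entry of (2.14), ~0
```

The nonzero lag-one cross moment in the last line is precisely what prevents $\eta^*_{t^*}$ from being a serially independent sequence.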


2.3. PERIODIC UNOBSERVED COMPONENTS MODELS

2.3.2 Convenient state space representation of periodic BSM

By focussing on the moments of yt implied by the periodic BSM model in equation (2.6),

we have shown that univariate and multivariate representations of the periodic model

are not necessarily equivalent. Here we develop two convenient ways of placing a PUC

model into state space form: a univariate time-varying and a multivariate time-invariant

representation. Both formulations enable exact maximum likelihood estimation and

the estimation of the state vector by filtering and smoothing. The general state space

representation is summarised in Appendix 2.A, while the Kalman filter, the smoother and the likelihood evaluation are summarised in Appendices 2.B, 2.C and 2.D, respectively.

We derive the second order moments of the periodic BSM for S = 2 in Appendix 2.E

and for S = 3 in Appendix 2.F and we argue that not all parameters of PUC models

are automatically identified.

PUC models can be formulated via a univariate measurement equation (N = 1) and

time varying system matrices Tt, Ht, Zt and Gt for t = 1, . . . , n. Alternatively they can

be represented by a multivariate measurement equation (N = S) for y∗t∗ with constant

system matrices T , H, Z and G. Next we discuss these two convenient state space

representations for the periodic BSM (2.6), based on (2.35)-(2.36).

The state space matrices of the univariate time-varying parameter form of (2.6) for

N = S = 2 are given by

$$
T = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}, \qquad
H_t = \begin{pmatrix} 0 & \sigma_{\eta,t} & 0 \\ 0 & 0 & \sigma_{\omega,t} \end{pmatrix}, \tag{2.16}
$$

$$
Z = \begin{pmatrix} 1 & 1 \end{pmatrix}, \qquad
G_t = \begin{pmatrix} \sigma_{\varepsilon,t} & 0 & 0 \end{pmatrix}, \tag{2.17}
$$

where $\sigma_{\varepsilon,t}$, $\sigma_{\eta,t}$ and $\sigma_{\omega,t}$ vary deterministically according to the season. Note that the matrices $H_t$ and $G_t$ are time-varying, while $T$ and $Z$ are constant over time. The state vector is given by $\alpha_t = (\mu_t, \gamma_t)'$ and the disturbance vector is given by $\epsilon_t = (\varepsilon_t, \eta_t, \omega_t)'$. The initialisation of $\alpha_1$ is diffuse: $a$ is a vector of zeros and $P = \kappa I_2$ with $\kappa \to \infty$.

The multivariate time-invariant state space form of model (2.6) can be written as

$$\alpha^*_{t^*+1} = T^*\alpha^*_{t^*} + H^*\varepsilon^*_{t^*}, \qquad \alpha^*_1 \sim N(a^*, P^*), \qquad t^* = 1, \ldots, n^*, \tag{2.18}$$
$$y^*_{t^*} = Z^*\alpha^*_{t^*} + G^*\varepsilon^*_{t^*}, \qquad \varepsilon^*_{t^*} \sim NID(0, I). \tag{2.19}$$

To simplify notation we consider model (2.6) for $S = 2$, where $y^*_{t^*} = (y_t, y_{t+1})'$, $t = 1, 1+S, 1+2S, \ldots, 1+(n^*-1)S$. We derive convenient expressions for $\alpha^*_{t^*}$, $\varepsilon^*_{t^*}$, $T^*$, $H^*$, $Z^*$ and $G^*$ as follows. The measurement equations are given by

$$
\begin{aligned}
y_t &= \mu_t + \gamma_t + \varepsilon_t, & \varepsilon_t &\sim NID(0, \sigma^2_{\varepsilon,1}), \\
y_{t+1} &= \mu_{t+1} + \gamma_{t+1} + \varepsilon_{t+1} = \mu_t + \eta_t - \gamma_t + \omega_t + \varepsilon_{t+1}, & \varepsilon_{t+1} &\sim NID(0, \sigma^2_{\varepsilon,2}).
\end{aligned}
$$


Further, we take $\alpha^*_{t^*} = (\mu_t, \eta_t, \gamma_t, \omega_t)'$ as the state vector and it follows from the transition equations that

$$
\begin{aligned}
\mu_{t+2} &= \mu_{t+1} + \eta_{t+1} = \mu_t + \eta_t + \eta_{t+1}, & \eta_{t+1} &\sim NID(0, \sigma^2_{\eta,2}), \\
\gamma_{t+2} &= -\gamma_{t+1} + \omega_{t+1} = \gamma_t - \omega_t + \omega_{t+1}, & \omega_{t+1} &\sim NID(0, \sigma^2_{\omega,2}).
\end{aligned}
$$

The state space matrices are then given by

$$
T^* = \begin{pmatrix} 1 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 1 & -1 \\ 0 & 0 & 0 & 0 \end{pmatrix}, \qquad
H^* = \begin{pmatrix} 0 & 0 & \sigma_{\eta,2} & 0 & 0 & 0 \\ 0 & 0 & 0 & \sigma_{\eta,1} & 0 & 0 \\ 0 & 0 & 0 & 0 & \sigma_{\omega,2} & 0 \\ 0 & 0 & 0 & 0 & 0 & \sigma_{\omega,1} \end{pmatrix}, \tag{2.20}
$$

$$
Z^* = \begin{pmatrix} 1 & 0 & 1 & 0 \\ 1 & 1 & -1 & 1 \end{pmatrix}, \qquad
G^* = \begin{pmatrix} \sigma_{\varepsilon,1} & 0 & 0 & 0 & 0 & 0 \\ 0 & \sigma_{\varepsilon,2} & 0 & 0 & 0 & 0 \end{pmatrix}, \tag{2.21}
$$

with vector $\varepsilon^*_{t^*} = (\varepsilon_t, \varepsilon_{t+1}, \eta_{t+1}, \eta_{t+2}, \omega_{t+1}, \omega_{t+2})'$ (in standardised form) for $t^* = 1, 2, \ldots, n^*$ and $t = 1, 1+S, 1+2S, \ldots, 1+(n^*-1)S$, so that $\alpha^*_{t^*+1} = (\mu_{t+2}, \eta_{t+2}, \gamma_{t+2}, \omega_{t+2})'$. The initialisation of the state vector is diffuse for $\mu_1$ and $\gamma_1$, while $\eta_1$ and $\omega_1$ should be initialised by their variances: $a^*$ is a vector of zeros and $P^*$ in this case is given by the matrix

$$
P^* = \begin{pmatrix} \kappa & 0 & 0 & 0 \\ 0 & \sigma^2_{\eta,1} & 0 & 0 \\ 0 & 0 & \kappa & 0 \\ 0 & 0 & 0 & \sigma^2_{\omega,1} \end{pmatrix}, \qquad \text{with } \kappa \to \infty.
$$

Note that the multivariate time-invariant system is observationally equivalent to the

univariate time-varying system. In particular, the Gaussian likelihood of model (2.20)-

(2.21) is exactly equal to the likelihood of model (2.16)-(2.17). For both specifications,

it is clear that there is only one trend for the whole observed series. Although the

multivariate representation may suggest that we have a separate trend for each season,

the state vector α∗t∗ only has a single trend, µt, and a single seasonal component, γt.
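To illustrate, the system (2.20)-(2.21) can be coded directly and iterated; a minimal NumPy sketch with hypothetical standard deviations:

```python
import numpy as np

rng = np.random.default_rng(1)
se, sh, sw = (0.1, 0.2), (0.3, 0.5), (0.2, 0.4)  # hypothetical sigma_eps, sigma_eta, sigma_omega

T = np.array([[1., 1., 0., 0.],
              [0., 0., 0., 0.],
              [0., 0., 1., -1.],
              [0., 0., 0., 0.]])                 # T* in (2.20)
H = np.zeros((4, 6))
H[0, 2], H[1, 3], H[2, 4], H[3, 5] = sh[1], sh[0], sw[1], sw[0]   # H* in (2.20)
Z = np.array([[1., 0., 1., 0.],
              [1., 1., -1., 1.]])                # Z* in (2.21)
G = np.zeros((2, 6))
G[0, 0], G[1, 1] = se[0], se[1]                  # G* in (2.21)

# state (mu_t, eta_t, gamma_t, omega_t); mu_1 = gamma_1 = 0 for the sketch
alpha = np.array([0., rng.normal(0, sh[0]), 0., rng.normal(0, sw[0])])
ys = np.empty((100, 2))
for i in range(100):
    eps = rng.standard_normal(6)     # standardised disturbance vector eps*_{t*}
    ys[i] = Z @ alpha + G @ eps      # yearly observation pair (y_t, y_{t+1})
    alpha = T @ alpha + H @ eps      # move to (mu_{t+2}, eta_{t+2}, gamma_{t+2}, omega_{t+2})
```

Both rows of $Z^*$ load on the same $\mu_t$ and $\gamma_t$, so the two generated series share one trend and one seasonal component, as emphasised above.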

The periodic BSM can be extended by the inclusion of additional components (such

as a cycle) or by increasing the seasonal length in the univariate version of PUC models.

The state space model with time-varying system matrices provides a general framework

for this purpose. However, when including a periodic stochastic cyclical component as

in subsection 2.3.3 below, special attention must be given to its initialisation. For

the multivariate representation of the PUC model with a cycle, the initialisation issue

is somewhat tedious but manageable. Only some elementary calculations in linear

algebra are required. The merit of the multivariate specification is its dependence on

time-invariant system matrices. It allows the examination of the dynamic properties of

the time series in a straightforward manner. The merit of the univariate specification


is its straightforward initialisation treatment. The drawback is its dependence on time-

varying system matrices.

Finally, we focus on an identification problem related to periodic models with S = 2; see Appendix 2.E for details. In the case of biannual periodic models, the number of unknown parameters equals the number of moment equations. However, the rank condition is not satisfied. Fortunately, the reduced rank problem does not occur when S > 2. For the periodic BSM in equation (2.6), we have S(S + 1) linear equations to estimate 3S parameters. It is clear that for S ≥ 3 we have more non-zero moment equations than parameters. The order condition for identification is therefore satisfied.

In practice, only 3S unique equations exist in the system and all parameters can be

identified exactly from these, see Appendix 2.F for technical details with S = 3. We

conclude that the parameters are exactly identified in the two standard PUC models for

all S > 2. Extension of the periodic BSM with a periodic stochastic cycle component

as defined in the empirical section does not lead to further identification problems.

2.3.3 Periodic stochastic cycle model

In this section we introduce a novel periodic version of the stochastic cycle component

as part of the UC time series models. Macroeconomic time series often require, in addition to the trend, a cyclical component in their specification, which is usually interpreted as a business cycle. In multivariate UC models the cycle component can be identified as a common stationary factor, where the corresponding factor loadings model the cyclical sensitivity of each constituent series. This approach was taken by Krane and Wascher (1999) and Azevedo et al. (2006). To identify the cycle in a univariate time series, the variation of the component has to be restricted to frequencies within the business cycle range. Harvey (1985) implemented this idea for UC models via the stochastic cycle component. Krane and Wascher (1999) make their model periodic by allowing the cyclical sensitivity to be seasonally dependent. Here we consider the cycle model of Harvey (1985) in equation (2.5) and extend it by making $\rho$ and $\sigma^2_\kappa$ periodic. To save space we only present the equations for a model with two periodic components, cycle and irregular, and with S = 2. We discuss the extension with a seasonal component below. We define, for $t = 1, S+1, 2S+1, \ldots, (n^*-1)S+1$,

$$
\begin{aligned}
y_t &= \psi_t + \varepsilon_t, & \varepsilon_t &\sim NID(0, \sigma^2_{\varepsilon,1}), \\
y_{t+1} &= \psi_{t+1} + \varepsilon_{t+1}, & \varepsilon_{t+1} &\sim NID(0, \sigma^2_{\varepsilon,2}),
\end{aligned} \tag{2.22}
$$


where

$$
\begin{aligned}
\begin{pmatrix} \psi_{t+1} \\ \psi^+_{t+1} \end{pmatrix} &= \rho_1 \begin{pmatrix} \cos\lambda & \sin\lambda \\ -\sin\lambda & \cos\lambda \end{pmatrix} \begin{pmatrix} \psi_t \\ \psi^+_t \end{pmatrix} + \begin{pmatrix} \kappa_t \\ \kappa^+_t \end{pmatrix}, & \kappa_t, \kappa^+_t &\sim NID(0, \sigma^2_{\kappa,1}), \\
\begin{pmatrix} \psi_{t+2} \\ \psi^+_{t+2} \end{pmatrix} &= \rho_2 \begin{pmatrix} \cos\lambda & \sin\lambda \\ -\sin\lambda & \cos\lambda \end{pmatrix} \begin{pmatrix} \psi_{t+1} \\ \psi^+_{t+1} \end{pmatrix} + \begin{pmatrix} \kappa_{t+1} \\ \kappa^+_{t+1} \end{pmatrix}, & \kappa_{t+1}, \kappa^+_{t+1} &\sim NID(0, \sigma^2_{\kappa,2}),
\end{aligned} \tag{2.23}
$$

with mutually uncorrelated white noise disturbances $(\kappa_t, \kappa_{t+1})$ and $(\kappa^+_t, \kappa^+_{t+1})$. A restriction on the damping terms, $0 < \prod_{s=1}^S \rho_s < 1$, ensures that the stochastic process $\psi_t$ is stationary. The periodic damping terms allow the cyclical sensitivity of $y_t$ to differ from season to season, as in Krane and Wascher (1999).
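A minimal simulation of (2.23) (Python, hypothetical parameter values satisfying $\rho_1\rho_2 < 1$) illustrates the recursion: each year applies the $\rho_1$ rotation with season-1 noise, then the $\rho_2$ rotation with season-2 noise, and the season-1 values of $\psi_t$ settle to a finite variance:

```python
import numpy as np

rng = np.random.default_rng(7)
rho, lam, s_k = (0.95, 0.70), 0.3, (0.03, 0.06)   # hypothetical values
C = np.array([[np.cos(lam), np.sin(lam)],
              [-np.sin(lam), np.cos(lam)]])       # common rotation matrix

psi = np.zeros(2)          # (psi_t, psi+_t) at a season-1 time point
psi1 = []                  # collected season-1 values of psi_t
for year in range(100_000):
    psi = rho[0] * C @ psi + rng.normal(0, s_k[0], 2)   # t -> t+1, season-1 noise
    psi = rho[1] * C @ psi + rng.normal(0, s_k[1], 2)   # t+1 -> t+2, season-2 noise
    psi1.append(psi[0])

print(np.var(psi1[1000:]))   # finite: the process is stationary since rho1*rho2 < 1
```

The sample variance of the season-1 values approximates the quantity $\sigma^2_{\psi,1}$ derived in (2.25) below.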

The frequency of the stochastic cycle, λ, is common to all seasons s and ranges

between 0 and 2π/S. This implies that the average period of the cycle is 2π/λ (as

measured in semesters for S = 2, quarters for S = 4 and months for S = 12) while it is

equal to 2π/(λS) in years. By making λ periodic, the model becomes nonlinear, while we would like to keep the focus on linear state space models. Also, the interpretation of the cycle would be more difficult if each season had its own frequency. The main idea of the PUC model is that we have only a common trend, seasonal and cyclical component, and therefore we only need a common cycle frequency.

The dynamic properties of the stationary periodic cycle process are given by expressions for the variances and covariances of $\psi_t$ and $\psi_{t+1}$. Following the approach of Gladysev (1961) and Tiao and Grupe (1980), these are derived from the vector autoregressive representation of order 1, denoted by VAR(1). This yearly VAR(1) is obtained by substituting the expression for $(\psi_{t+1}, \psi^+_{t+1})'$ into the second equation of (2.23). We get

$$\Psi_{t^*+1} = \Phi \Psi_{t^*} + \kappa_{t^*}, \tag{2.24}$$

$$
\begin{pmatrix} \psi_{t+2} \\ \psi^+_{t+2} \\ \kappa_{t+2} \\ \kappa^+_{t+2} \end{pmatrix} =
\begin{pmatrix}
\rho_1\rho_2\cos 2\lambda & \rho_1\rho_2\sin 2\lambda & \rho_2\cos\lambda & \rho_2\sin\lambda \\
-\rho_1\rho_2\sin 2\lambda & \rho_1\rho_2\cos 2\lambda & -\rho_2\sin\lambda & \rho_2\cos\lambda \\
0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0
\end{pmatrix}
\begin{pmatrix} \psi_t \\ \psi^+_t \\ \kappa_t \\ \kappa^+_t \end{pmatrix} +
\begin{pmatrix} \kappa_{t+1} \\ \kappa^+_{t+1} \\ \kappa_{t+2} \\ \kappa^+_{t+2} \end{pmatrix},
$$

for $t^* = 1, 2, \ldots, n^*$ and $t = 1, S+1, 2S+1, \ldots, (n^*-1)S+1$. First we derive the variance-covariance matrix $\Lambda_0$ of $\Psi_{t^*}$:

$$\mathrm{E}[\Psi_{t^*+1}\Psi'_{t^*+1}] = \Phi\,\mathrm{E}[\Psi_{t^*}\Psi'_{t^*}]\Phi' + \Sigma_\kappa \;\Leftrightarrow\; \Lambda_0 = \Phi\Lambda_0\Phi' + \Sigma_\kappa,$$

where

$$\Sigma_\kappa = \mathrm{diag}\begin{pmatrix}\sigma^2_{\kappa,2} & \sigma^2_{\kappa,2} & \sigma^2_{\kappa,1} & \sigma^2_{\kappa,1}\end{pmatrix}$$


and

$$
\Lambda_0 = \begin{pmatrix}
\sigma^2_{\psi,1} & \mathrm{E}(\psi_t\psi^+_t) & 0 & 0 \\
\mathrm{E}(\psi_t\psi^+_t) & \sigma^2_{\psi,1} & 0 & 0 \\
0 & 0 & \sigma^2_{\kappa,1} & 0 \\
0 & 0 & 0 & \sigma^2_{\kappa,1}
\end{pmatrix},
$$

with

$$\mathrm{E}(\psi_t\psi^+_t) = 0, \qquad \sigma^2_{\psi,1} = \frac{\sigma^2_{\kappa,2} + \rho_2^2\,\sigma^2_{\kappa,1}}{1 - \rho_1^2\rho_2^2}, \tag{2.25}$$

$$\mathrm{E}(\psi_{t+1}\psi^+_{t+1}) = 0, \qquad \sigma^2_{\psi,2} = \frac{\sigma^2_{\kappa,1} + \rho_1^2\,\sigma^2_{\kappa,2}}{1 - \rho_1^2\rho_2^2}. \tag{2.26}$$

Subsequently, the periodic autocovariance function (ACVF) of $y_t$ for $S = 2$, $t = 1, S+1, 2S+1, \ldots$ and $t^* = 1, 2, \ldots$ is expressed in terms of $\sigma^2_{\psi,s}$, $\sigma^2_{\varepsilon,s}$, $\rho_s$ and $\lambda$:

$$
\Gamma_0 = \mathrm{E}\left[\begin{pmatrix} y_t \\ y_{t+1} \end{pmatrix}\begin{pmatrix} y_t \\ y_{t+1} \end{pmatrix}'\right] = \mathrm{E}\left[\begin{pmatrix} y_{1,t^*} \\ y_{2,t^*} \end{pmatrix}\begin{pmatrix} y_{1,t^*} \\ y_{2,t^*} \end{pmatrix}'\right] =
\begin{pmatrix}
\sigma^2_{\psi,1} + \sigma^2_{\varepsilon,1} & \rho_1\cos(\lambda)\sigma^2_{\psi,1} \\
\rho_1\cos(\lambda)\sigma^2_{\psi,1} & \sigma^2_{\psi,2} + \sigma^2_{\varepsilon,2}
\end{pmatrix}, \tag{2.27}
$$

$$
\Gamma_1 = \mathrm{E}\left[\begin{pmatrix} y_t \\ y_{t+1} \end{pmatrix}\begin{pmatrix} y_{t-2} \\ y_{t-1} \end{pmatrix}'\right] = \mathrm{E}\left[\begin{pmatrix} y_{1,t^*} \\ y_{2,t^*} \end{pmatrix}\begin{pmatrix} y_{1,t^*-1} \\ y_{2,t^*-1} \end{pmatrix}'\right] =
\begin{pmatrix}
\rho_1\rho_2\cos(2\lambda)\sigma^2_{\psi,1} & \rho_2\cos(\lambda)\sigma^2_{\psi,2} \\
\rho_1^2\rho_2\cos(3\lambda)\sigma^2_{\psi,1} & \rho_1\rho_2\cos(2\lambda)\sigma^2_{\psi,2}
\end{pmatrix}, \tag{2.28}
$$

$$\vdots$$

$$
\Gamma_j = \mathrm{E}\left[\begin{pmatrix} y_t \\ y_{t+1} \end{pmatrix}\begin{pmatrix} y_{t-2j} \\ y_{t+1-2j} \end{pmatrix}'\right] = \mathrm{E}\left[\begin{pmatrix} y_{1,t^*} \\ y_{2,t^*} \end{pmatrix}\begin{pmatrix} y_{1,t^*-j} \\ y_{2,t^*-j} \end{pmatrix}'\right] =
\begin{pmatrix}
\rho_1^j\rho_2^j\cos(2j\lambda)\sigma^2_{\psi,1} & \rho_1^{j-1}\rho_2^j\cos((2j-1)\lambda)\sigma^2_{\psi,2} \\
\rho_1^{j+1}\rho_2^j\cos((2j+1)\lambda)\sigma^2_{\psi,1} & \rho_1^j\rho_2^j\cos(2j\lambda)\sigma^2_{\psi,2}
\end{pmatrix}, \tag{2.29}
$$

for $j = 1, 2, \ldots$. This means that the ACVF of $y_t$ in the second period of a year at a lag of $2j+k$ semesters, with $k = 0, 1$ and $j = 0, 1, 2, \ldots$, is given by $\rho_1^{j+1}\rho_2^j\cos((2j+k)\lambda)\sigma^2_{\psi,1}$; see the bottom-left element of the last right-hand-side matrix. The two complex roots of the AR part of the periodic stochastic cycle model are restricted to reflect business cycle behaviour, but the autocovariance function also shows an extra seasonal pattern that interacts with these cycles if $\rho_1$ and $\rho_2$ differ sufficiently.

The autocovariance function of $y_t$ has a nonlinear structure in terms of $\sigma_{\psi,s}$, $\sigma_{\varepsilon,s}$, $\rho_s$ and $\lambda$ with $s = 1, 2$. Since $\sigma_{\psi,s}$ also depends on $\sigma_{\kappa,s}$ and $\rho_s$, the structure of the ACVF becomes intricate. As a result, identification cannot be analysed analytically and we


therefore carry out some Monte Carlo experiments for Maximum Likelihood estimation

to investigate whether the parameters can be estimated from simulated data. Model

(2.22) is already in a standard time-varying state space form, which makes it very

suitable for exact ML estimation using the prediction error decomposition, as in Durbin

and Koopman (2001, Chapter 7). However, the construction of the exact likelihood

function of models with a periodic cyclical component involves one non-standard step. The expression for $\sigma^2_{\psi,1}$ is needed for the exact initialisation of the likelihood for the first observations, since $(\psi_1, \psi^+_1)$ is included in the corresponding state vector. The other terms in the likelihood for $t = 2, \ldots$ follow in a standard way from the time-varying Kalman filter equations as in Durbin and Koopman (2001, Chapter 7). The multivariate representation (2.24) is only used in the derivation of the variance of the initial state.

The extension of the periodic cycle model to general $S = 2, 3, \ldots$ is straightforward. The $\Phi$ matrix in the multivariate VAR(1) form (2.24) becomes

$$
\begin{pmatrix}
\prod_{s=1}^S \rho_s\, C(S\lambda) & \prod_{s=2}^S \rho_s\, C((S-1)\lambda) & \cdots & \rho_S\, C(\lambda) \\
\multicolumn{3}{c}{O_{2(S-1)\times 2S}}
\end{pmatrix}, \tag{2.30}
$$

where

$$C(s\lambda) = \begin{pmatrix} \cos s\lambda & \sin s\lambda \\ -\sin s\lambda & \cos s\lambda \end{pmatrix},$$

for $s = 1, \ldots, S$, and the general solution for $\sigma^2_{\psi,1}$ in (2.25) becomes

$$
\sigma^2_{\psi,1} = \frac{\sum_{s=1}^{S-1}\left\{\sigma^2_{\kappa,s}\prod_{j=s+1}^S \rho_j^2\right\} + \sigma^2_{\kappa,S}}{1 - \prod_{s=1}^S \rho_s^2}. \tag{2.31}
$$
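A small sketch of (2.31), which also confirms that it collapses to (2.25) for S = 2:

```python
import numpy as np

def sigma2_psi1(rho, s_kappa):
    """Initial cycle variance (2.31) for S seasons; rho and s_kappa are length-S sequences."""
    S = len(rho)
    num = sum(s_kappa[s]**2 * np.prod([rho[j]**2 for j in range(s + 1, S)])
              for s in range(S - 1)) + s_kappa[-1]**2
    return num / (1.0 - np.prod([r**2 for r in rho]))

# S = 2: should match the closed form (2.25)
rho, s_k = (0.95, 0.70), (0.03, 0.06)
direct = (s_k[1]**2 + rho[1]**2 * s_k[0]**2) / (1 - rho[0]**2 * rho[1]**2)
print(sigma2_psi1(rho, s_k) - direct)   # ~0
```

The same function handles, say, quarterly (S = 4) or monthly (S = 12) parameter vectors without change.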

The degree of non-stationarity or smoothness of the model based trends and cycles

in (2.1) is sometimes viewed as too restrictive, see Gomez (2001) and Azevedo et al.

(2006). A similar remark could be made regarding our PUC model. Higher order

extensions for stochastic trends and cycles in the UC model were introduced by Harvey

and Trimbur (2003) to allow for nearly ideal band pass filter properties in model-based

component estimation. In the time domain, higher order trends and cycles lead to

smoother estimates of cycle and trend. However, a periodic analysis of the higher order

models falls outside the scope of this chapter.

In larger models combining the periodic stochastic cycle component with periodic

stochastic trends and seasonals, we have to make sure we exclude extreme cases where

the parameters ρs are nearly one and λ is very close to 2π/S, to avoid confusion with

the seasonal component (2.4). We also need S larger than 2 to identify the periodic

variances of the trend component (2.3) and seasonal component (2.4). We did not

encounter such identification problems in our empirical application of Section 2.4.


2.3.4 Simulation study on the periodic cycle component

Monte-Carlo experiments to study the finite sample behaviour of the exact ML estima-

tor are implemented for different values of the parameters σκ,1, σκ,2, ρ1, ρ2 and λ. We

look at the distributions of the exact ML estimators of these parameters by running the experiments 10,000 times with 100 observations (equal to 50 years for S = 2). A smaller

number of observations is not sufficient to produce reliable estimates in the simulation

study. All computations and graphs in this chapter are made using recent versions of

SsfPack, see Koopman et al. (1999) and programming language Ox, see Doornik (2006).

Figure 2.1 presents the simulated densities for one representative set of parameter values, which are based on empirical estimates for semi-annual postwar data of log U.S. unemployment, see Section 2.4. The true parameters are given by σκ,1 = 0.03, σκ,2 = 0.06, ρ1 = 0.95, ρ2 = 0.70 and λ = 0.3. The simulated means are close to the

true values and the simulated densities are approximately normal. This confirms that

the stochastic periodic cycle component does not lead to identification problems for

parameter estimation.

[Figure 2.1 here: five panels with histograms and density estimates for $\hat\sigma_{\kappa,1}$, $\hat\sigma_{\kappa,2}$, $\hat\rho_1$, $\hat\rho_2$ and $\hat\lambda$.]

Figure 2.1: Simulated histograms and non-parametric density estimates of the exact ML parameter estimates in periodic cycle model (2.22) with true values σκ,1 = 0.03, σκ,2 = 0.06, ρ1 = 0.95, ρ2 = 0.70 and λ = 0.3. σε,1 and σε,2 are fixed at 0.0005 and 0.0001 and not estimated. Based on a Monte Carlo experiment of 10,000 replications with 100 observations.


The use of a PUC model generally entails a loss of estimation efficiency when the

data generating process (DGP) is not periodic. Even when the DGP is correctly mod-

elled by a PUC model, a misspecified non-periodic model can still yield estimates with

lower mean squared error due to the higher sampling error in the PUC specification.

This depends on both the sample size and the periodic variation in the true parameters.

We investigate the loss of efficiency by simulating from a PUC model where the true

values of the parameters ρs and σκ,s are close, comparing the root mean squared error

(RMSE) of the estimators of the correctly specified PUC model and the misspecified

restricted model. For completeness, we also compute the results of the estimators for

non-periodic true parameter values.

In the case of a PUC model, we define the overall RMSE of $\hat a$ by

$$\mathrm{RMSE}(\hat a) = \sqrt{\frac{1}{S}\sum_{s=1}^S \mathrm{MSE}(\hat a_s)}, \tag{2.32}$$

where $a$ represents a particular parameter (such as $\sigma_\kappa$, $\rho$ and $\lambda$), $\hat a$ is the maximum likelihood estimator of $a$ and $\hat a_s$ refers to the periodic estimator of $a$ for period $s$.
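As a concrete reading of (2.32), a minimal helper (the array layout is an assumption of the sketch):

```python
import numpy as np

def overall_rmse(estimates, truth):
    """Overall RMSE (2.32): `estimates` has shape (replications, S), one column
    per season; `truth` holds the true parameter value for each season."""
    mse_s = np.mean((np.asarray(estimates) - np.asarray(truth))**2, axis=0)
    return float(np.sqrt(mse_s.mean()))

# toy check: per-season errors of +/-1 give an overall RMSE of 1
print(overall_rmse([[1.0, 2.0], [3.0, 4.0]], [2.0, 3.0]))   # 1.0
```

For a non-periodic (restricted) estimator the same formula applies with the single estimate repeated across the S columns.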

Panel A of Table 2.1 shows the RMSEs of the estimates for four pairs of true values

for the parameters ρ1 and ρ2 while the other parameters, σκ,1, σκ,2 and λ, are fixed as

in the Monte Carlo study of Figure 2.1. We repeat the Monte-Carlo experiment for

different values of σκ,s, s = 1, 2, with ρ1, ρ2, and λ fixed. Panel B of Table 2.1 shows

the corresponding RMSEs of the restricted and unrestricted estimators.

The restricted non-periodic parameter estimates of ρs and σκ,s are more accurate when the parameters of the model are close to non-periodic, ρ1 = 0.95 and ρ2 = 0.9 or σκ,1 = 0.055 and σκ,2 = 0.06, and vice versa when ρ1 = 0.95 and ρ2 = 0.70 or σκ,1 = 0.04 and σκ,2 = 0.06. We also report the precision of the model based estimators of the

(periodic) first-order autocorrelation $\tau(1)_s$, derived from (2.27), e.g.

$$\tau(1)_1 = \rho_1\cos(\lambda)\,\sigma^2_{\psi,1} \Big/ \left((\sigma^2_{\psi,1} + \sigma^2_{\varepsilon,1})(\sigma^2_{\psi,2} + \sigma^2_{\varepsilon,2})\right)^{1/2}.$$

The periodic estimator of τ(1) generally outperforms the restricted estimator in Panel A.
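For reference, $\tau(1)_1$ follows directly from (2.25)-(2.27); a sketch using the true parameter values of the Figure 2.1 experiment:

```python
import numpy as np

rho, lam = (0.95, 0.70), 0.3
s_k, s_e = (0.03, 0.06), (0.0005, 0.0001)

d = 1 - rho[0]**2 * rho[1]**2
s2p1 = (s_k[1]**2 + rho[1]**2 * s_k[0]**2) / d    # sigma^2_psi,1 from (2.25)
s2p2 = (s_k[0]**2 + rho[0]**2 * s_k[1]**2) / d    # sigma^2_psi,2 from (2.26)
tau11 = rho[0] * np.cos(lam) * s2p1 / np.sqrt((s2p1 + s_e[0]**2) * (s2p2 + s_e[1]**2))
print(round(tau11, 3))
```

With the tiny irregular variances of this design, the first-order autocorrelation is close to $\rho_1\cos(\lambda)$ scaled by the ratio of the two seasonal cycle variances.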

In each replication, we also test for periodicity using an LR test with nominal 5%

significance. The empirical rejection frequency is close to 5% for the nonperiodic DGP.

In Panel A, when the RMSE of the unrestricted estimator of ρs is just larger than

the RMSE of the restricted estimator for ρ1 = 0.95, ρ2 = 0.9, the LR test rejects the

constancy of ρ in 9% of the cases. For ρ1 = 0.95, ρ2 = 0.8, where both estimators

yield nearly the same RMSE, the LR test rejects the non-periodicity in 26% of the

replications. A similar pattern emerges in Panel B for σκ,s. The unrestricted estimator

outperforms the restricted estimator when |σκ,1 − σκ,2| is large enough, see the column


for (σκ,1 = 0.04, σκ,2 = 0.06). In this case the LR-test rejects the null of non-periodic

σκ,s’s in 54% of the replications.

This small Monte Carlo study shows that unrestricted periodic estimators are more

precise for realistic periodic parameter values of the DGP and a relevant number of

observations. Moreover, tests for nonperiodicity have substantial power in these cases.

2.3.5 Periodic stochastic cycles and seasonal adjustment

Consider a model combining a seasonal component γt as in (2.4) and a periodic stochas-

tic cycle component ψt as in (2.23). One might wonder what the seasonal component

γt represents when the ACVF of the cycle component ψt shows seasonal patterns too.

We agree with Bell and Hillmer (1984, p.292) that “a seasonal adjustment be consis-

tent with the information about seasonality present in the data being adjusted” and

with Canova and Ghysels (1994, p.1169) that “cataloguing business cycle facts with

seasonally adjusted data is improper unless the seasonal adjustment takes into account

the particular form of interactions existing among the components of the series”. On

the other hand, we also want to address the issue that Cecchetti et al. (1997, p.891)

described as “the basic issue of whether the interaction term should be treated as sea-

sonal or cyclical, and, at a more fundamental level, whether seasonal adjustment makes

sense at all, when seasonals and cycles do not neatly decompose”.

The essential identifying restriction for the seasonal component γt lies in its conditional expectation of zero over a year. The seasonality implied by the cycle ψt moves up and down with the business cycle; it is not restricted to sum to zero over a year and will not be picked up by estimates of γt if the model is correctly specified. Estimates of

the seasonal component γt in an approximately correctly specified model do make sense

and estimates can be used for seasonal adjustment, even if there is periodic interaction

between seasonal and cyclical movements in the data.

The seasonal adjustment method associated with our model is linear and straight-

forward to analyse. Standard theory of filtering and smoothing in state-space models

provides algorithms to compute the minimum mean squared error estimator of the sea-

sonal component. If the periodicity in the cycle is neglected, the model is misspecified

and the annual changes in the estimated component γt reflect annual changes in the

business cycle which make the seasonal estimates more difficult to interpret. Ooms

and Franses (1997) showed the significance of this ‘problematic’ interaction for sev-

eral multiplicative non-periodic seasonal adjustment methods for U.S. unemployment.

Application of the periodic stochastic cycle mitigates this problem considerably. The

reduction in the variance of the seasonal component is in line with the maintained

criterion of gradual year-on-year changes in seasonal adjustment factors implemented

in automatic decomposition methods developed by the U.S. Census since ‘Method II’.


Table 2.1: Simulation results of restricted and unrestricted ML estimators for DGP (2.22)-(2.23)

Panel A: (ρ1, ρ2)            (0.95, 0.70)    (0.95, 0.80)    (0.95, 0.90)    (0.95, 0.95)
Model                        PUC     UC      PUC     UC      PUC     UC      PUC     UC
RMSE(σκ)                     .0070   .0071   .0070   .0070   .0070   .0069   .0069   .0068
RMSE(ρ)                      .0931   .1340   .0805   .0841   .0662   .0407   .0590   .0321
RMSE(λ)                      .0869   .0882   .0603   .0605   .0401   .0400   .0314   .0313
RMSE(τ(1))                   .0621   .0682   .0540   .0591   .1109   .1434   .3158   .3554
LR test rejects ρ1 = ρ2          47%             26%              9%              6%

Panel B: (σκ,1, σκ,2)        (.04, .06)      (.05, .06)      (.055, .06)     (.06, .06)
Model                        PUC     UC      PUC     UC      PUC     UC      PUC     UC
RMSE(σκ)                     .0074   .0111   .0076   .0071   .0078   .0058   .0080   .0054
RMSE(ρ)                      .0928   .0940   .0925   .0924   .0925   .0920   .0927   .0920
RMSE(λ)                      .0945   .1033   .0988   .1010   .1002   .1002   .1010   .1002
RMSE(τ(1))                   .0716   .0838   .0708   .0679   .0704   .0637   .0700   .0625
LR test rejects σκ,1 = σκ,2      54%             18%              8%              6%

Notes: We report the RMSEs of unrestricted and restricted ML estimators (PUC and UC) for eight DGPs (2.22)-(2.23) with different values for (ρ1, ρ2) and (σκ,1, σκ,2). The restricted specification UC is correct for ρ1 = ρ2 = 0.95 (Panel A) and σκ,1 = σκ,2 = 0.06 (Panel B); UC is incorrect and PUC correct for the other DGPs. The remaining parameters are σκ,1 = 0.03, σκ,2 = 0.06 (Panel A), ρ1 = 0.95, ρ2 = 0.70 (Panel B), and λ = 0.3. σε,1 and σε,2 are fixed at 0.0005 and 0.0001 and not estimated. The number of replications is 10,000 and n = 100. Bold numbers indicate that the unrestricted estimator is less accurate. The last row of each panel reports rejection frequencies of LR tests of ρ1 = ρ2 and σκ,1 = σκ,2, respectively.


Shiskin and Eisenpress (1957) presented an early discussion of the Census Method II

focusing on this topic. Ladiray and Quenneville (2001) gave an insightful overview of

the influential version X-11 of the Census method. The reduction in the variation of

the seasonal component is also crucial for the canonical seasonal component estimate

developed by Hillmer and Tiao (1982) which “follows as stable a pattern as possible

and is as predictable as possible”, see Bell and Hillmer (1984, p.305). The popular

seasonal adjustment programs TRAMO and SEATS of Gomez and Maravall (1996) use

canonical seasonal adjustment. An adapted version of TRAMO/SEATS for the Census

X12-ARIMA program version 0.3 is described in Census (2007).

As seasonal adjustment is intended to leave all cyclical fluctuations in the data,

we can recommend the use of a PUC model for this purpose. The recognition of

the difference between evolutionary and cyclical changes in the seasonal pattern exists

since the beginning of econometrics. The analysis of cyclical variations in the seasonal

pattern of economic time series, employment and business failures in particular, dates back to the 1920s and 1930s, see Wisniewski (1934, p.176), who observed that “only

the following students have worked on it: Akerman of Sweden, Gjermoe of Norway

and Kuznets of the United States”, referring to Gjermoe (1931) and Kuznets (1933).

Mendershausen (1937) provided a timely critical literature review and stressed the

importance of making a distinction between cyclical factors and evolutionary tendencies

in seasonal indices. Recently, Van Dijk et al. (2003) coined the terms ‘Gjermoe-type’ for

cyclical changes and ‘Kuznets-type’ for evolutionary movements in the seasonal indices

to honour the pioneers in the study of these phenomena.

In an influential study, Barsky and Miron (1989) revived the economic interest

in analysing common features in seasonal and cyclical dynamics by analysing a wide

range of seasonally unadjusted U.S. data. Beaulieu and Miron (1993), Canova and

Ghysels (1994) and Canova and Hansen (1995) studied the gradual and cyclical changes

in seasonal patterns in aggregate U.S. macroeconomic data using parameter stability

tests in seasonal autoregressive models. Ghysels (1994) developed a periodic Markov

regime-switching model to study seasonal dependence of turning points in historical

U.S. business cycles. Cecchetti et al. (1997) provided intuitive economic explanations

of the interaction of seasonal cycles and business cycles and illustrated the empirical

importance of seasonal periodicity in the U.S. business cycle using industry level data on

production and inventories. Krane and Wascher (1999) provided examples of seasonally

varying cyclical changes in U.S. employment and showed that although the periodicity

in the correlation of employment with the business cycle component is not as clear

on a macroeconomic level as on an industry level, this periodicity is by no means

negligible. Van Dijk et al. (2003) and Matas-Mir and Osborn (2004) used smooth

transition autoregressive (STAR) models and threshold autoregressive (TAR) models to


study the interaction of seasonal and cyclical changes in international industrial output

index series. These studies confirmed that cyclical changes in the seasonal pattern are

important for a wide range of series, more so for monthly than for quarterly data and

more so at the industry level than at the aggregate macro level. Evolutionary changes

in the seasonal indices are often more significant than cyclical changes, but empirical

models for periodic economic time series should capture both aspects.

Although our model can be used to estimate changing seasonal factors for periodic

time series, a full implementation for automatic seasonal adjustment would require

additional features: automatic model selection, indirect seasonal adjustment (based on

initial adjustment of sub-categories), benchmarking seasonally adjusted series to annual

totals, calendar effects and outliers, see Census (2007) for a recent description of these

features in the widely used Census X-12-ARIMA seasonal adjustment procedure. These

extensions might be added to our model in the future.

2.3.6 Parameter estimation and signal extraction

In the empirical applications below we adopt the PUC model given by

yt = µt + γt + ψt + εt, t = 1, . . . , n, (2.33)

where µt and γt are defined in (2.3) and (2.4) with disturbance variances σ²η,s, σ²ζ,s and σ²ω,s, where ψt is defined in (2.23), where εt has variance σ²ε,s, and where all disturbances are independent for s = 1, . . . , S and n = n∗S. Periodic trigonometric seasonality

instead of dummy variable seasonality can be specified in this framework by a set of

periodic stochastic cycles with fixed seasonal frequencies λj = 2πj/S, j = 1, . . . , S/2,

where the corresponding damping factors ρj,s can be fixed at one to allow for non-

stationary seasonality.
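To make the trigonometric specification concrete, the following small sketch (purely illustrative, plain Python) lists the seasonal frequencies λj = 2πj/S for monthly data together with the period S/j of the stochastic cycle each frequency generates:

```python
import math

S = 12  # number of seasons (monthly data)

# Seasonal frequencies lambda_j = 2*pi*j/S, j = 1, ..., S/2, and the
# period (in months) of the stochastic cycle at each frequency.
for j in range(1, S // 2 + 1):
    lam = 2 * math.pi * j / S
    print(f"j={j}: lambda={lam:.3f}, period={2 * math.pi / lam:.1f} months")
```

The printed periods are 12, 6, 4, 3, 2.4 and 2 months, the familiar seasonal harmonics.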

The parameters of the PUC models are estimated by Gaussian maximum likelihood.

The exact likelihood of the model is efficiently obtained by the Kalman filter based on

the prediction error decomposition of the Gaussian model. We use diffuse initialisa-

tions for non-stationary state elements to obtain the exact likelihood, see Durbin and

Koopman (2001). Further, we employ the Broyden-Fletcher-Goldfarb-Shanno (BFGS)

numerical optimisation algorithm to maximise the log-likelihood with respect to the unknown parameters,

see Fletcher (1987). The number of parameters in periodic models is large for a high

value of S. However, this is not a problem in practice if the time series is sufficiently

long, as we illustrate below in subsection 2.4.1.
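As an illustration of the prediction error decomposition, the following minimal sketch (our own toy code, not the estimation routine used in this chapter) evaluates the exact Gaussian log-likelihood of a non-periodic local level model with the Kalman filter; in the applications this is done for the full PUC state space form, with a properly diffuse initialisation, and maximised with BFGS rather than the grid search shown here:

```python
import math

def local_level_loglik(y, var_eta, var_eps, a0=0.0, p0=1e7):
    """Gaussian log-likelihood of the local level model
       y_t = mu_t + eps_t,   mu_{t+1} = mu_t + eta_t,
    via the Kalman filter and the prediction error decomposition.
    The large p0 crudely approximates a diffuse initialisation."""
    a, p = a0, p0
    loglik = 0.0
    for yt in y:
        v = yt - a                  # one-step-ahead prediction error
        f = p + var_eps             # prediction error variance
        loglik += -0.5 * (math.log(2.0 * math.pi * f) + v * v / f)
        k = p / f                   # Kalman gain
        a += k * v                  # filtered state
        p = p * (1.0 - k) + var_eta
    return loglik

# A crude grid search stands in for the BFGS step.
y = [1.0, 1.2, 0.9, 1.4, 1.1, 1.3, 1.0, 1.5, 1.2, 1.4]
grid = [0.02 * i for i in range(1, 40)]
best = max((local_level_loglik(ve_, ve, vs), ve, vs)
           if False else (local_level_loglik(y, ve, vs), ve, vs)
           for ve in grid for vs in grid)
print("loglik %.3f at var_eta=%.2f var_eps=%.2f" % best)
```

The same decomposition generalises directly to periodic variances: the filter simply uses the variance belonging to the season of each observation.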

Conditional on the ML estimates of the hyperparameters, ση,s, σζ,s, σω,s, ρs, σκ,s and

λ, we compute the decomposition into trend, seasonal, cycle and irregular components

as

µ̂t = E(µt | y1, . . . , yn), γ̂t = E(γt | y1, . . . , yn), ψ̂t = E(ψt | y1, . . . , yn),

and

ε̂t = E(εt | y1, . . . , yn),

respectively. The seasonally adjusted series is defined as yt − γ̂t. The diagnostics are

based on the one-step-ahead forecast errors

vt = yt − E(yt | y1, . . . , yt−1)

that are scaled by their standard errors

s.e.(vt) = [E(v²t | y1, . . . , yt−1)]^{1/2}.

The optimal linear out-of-sample forecasts are computed as

ŷn+i = E(yn+i | y1, . . . , yn), i = 1, 2, . . . .

We illustrate the effectiveness of estimation, signal extraction and forecasting in an

application to U.S. unemployment series in the following section.

2.4 Application to U.S. unemployment data

In this section we analyse a long monthly time series of U.S. unemployment data using

both periodic and non-periodic UC time series models. The data consists of aggregated

seasonally unadjusted monthly U.S. unemployment levels (in thousands of persons) for

age 16 years and over. The estimation sample, 1948.1-2005.12, has 696 observations. We

use 2006.1-2006.12 to evaluate forecasts. This series is published by the U.S. Bureau of

Labor Statistics (BLS), see http://www.bls.gov/webapps/legacy/cpsatab11.htm.

Figure 2.2 presents time series plots of log monthly unemployment, yt. The top

panel shows the data month by month, while the bottom panel presents the data in

multivariate form, year by year for each month. The monthly unemployment series

is slowly trending upwards and contains a pronounced cyclical pattern. Although the

plots of the series in the bottom panel do not contain seasonal movements, this does not

mean that the seasonal dependence has disappeared. The annual series are smoother

than the monthly series and clearly have common dynamics. Figure 2.2 also shows

a non-stationary yearly slope of the long-term trend in unemployment, but it is not

immediately clear that the changes in the slope vary by month of the year.

Figure 2.3 shows a selection of the periodic sample autocorrelations of annual

changes in U.S. unemployment growth rates, ∆∆12yt. An accurate definition of pe-

riodic autocorrelations is given by McLeod (1994) and detailed in Appendix 2.G. These

sample autocorrelation coefficients are clearly periodic. They differ significantly from



Figure 2.2: U.S. unemployment levels for age 16 years and over (in logs, seasonally unad-

justed), 1948.1 − 2005.12. Top: Monthly time series, Bottom: Annual time series for each

month of the year, s = 1, . . . , 12.



Figure 2.3: Periodic autocorrelations (ACF) up to lag 120 months of ∆∆12yt (monthly

changes in the annual growth rates of U.S. unemployment, 1949.1− 2005.12). The top panels

show the periodic ACF for March and June while the bottom panels show the periodic ACF

for September and December.


month to month. For example, for March (top left panel), there is a pronounced short

cyclical movement of approximately 24 months, while we see a much longer cyclical

movement for June (top right panel). There is also a significant difference between

March and June for the seasonal autocorrelation at lag 12 and a gap between the first

order autocorrelations of June and December. The periodicity in the autocorrelation

structure is the main motivation for periodic modelling of log U.S. unemployment. The

autocorrelations indicate the need for a periodic cyclical component and a periodic

seasonal component in time series models.
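The precise definition of the periodic ACF follows McLeod (1994) and is given in Appendix 2.G. As a simplified, purely illustrative sketch (the per-season moments are computed from the paired observations only, rather than from the full per-season samples as in the formal definition), a periodic sample autocorrelation can be coded as:

```python
from statistics import mean

def periodic_acf(x, S, s, lag):
    """Simplified periodic sample autocorrelation: correlation between
    observations falling in season s (1-based, with t = 0 in season 1)
    and the observations `lag` steps earlier."""
    pairs = [(x[t], x[t - lag]) for t in range(lag, len(x))
             if t % S == s - 1]
    m1 = mean(a for a, _ in pairs)
    m2 = mean(b for _, b in pairs)
    num = sum((a - m1) * (b - m2) for a, b in pairs)
    den = (sum((a - m1) ** 2 for a, _ in pairs)
           * sum((b - m2) ** 2 for _, b in pairs)) ** 0.5
    return num / den

# Toy series: linear annual growth plus a fixed monthly pattern gives a
# periodic lag-12 autocorrelation of exactly one.
x = [t // 12 + (t % 12) * 10 for t in range(120)]
print(periodic_acf(x, S=12, s=3, lag=12))  # 1.0
```

Computing this statistic for each month s separately, as in Figure 2.3, reveals whether the correlation structure changes over the seasons.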

2.4.1 Periodic UC model for U.S. unemployment series

We have shown that the U.S. unemployment series is subject to periodic dynamics and

therefore we continue our analysis by considering a PUC model, consisting of trend,

season, cycle and irregular components as given in (2.33). The disturbance variances

for all components are allowed to be periodic.

Further we cast the model into a state space form containing 6S + 1 fixed, but

unknown parameters, ση,s, σζ,s, σω,s, σκ,s, σε,s, ρs and λ for s = 1, . . . , S. The parameters

are estimated by numerically maximising the exact log-likelihood. In our case with

S = 12, we have to estimate 73 parameters. The parameters ση,s and σε,s are estimated

at zero for all seasons. For σζ,s we obtain one nonzero estimate at s = 9. We present

estimated components and associated parameters in Figures 2.4, 2.5, and 2.6.

Figure 2.4 presents the estimated trend component and the seasonally adjusted

series. The top panel shows the trend, µt, together with the unadjusted observations,

yt. In the bottom panel we show the trend and the seasonally adjusted data, yt−γt. The

trend is very smooth, it increases steadily until early 1980 and flattens out afterwards.

The difference between the two series in the bottom graph, yt−γt−µt can be interpreted

as the estimated cycle, ψt, as the irregular component, εt, is estimated as zero.

Figure 2.5 shows the estimated slope, βt, and the cyclical component, ψt. The

corresponding parameter estimates are shown underneath. The slope has a somewhat

surprising shape, jumping in discrete steps. This occurs because the slope is a random

walk with zero innovations for all months except October (s = 9). Further, the slope

remains stepwise periodic throughout the sample but it becomes nearly constant after

1982, which results in a smoother trend. The cyclical component shows large swings

throughout the whole sample.

The bottom panel of Figure 2.5 shows the estimated parameters for the trend and

cycle, where we present σζ,s, σκ,s and ρs for s = 1, . . . , 12. The estimated slope param-

eters, σζ,s, show a particular form of periodicity, they are all zero except for σζ,9. This

might be interpreted as a structural demographic change in the unemployment trend

due to newcomers on the job market in October (for example fresh-graduated students



Figure 2.4: Top panel – log of U.S. unemployment series with the estimated trend from the

PUC model. Bottom panel – the model based seasonally adjusted series together with the

estimated trend.

who start to look for a job), or as a structural labour saving technological change in

the production process. For example, Krane and Wascher (1999) mentioned that pro-

duction lines for new models in the U.S. motoring industry are usually introduced in

the fall. Note that the nonzero σζ,9 in the bottom left graph of Figure 2.5 corresponds

to the slope in month 10, hence October, as βt+1 = βt + ζt with ζt ∼ NID(0, σ²ζ,s).

We also estimated the PUC model with a non-periodic trend variance as the evidence

for periodicity in this parameter is not very strong. The maximised log-likelihood and

other parameter estimates only slightly changed. Therefore we only report the periodic

trend specification results as they clearly illustrate the mechanics of PUC modelling.

The damping factors of the cycle, ρs, look periodic as they have different values for

each month and vary between 0.8 and 1.1. Despite the fact that some ρ’s are estimated

to be larger than one, the cyclical component is still stationary as the product of all

the ρ’s is less than one. The lowest estimate of ρ is found in June (ρ5) and the highest

in December (ρ11). Note that ρ1 is linked with the transition equation for February

and ρ12 appears in the equation for January. The shocks in the cyclical component

have periodic standard errors that vary between 0.03 and 0.07 throughout the year. To

conclude our discussion of the trend and cycle estimates we report that λ = 0.049 with

a standard error of 0.013. This corresponds to an estimate of the average cycle length

of 10.7 years with a standard error of 2.8 years.
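The reported cycle length and its standard error follow directly from the frequency estimate; the conversion, with a delta-method standard error (a standard first-order approximation), can be checked as follows:

```python
import math

lam, se_lam = 0.049, 0.013  # estimated cycle frequency (per month) and s.e.
period_years = 2 * math.pi / lam / 12
se_years = (2 * math.pi / lam ** 2) * se_lam / 12  # delta method
print(round(period_years, 1), round(se_years, 1))  # 10.7 2.8
```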


Our U.S. unemployment series aggregates unemployment levels from different sec-

tors. To find out which sector determines the large periodicity in cyclical unemployment

components reflected in low ρ5 and high ρ11, we should probably analyse disaggregate

unemployment data as discussed above in subsection 2.3.5. A multivariate model might

be used to distinguish the sectors with dominant periodic cyclical features, but this is

beyond the scope of this study.


Figure 2.5: Estimated slope and cycle components (top row) and the corresponding periodic

parameters ± 2 standard errors (bottom row) for the PUC model.

Figure 2.6 provides a closer look at the estimate of the periodic seasonal component

γt. The top panel shows γt throughout the whole sample period and the middle panel

shows the annual sub-plots for each month separately. The seasonal pattern for May,

June and July turns out to be more volatile than for other months. This is especially

clear during the years 1965 - 1970 as one can see from the top panel. As intended, the

estimated changes in the seasonal pattern seem to be unrelated to estimated changes

in the cyclical component or the trend component. The bottom panel shows how the

periodic volatility in the seasonal component is reflected in the standard errors of the

seasonal shocks, σω,s.

Note that the standard errors of the two other possible types of shocks in the model,

σε,s and ση,s, are excluded from the graphs as they are estimated as zero and subse-

quently fixed. The remaining 73 − 24 = 49 parameters are estimated unrestrictedly.

Also note that each estimated parameter has its own standard deviation so that the

confidence bands around these parameters change by month of the year too.



Figure 2.6: Monthly seasonal component. Top: γt, t = 1, . . . , 696, Middle: γs,t∗ , t∗ =

1, . . . , 58, s = 1, . . . , 12. Bottom: σω,s ± 2 s.e., s = 1, . . . , 12.

In order to formally test whether the ρ’s and σκ’s are periodic we also estimate the

model with additional restrictions. The first set of restrictions imposes that all ρ’s are

equal. This results in a more volatile trend estimate and in a shorter average period for

the cycle, namely 5.3 years instead of the 10.7 years we obtained for the unrestricted

model. The second set of restrictions imposes that all σκ’s are equal. This also results

in a more volatile trend estimate and in an even shorter cyclical period of 3.2 years.

We observe that if ρ or σκ is restricted to be non-periodic, we obtain a large λ, while if

ρ’s and σκ’s are both unrestricted, we get a small λ, and therefore a longer average length

of the cycle. The (partly) non-periodic UC model attributes a different proportion of

the cyclical variation to the trend component. This illustrates that periodic and non-

periodic UC models can imply different empirical decompositions of time series.

Table 2.2 compares the likelihoods of the periodic and (partly) non-periodic models.

The total number of parameters in the PUC model is 38 as we do not count the 11 zeros

in the trend variances. The LR-tests for nonperiodicity in the cyclical component are

highly significant (compared to the χ²11 critical value for a significance level α = 0.05)

which indicates that all ρs and σκ,s are periodic indeed. An additional LR statistic

examines the overall periodicity of the PUC model, testing whether all variances and

damping factors are equal throughout the year against our periodic alternative. The


test clearly rejects at 5% significance level. Table 2.2 confirms the conclusion from

the data analysis and the graphical assessment of the time series decomposition that

periodicity is statistically significant in U.S. unemployment series.
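The LR statistics in Table 2.2 can be reproduced directly from the reported log-likelihoods; a minimal sketch (critical values are taken from the table rather than computed):

```python
def lr_test(loglik_u, loglik_r, crit):
    """LR statistic 2(logL_unrestricted - logL_restricted) and the
    rejection decision at the supplied chi-squared critical value."""
    lr = 2.0 * (loglik_u - loglik_r)
    return lr, lr > crit

# H0,2 in Table 2.2: all rho_s equal (k = 11, 5% critical value 19.68).
lr, reject = lr_test(1116.13, 1090.69, crit=19.68)
print(round(lr, 2), reject)  # 50.88 True
```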

Table 2.2: Likelihood Ratio tests for periodicity in UC models (sample 1948 − 2005)

Model                                        logL       p     LR        k    χ²k;0.05
H1,1: PUC model                              1116.13   38
H0,1: all σκ,s equal                         1101.28   27    29.70∗    11    19.68
H0,2: all ρs equal                           1090.69   27    50.88∗    11    19.68
H0,3: all PUC parameters are non-periodic    1061.28    5   109.70∗    33    47.39

Notes: logL = log-likelihood. LR = LR test against H1,1. p is the number of estimated parameters. k is the number of restricted parameters. H1,1: PUC model in (2.33) with σ²η,s = σ²ε,s = 0, s = 1, . . . , 12. ∗: H0,i is rejected at a nominal 5% significance level.

Figure 2.7 displays graphical diagnostics for the PUC model. The top panel shows

the standardised one-step ahead prediction errors and the bottom panel shows the

sample autocorrelation function. Most aspects of these diagnostics appear to be satis-

factory, but there is one notable exception: the forecast errors are clearly more volatile

in the first half of the series compared to the second half. This is an important and

well-known empirical finding that we address in the next subsection.

2.4.2 Cycle variance moderation in U.S. unemployment series

It appears from Figure 2.7 that the forecast error variance for U.S. unemployment

series is much lower after 1980 than before. This structural volatility change is a well

documented stylised fact occurring in several U.S. macroeconomic time series, see Kim

and Nelson (1999), McConnell and Perez-Quiros (2000), Stock and Watson (2002),

Sensier and van Dijk (2004) and Kim et al. (2004). A structural explanation of this

decrease in volatility is still a matter of discussion. In monetary economics, the decrease

in U.S. unemployment volatility can be associated with a change in monetary policy.

Warne and Vredin (2006) find support for this explanation in a bivariate Structural

Vector Autoregressive (SVAR) model for inflation and unemployment, where volatility

breaks are captured by endogenous two-regime Markov-Switching. Primiceri (2005)

captures the changing variance in a stochastic volatility specification of a trivariate

SVAR, also including an interest rate. Sims and Zha (2006) analyse a SVAR with six

variables switching between nine regimes. All these authors use seasonally adjusted data

and find significant changes in U.S. unemployment volatility. As the forecast standard



Figure 2.7: Time series plot of scaled one-step-ahead prediction errors (top panel) and sample

ACF for the PUC model (bottom panel). Note the heteroskedasticity in the residuals.

error has decreased by 50% this is a crucial issue for realistic interval forecasting at the

end of our sample.

We take the approach of Sensier and van Dijk (2004) by allowing for breakpoints

in the stochastic process governing U.S. unemployment, starting with one breakpoint.

By estimating the parameters for two sub-samples, we found that large increases in the likelihood are obtained only by varying the σκ parameters (the standard deviations of the cycle shocks). Formally, we extend the variances of the cycle component to σ²κ,τ,s, adding an extra regime index τ, with

τ = I,  for t in the period from 1948.1 to h.12,
τ = II, for t in the period from (h + 1).1 to 2005.12,

where h is the breakpoint year. The largest increase in the likelihood was found by

having the break in the early 1980s, which is in line with results in the literature.

The differences in the likelihoods for different breakpoints in the early 1980s are small.

We decided to fix the breakpoint in 1982. Warne and Vredin (2006) found additional

variance switches before 1982, but for the purpose of this chapter we found one variance

break to be satisfactory.
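In the state space implementation this amounts to selecting, at each time point, which regime's σκ,s enters the disturbance covariance. A sketch of that bookkeeping (the function name is ours, and the flat schedules use the period averages reported in the text merely as placeholders):

```python
def cycle_sigma(year, month, sigma_I, sigma_II, break_year=1982):
    """Cycle disturbance standard deviation sigma_{kappa,tau,s}:
    regime I up to December of break_year - 1, regime II afterwards.
    sigma_I / sigma_II hold one value per month, index 0 = January."""
    regime = sigma_I if year < break_year else sigma_II
    return regime[month - 1]

# Illustrative flat schedules at the two period averages.
sigma_I, sigma_II = [0.06] * 12, [0.03] * 12
print(cycle_sigma(1981, 12, sigma_I, sigma_II),
      cycle_sigma(1982, 1, sigma_I, sigma_II))  # 0.06 0.03
```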

We emphasise that while the cyclical shock volatilities σκ,s are distinctly different


before and after 1982, the other variances in the model are similar for the two eras.

On average, σκ was 0.06 in 1948−1981 and 0.03 after 1982. For individual months, the σκ's in the two regimes can differ by a factor of 3. These results

agree with those obtained by Kim et al. (2004), who compared univariate UC models

with structural variance breaks and Markov Switching. They showed that “the growth

rate of aggregate real GDP has been less volatile since the early 1980’s, and that this

volatility reduction is concentrated in the cyclical component of real GDP”. As the

variance in unemployment is closely related to the variance in GDP growth, it is not

surprising to find a similar phenomenon in the unemployment series.

Table 2.3: Estimation results for models with variance moderation in 1982

                 UC          PUC
logL         1131.04      1188.64
AIC         −2250.08     −2277.28
AICc        −2249.94     −2269.21
BIC         −2222.81     −2050.01
LR                        115.20∗
N               0.06         1.20
Q(96)        131.54∗      119.58∗
Q(144)       189.01∗      173.29∗
p                  6          50†
n∗S              696          696

Notes: Data sample used for estimation is from January 1948 until December 2005. logL = log-likelihood. AIC = Akaike Information Criterion. AICc = AIC with finite sample correction. BIC = Bayesian Information Criterion. LR is the LR test of the non-periodic model against its periodic counterpart. N is a normality test on the prediction errors, asymptotically χ²(2) distributed. Q is a Portmanteau autocorrelation test, Q(l) asymptotically χ²(l − p) distributed, where l is the number of lags and p is the number of parameters. ∗: the test rejects at a 5% significance level. †: the number of parameters is calculated as 49 + 12 (cycle variance moderation parameters σκ,s,1982) − 11 (zeros in σζ,s). n∗S is the number of data points.

Table 2.3 reports the log-likelihood values, Akaike information criteria (AIC), AIC

with finite sample correction (AICc), Bayes information criteria (BIC) and tests on the

one-step-ahead prediction errors for the models with variance moderation. Following Brockwell and

Davis (1993), the AIC and AICc statistics are defined as

AIC = −2 logL + 2p,

and

AICc = −2 logL + 2np/(n − p − 1).
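These criteria are easy to verify from the table entries; a small sketch (the BIC uses the usual p·log n penalty, which reproduces the table up to rounding):

```python
import math

def info_criteria(loglik, p, n):
    """AIC, finite-sample corrected AICc, and BIC (penalty p*log n)."""
    aic = -2.0 * loglik + 2.0 * p
    aicc = -2.0 * loglik + 2.0 * n * p / (n - p - 1)
    bic = -2.0 * loglik + p * math.log(n)
    return aic, aicc, bic

# PUC column of Table 2.3: logL = 1188.64, p = 50, n = 696.
aic, aicc, bic = info_criteria(1188.64, 50, 696)
print(round(aic, 2), round(bic, 2))  # -2277.28 -2050.01
```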


Figure 2.8: Graphical diagnostics for models reported in Table 2.2. From left to right: Standardised one-step-ahead prediction errors time series plot, histogram and non-parametric density estimate, sample ACF for the whole sample (from 1948 − 2005) and sample ACF for the second half (from 1982 − 2005) of the series. First row: non-periodic UC model, Second row: PUC model.


By introducing a breakpoint in 1982, heteroskedasticity and non-normality of the one-

step prediction errors have largely disappeared. The log-likelihoods show a dramatic

improvement because of the variance moderation. For example, in the PUC model we

have an increase in the log-likelihood of 72, from 1116 to 1188, by introducing twelve

extra cycle variance moderation parameters. This increase is highly significant.

Comparing the periodic and non-periodic UC models’ AIC and AICc, the periodic

model performs best. The non-periodic model is preferred according to the BIC, as

the BIC implies a heavier penalty on the number of free parameters than the AIC and

AICc. The LR-test is convincingly in favour of the periodic model. The test value is

115.20 which is much higher than the 5% critical value of a χ²44 distribution.

Figure 2.8 presents graphical diagnostics for the whole sample and for the second

part of the sample. For forecasting we are primarily interested in the adequacy of the

model in the second part of the sample. The diagnostics are now satisfactory. Some

serial correlation in the residuals remains, but the cyclical nature of the ACF for the

PUC forecast errors has disappeared after 1982, see the last row and last column of

Figure 2.8.

Figure 2.9 shows the changes in the cyclical variances. The most remarkable changes

occur in the last three months of the year which were the most volatile months before

1982, turning into relatively tranquil months in the second part of the sample.

2.4.3 Forecasting weights and forecasting performance

In the previous subsections we have shown the effects of periodicity on the time series

decomposition of U.S. unemployment. This subsection concerns the effect on forecast-

ing. In order to interpret the relative forecasting performance of different models, it is

useful to compute the forecasting functions implied by the different specifications. In

particular, it is important to see how the weights of past observations in the forecast

functions differ from month to month.

Periodic observation weights for h-step ahead forecasting are defined similarly as for

non-periodic models, see Harvey and Koopman (2000), only now different weights apply

for different months of the year. Consider the optimal linear one-step ahead forecasting

function for the actual series

yt|t−1 = E[yt | y1, . . . , yt−1] = w1,s,t yt−1 + w2,s,t yt−2 + . . . + wt−1,s,t y1. (2.34)

In this notation we have made explicit that the forecasting function with s = 1 applies

when yt falls in month 1, the forecasting function with s = 2 applies when yt falls in

month 2, etc. The weights wi,s,t are simply the coefficients of these forecasting functions.

The exact forecasting function also depends on t, in contrast to forecasting functions

for pure periodic autoregressive models.



Figure 2.9: Periodic parameter estimates ± 2 standard errors for the PUC models without

and with variance moderation. Top: ρs for model without (left) and with (right) variance

moderation. Middle: σκ,s for model without variance moderation. Bottom: σκ,I,s and σκ,II,s

for respectively first and second part of the sample in model with variance moderation. See

also Figure 2.5.



Figure 2.10: Observation weights of yt−i, −i = −50, . . . , −1, for one-step ahead prediction of

the observation (yt) for the months March, June, September and December 2006.

The forecasting weights sum to unity in all models that we discuss in this chapter: w1,s,t + w2,s,t + . . . + wt−1,s,t = 1. The forecast weight functions of non-periodic models are very similar

across different months that are close to each other, especially towards the end of the

sample. For periodic models the weight functions are similar for observations exactly

one year apart, but the weight functions can be quite different for different months

in the same year. The forecasting weights are easy to compute for linear models in

state space form. We use the efficient algorithms derived by Koopman and Harvey

(2003). Similar periodic weight functions can be defined for filtered or smoothed (linear

combinations of) trend, cycle and seasonal components of yt, but we do not present

them here to save space.
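Computing the periodic weights requires the Koopman and Harvey (2003) recursions. For intuition only, here is the familiar non-periodic special case: in the steady state of a local level model the one-step-ahead forecast weights reduce to exponential smoothing, declining geometrically and summing to one (the smoothing constant 0.3 is an arbitrary illustrative value):

```python
def local_level_weights(kappa, m):
    """Weights w_i of y_{t-i} in the steady-state one-step-ahead
    forecast of a local level model: w_i = kappa * (1 - kappa)^(i-1),
    geometrically declining and summing to one as m grows."""
    return [kappa * (1 - kappa) ** (i - 1) for i in range(1, m + 1)]

w = local_level_weights(0.3, 50)
print(round(w[0], 2), round(sum(w), 6))  # 0.3 1.0
```

In the periodic models the same geometric decay is overlaid with the seasonal pattern visible in Figure 2.10.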

Figure 2.10 shows the weighting patterns for one-step ahead prediction of the obser-

vation y2006.s for the months March, June, September and December 2006, s = 3, 6, 9, 12.

We plot wi,s,t against −i, −i = −49, . . . ,−1, for s = 3, 6, 9, 12. The plots present the

wi in reverse order as the weight with the smallest index i corresponds to the most

recent observation. As expected, the plots show that the prediction of month t de-

pends heavily on the observation of month t − 1 which is depicted as the last bar of

each weight function. Scanning the graph from right to left, we see that the last bar

is preceded by small negative weights for the months t − 2 until t − 11. The second


large and positive bar presents the weight of month t − 12 from the year before, as

is common in seasonal time series models. The first large negative weight occurs for

t − 13, which is common for seasonal time series models with trends as it reflects the

difference operator. A yearly pattern then repeats itself and damps out. These weight

patterns clearly indicate the relevance of the periodic analysis for forecasting, as they

are quite distinct for the different months of the year.


Figure 2.11: Log U.S. unemployment series and one-step-ahead to twelve-step-ahead forecasts

with one-standard-error bands for 2006 using non-periodic (top panel) and periodic (bottom

panel) models, see also Table 2.3.

Figure 2.11 shows the 1 to 12-months ahead forecasts for 2006.1− 2006.12 together

with their 67% confidence interval for the models of Table 2.3 and compares them

with the actual unemployment values. The top row shows the forecasts of the non-

periodic model while the bottom row depicts the predictions of the periodic model.

The standard errors of the periodic models are generally smaller than the non-periodic

ones, as reflected by the tighter confidence bounds. The non-periodic model overpredicts

the unemployment series in all months of 2006 and the forecasts of the non-periodic

model are distinctly inferior to the forecasts of its periodic counterpart.

We previously mentioned that the periodic slope variance is estimated at zero except

for October. This might capture the structural change in the unemployment trend due

to new production lines in the auto industry as mentioned by Krane and Wascher


(1999). In Figure 2.11, it appears that the drop in October 2006 was more accurately

predicted by the PUC model.

2.5 Summary and conclusion

The primary aim of this study is to present a comprehensive periodic time series analy-

sis based on the univariate UC model with four periodic stochastic components, namely

a stochastic trend, a time-varying seasonal, a stochastic cycle and an irregular. Esti-

mation of the unknown parameters is feasible, also for monthly data. Estimates are

obtained by exact maximum likelihood with diffuse initialisation of non-stationary com-

ponents in the model.

In our application to the monthly postwar U.S. unemployment series, we discover

periodicity in all parameters, especially in the cyclical component. Periodic models fit

the data better than the non-periodic ones in terms of log-likelihood and out-of-sample

forecasts. As the variances of cyclical unemployment decreased markedly in the second

part of our sample we successfully incorporate a variance moderation into our model.

Both the level and the periodic pattern of the cyclical variances changed over time.

Our trend estimate of U.S. unemployment series shows a clear periodic pattern with

the largest conditional variance in October of each year. The observation weights for

forecasting also show some marked differences in the forecasting functions for different

months in the same year.

We conclude that when there are clearly periodic autocorrelations in a time series,

PUC models outperform their restricted non-periodic counterparts. We demonstrate

the performance of PUC model in terms of the RMSE of parameter estimates of a styl-

ized stochastic cycle model and we show increased forecasting precision in a detailed

empirical analysis of monthly aggregate U.S. unemployment series. As expected, out-

performance depends on the actual periodicity of the observed process. Model selection

criteria and likelihood ratio tests can indicate whether the periodic extension of UC

models is worth the effort. After fully periodic estimates are produced, one can often

reduce the number of parameters to obtain a useful partially periodic model.

2.6 Appendices

2.A State space formulations

The state space form provides a unified representation of a wide range of linear Gaussian

time series models including ARMA and UC time series models; see, for example,

Harvey (1989), Kitagawa and Gersch (1996) and Shumway and Stoffer (2000). The


Gaussian state space form consists of a transition equation (2.35) for the m × 1 state

vector αt and a measurement equation (2.36) for the N × 1 observation vector yt for

t = 1, . . . , n. We formulate the model as in de Jong (1991), that is

αt+1 = Tt αt + Ht εt,    α1 ∼ N(a, P),    t = 1, . . . , n,    (2.35)

yt = Zt αt + Gt εt,    εt ∼ NID(0, I),    (2.36)

where εt is an independent sequence of standard normally distributed random vectors.

The state vector αt contains the unobserved components and their associated vari-

ables. The transition equation (2.35) is used to formulate the dynamic processes of

the unobserved components. The measurement equation (2.36) relates the observation

yt to the state vector αt through the signal Ztαt. The matrices Tt, Ht, Zt and Gt are

referred to as the state space system matrices and possibly time-varying. Specific ele-

ments of the system matrices may be specified as functions of an unknown parameter

vector. The initial state vector is α1 with mean vector a and variance matrix P . Model

(2.35)-(2.36) is linear and driven by Gaussian disturbances. Therefore, the state space

model can be treated by standard time series methods based on the Kalman filter; see,

for example, Anderson and Moore (1979) and Durbin and Koopman (2001).
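As a concrete illustration (ours, not the thesis's; the thesis works with SsfPack), the univariate local level model µt+1 = µt + ηt, yt = µt + εt can be cast in the de Jong form (2.35)-(2.36) and simulated with NumPy. All numerical values below are arbitrary:

```python
import numpy as np

# Illustrative sketch: the local level model written in form (2.35)-(2.36)
# with a joint disturbance vector eps_t = (e_{1,t}, e_{2,t})' ~ NID(0, I).
sigma_eta, sigma_eps = 0.5, 1.0
T = np.array([[1.0]])             # transition matrix T_t
H = np.array([[sigma_eta, 0.0]])  # state disturbance loading H_t
Z = np.array([[1.0]])             # measurement matrix Z_t
G = np.array([[0.0, sigma_eps]])  # measurement disturbance loading G_t

rng = np.random.default_rng(0)
n = 100
alpha = np.array([0.0])           # initial state alpha_1 (mean a = 0)
y = np.empty(n)
for t in range(n):
    eps = rng.standard_normal(2)
    y[t] = (Z @ alpha + G @ eps).item()   # measurement equation (2.36)
    alpha = T @ alpha + H @ eps           # transition equation (2.35)
```

Because H and G pick out different elements of εt, the trend and measurement disturbances are mutually uncorrelated, as the model requires.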

The variance matrix P of the initial state vector α1 may contain diffuse elements

when non-stationary components are included in αt. In this case, diffuse initialisation

methods for the Kalman filter exist to evaluate the exact or diffuse likelihood func-

tion, see Ansley and Kohn (1985), de Jong (1991), and Koopman (1997). The diffuse

likelihood function is optimized to obtain exact maximum likelihood estimates of the

parameters. The suite of state space routines SsfPack 3.0 of Koopman et al. (2008) offers

a numerically stable implementation of the necessary diffuse likelihood computations.

Furthermore, we employ the Broyden-Fletcher-Goldfarb-Shanno (BFGS) numerical op-

timisation algorithm of Fletcher (1987) to maximise the likelihood function with respect

to unknown parameters.

2.B Kalman filter

This appendix summarises the Kalman filter algorithm as described in Durbin and

Koopman (2001). Both the observations yt and the unobserved state vector αt are

Gaussian processes. The Kalman filter is a recursive algorithm that estimates the

mean and variance of αt conditional on y1, . . . , yt. Starting with a1, P1, the estimates


are updated through

vt = yt − Zt at,
Ft = Zt Pt Zt′ + Gt Gt′,
Kt = Tt Pt Zt′ Ft⁻¹,
at+1 = Tt at + Kt vt,
Pt+1 = Tt Pt Tt′ + Ht Ht′ − Kt Ft Kt′,        t = 1, . . . , n,        (2.37)

where at = E(αt|y1, . . . , yt) and Pt = Var(αt|y1, . . . , yt).
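Recursion (2.37) translates directly into code. The following NumPy sketch (ours, not the thesis's SsfPack implementation) handles the univariate case N = 1 with time-invariant system matrices and a proper (non-diffuse) initial state:

```python
import numpy as np

def kalman_filter(y, T, Z, H, G, a1, P1):
    """Run recursion (2.37) for a univariate series y (N = 1) with
    time-invariant system matrices. Returns innovations v_t and their
    variances F_t."""
    n = len(y)
    a = a1.astype(float).reshape(-1, 1)   # state mean, m x 1
    P = P1.astype(float)                  # state variance, m x m
    v = np.empty(n)
    F = np.empty(n)
    for t in range(n):
        v[t] = y[t] - (Z @ a).item()              # v_t  = y_t - Z_t a_t
        F[t] = (Z @ P @ Z.T + G @ G.T).item()     # F_t  = Z_t P_t Z_t' + G_t G_t'
        K = (T @ P @ Z.T) / F[t]                  # K_t  = T_t P_t Z_t' F_t^{-1}
        a = T @ a + K * v[t]                      # a_{t+1}
        P = T @ P @ T.T + H @ H.T - K @ K.T * F[t]  # P_{t+1}
    return v, F
```

For the local level system matrices of Appendix 2.A, the recursion converges to a steady state in which Ft is constant.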

2.C Kalman smoother

This appendix summarises the Kalman smoother algorithm as described in Durbin and

Koopman (2001). The Kalman smoother provides an efficient recursion to calculate the

mean and variance of αt conditional on the entire sample y1, . . . , yn, and is given by

Lt = Tt − Kt Zt,
rt−1 = Zt′ Ft⁻¹ vt + Lt′ rt,
Nt−1 = Zt′ Ft⁻¹ Zt + Lt′ Nt Lt,
α̂t = at + Pt rt−1,
Vt = Pt − Pt Nt−1 Pt,        t = n, . . . , 1,        (2.38)

with rn = 0, Nn = 0, α̂t = E(αt|y1, . . . , yn) and Vt = Var(αt|y1, . . . , yn).
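A self-contained NumPy sketch of the backward pass (2.38) follows; it first runs the forward recursion (2.37) to store the quantities the smoother needs. As before, this is our illustration for the univariate, time-invariant case, not the thesis's own implementation:

```python
import numpy as np

def kalman_smoother(y, T, Z, H, G, a1, P1):
    """Forward filter (2.37), then backward recursion (2.38).
    Returns smoothed means alphahat_t = E(alpha_t | y_1..y_n) and
    variances V_t = Var(alpha_t | y_1..y_n)."""
    n, m = len(y), T.shape[0]
    a = np.zeros((n, m, 1)); P = np.zeros((n, m, m))
    v = np.zeros(n); F = np.zeros(n); K = np.zeros((n, m, 1))
    a[0], P[0] = a1.astype(float).reshape(m, 1), P1.astype(float)
    for t in range(n):                                  # forward pass (2.37)
        v[t] = y[t] - (Z @ a[t]).item()
        F[t] = (Z @ P[t] @ Z.T + G @ G.T).item()
        K[t] = (T @ P[t] @ Z.T) / F[t]
        if t < n - 1:
            a[t + 1] = T @ a[t] + K[t] * v[t]
            P[t + 1] = T @ P[t] @ T.T + H @ H.T - K[t] @ K[t].T * F[t]
    r = np.zeros((m, 1)); N = np.zeros((m, m))          # r_n = 0, N_n = 0
    alphahat = np.zeros((n, m, 1)); V = np.zeros((n, m, m))
    for t in range(n - 1, -1, -1):                      # backward pass (2.38)
        L = T - K[t] @ Z                                # L_t = T_t - K_t Z_t
        r = Z.T * (v[t] / F[t]) + L.T @ r               # r_{t-1}
        N = Z.T @ Z / F[t] + L.T @ N @ L                # N_{t-1}
        alphahat[t] = a[t] + P[t] @ r                   # alphahat_t
        V[t] = P[t] - P[t] @ N @ P[t]                   # V_t
    return alphahat, V
```

Since the smoother conditions on the full sample, the variances Vt are never larger than the corresponding filtered variances Pt.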

2.D Likelihood evaluation

In econometric applications, the state space model usually depends on unknown pa-

rameters in the matrices Tt, Zt, Ht, and Gt. The log-likelihood function for the state

space model (2.35)-(2.36) is given by

ℓ = log p(y1, . . . , yn; ϕ) = ∑_{t=1}^{n} log p(yt|y1, . . . , yt−1; ϕ)

  = −(nN/2) log(2π) − (1/2) ∑_{t=1}^{n} ( log |Ft| + vt′ Ft⁻¹ vt ),        (2.39)

where ϕ is the parameter vector of the statistical model under consideration. The

innovations vt and their variances Ft are computed via the Kalman filter for a given

vector ϕ. The computational details of the Kalman filter are discussed in Koopman

et al. (1999).
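Given the innovations vt and variances Ft produced by the Kalman filter, evaluating (2.39) for a univariate series (N = 1, scalar Ft) is a one-liner; a small sketch of ours:

```python
import numpy as np

def loglik(v, F):
    """Gaussian log-likelihood (2.39) for a univariate series (N = 1),
    given Kalman filter innovations v_t and their variances F_t."""
    n = len(v)
    return -0.5 * n * np.log(2 * np.pi) - 0.5 * np.sum(np.log(F) + v**2 / F)
```

In practice this function is evaluated inside a numerical optimiser (for instance a BFGS-type quasi-Newton routine, as used in the thesis) that searches over the parameter vector ϕ.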


2.E Periodic BSM for S = 2

To investigate whether all parameters in the UC models are identified, we use systems of

moment conditions. The moment conditions are based on the autocovariance function

of the stationary form of the UC models. Consider model (2.6) with S = 2 where we

have 6 parameters to estimate, namely σε,1, σε,2, ση,1, ση,2, σω,1 and σω,2. The stationary

form of model (2.6) is based on annual differences ∆Syt = (1−LS)yt of the observations

yt. For S = 2, that is

∆2yt+i = ηt+i−2 + ηt+i−1 − ωt+i−2 + ωt+i−1 + εt+i − εt+i−2, (2.40)

for i = . . . , −2, −1, 0, 1, 2, . . . and t = 1, S + 1, 2S + 1, . . . . It follows that E[∆2yt+i] = 0 for all i. The annual autocovariance function for t = 1, S + 1, 2S + 1, . . . is given by

Γ0 = E[ (∆2yt, ∆2yt+1)′ (∆2yt, ∆2yt+1) ]

   = ( σ²η,1 + σ²η,2 + σ²ω,1 + σ²ω,2 + 2σ²ε,1        σ²η,2 − σ²ω,2
       σ²η,2 − σ²ω,2        σ²η,1 + σ²η,2 + σ²ω,1 + σ²ω,2 + 2σ²ε,2 ),        (2.41)

Γ1 = E[ (∆2yt, ∆2yt+1)′ (∆2yt−2, ∆2yt−1) ] = ( −σ²ε,1    σ²η,1 − σ²ω,1
                                                 0        −σ²ε,2 ),        (2.42)

Γj = E[ (∆2yt, ∆2yt+1)′ (∆2yt−2j, ∆2yt+1−2j) ] = 0,   for j = 2, 3, 4, . . . ,        (2.43)

which is equivalent to the autocovariance function of a vector moving average process
with one lag. Note that the autocovariance matrix of the first lag, Γ1, is not symmetric,
namely

Γ−1 = E[ (∆2yt, ∆2yt+1)′ (∆2yt+2, ∆2yt+3) ] = ( −σ²ε,1            0
                                                  σ²η,1 − σ²ω,1   −σ²ε,2 ) = Γ1′.        (2.44)

These moment expressions reconfirm that we do not have a standard multivariate local

level model for y∗t∗ .

To identify the parameters from the autocovariances, we need to solve a linear

system of moment equations. Asymptotically, the expressions for the autocovariances

determine the Gaussian likelihood that we use in estimation. If two instances of a

time series model with different parameters have the same autocovariance function

and therefore the same spectrum and the same moving average representation, the

parameters cannot be identified by the Gaussian ML estimator, see Brockwell and

Davis (1993, § 10.8) and Yao and Brockwell (2006).


Rewriting expressions (2.41)-(2.43) we get the system of moment equations for model

(2.6)

⎛ 1  1  1  1  2  0 ⎞ ⎛ σ²η,1 ⎞   ⎛ Γ0(1,1) ⎞
⎜ 0  1  0 −1  0  0 ⎟ ⎜ σ²η,2 ⎟   ⎜ Γ0(1,2) ⎟
⎜ 1  1  1  1  0  2 ⎟ ⎜ σ²ω,1 ⎟ = ⎜ Γ0(2,2) ⎟ ,        (2.45)
⎜ 0  0  0  0 −1  0 ⎟ ⎜ σ²ω,2 ⎟   ⎜ Γ1(1,1) ⎟
⎜ 1  0 −1  0  0  0 ⎟ ⎜ σ²ε,1 ⎟   ⎜ Γ1(1,2) ⎟
⎝ 0  0  0  0  0 −1 ⎠ ⎝ σ²ε,2 ⎠   ⎝ Γ1(2,2) ⎠

where Γj(i, k) indicates the element on row i and column k of matrix Γj. The system of
equations (2.45) clearly has multiple solutions as the first matrix only has rank 5.
The null space of this matrix is spanned by the vector (1, −1, 1, −1, 0, 0)′. Hence,
we have to impose a restriction on the first four parameters to obtain identification, for
example

σ²η,1 − σ²η,2 + σ²ω,1 − σ²ω,2 = 0.        (2.46)

Note that the restriction is not unique; we can also impose

σ²η,1 = σ²η,2    or    σ²ω,1 = σ²ω,2.        (2.47)
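The rank deficiency of system (2.45) is easy to verify numerically; a small sketch of ours:

```python
import numpy as np

# Coefficient matrix of the moment system (2.45) for S = 2; columns ordered as
# (s2_eta1, s2_eta2, s2_omega1, s2_omega2, s2_eps1, s2_eps2).
A = np.array([[1, 1, 1, 1, 2, 0],
              [0, 1, 0, -1, 0, 0],
              [1, 1, 1, 1, 0, 2],
              [0, 0, 0, 0, -1, 0],
              [1, 0, -1, 0, 0, 0],
              [0, 0, 0, 0, 0, -1]], dtype=float)

null = np.array([1, -1, 1, -1, 0, 0], dtype=float)
print(np.linalg.matrix_rank(A))     # 5: one direction in parameter space is unidentified
print(np.allclose(A @ null, 0))     # True: null space spanned by (1, -1, 1, -1, 0, 0)'
```

Any parameter vector shifted along the null direction yields the same autocovariances, which is exactly why a restriction such as (2.46) or (2.47) is needed.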

This identification problem for S = 2 also applies to extended versions of the model,

for example when a periodic slope is added in equation (2.6). For S > 2 the reduced

rank problem does not occur any longer.

2.F Periodic BSM for S = 3

This appendix extends the analysis of equation (2.6) to S = 3. The stationary form of

the periodic BSM model in equation (2.6) for t = 1, S + 1, 2S + 1, . . . with S = 3 is

given by

∆3yt   = ηt−3 + ηt−2 + ηt−1 − ωt−2 + ωt−1 + εt − εt−3,        (2.48)
∆3yt+1 = ηt−2 + ηt−1 + ηt − ωt−1 + ωt + εt+1 − εt−2,        (2.49)
∆3yt+2 = ηt−1 + ηt + ηt+1 − ωt + ωt+1 + εt+2 − εt−1.        (2.50)


The autocovariance function for t = 1, S + 1, 2S + 1, . . . is given by

Γ0 = E[ (∆3yt, ∆3yt+1, ∆3yt+2)′ (∆3yt, ∆3yt+1, ∆3yt+2) ]

   = ( A                          σ²η,2 + σ²η,3 − σ²ω,3     σ²η,3
       σ²η,2 + σ²η,3 − σ²ω,3     B                          σ²η,1 + σ²η,3 − σ²ω,1
       σ²η,3                      σ²η,1 + σ²η,3 − σ²ω,1     C ),

Γ1 = E[ (∆3yt, ∆3yt+1, ∆3yt+2)′ (∆3yt−3, ∆3yt−2, ∆3yt−1) ]

   = ( −σ²ε,1    σ²η,1     σ²η,1 + σ²η,2 − σ²ω,2
        0        −σ²ε,2    σ²η,2
        0         0        −σ²ε,3 ),

Γj = E[ (∆3yt, ∆3yt+1, ∆3yt+2)′ (∆3yt−3j, ∆3yt+1−3j, ∆3yt+2−3j) ] = 0   for j ≥ 2,

where A = σ²η,1 + σ²η,2 + σ²η,3 + σ²ω,2 + σ²ω,3 + 2σ²ε,1, B = σ²η,1 + σ²η,2 + σ²η,3 + σ²ω,1 + σ²ω,3 + 2σ²ε,2, and C = σ²η,1 + σ²η,2 + σ²η,3 + σ²ω,1 + σ²ω,2 + 2σ²ε,3.

Identifiability can be shown by solving the following system of equations

⎛ 1 1 1  0  1  1  2  0  0 ⎞              ⎛ Γ0(1,1) ⎞
⎜ 1 1 1  1  0  1  0  2  0 ⎟              ⎜ Γ0(2,2) ⎟
⎜ 1 1 1  1  1  0  0  0  2 ⎟  ⎛ σ²η,1 ⎞  ⎜ Γ0(3,3) ⎟
⎜ 1 0 1 −1  0  0  0  0  0 ⎟  ⎜ σ²η,2 ⎟  ⎜ Γ0(3,2) ⎟
⎜ 1 1 0  0 −1  0  0  0  0 ⎟  ⎜ σ²η,3 ⎟  ⎜ Γ1(1,3) ⎟
⎜ 0 1 1  0  0 −1  0  0  0 ⎟  ⎜ σ²ω,1 ⎟  ⎜ Γ0(2,1) ⎟
⎜ 1 0 0  0  0  0  0  0  0 ⎟  ⎜ σ²ω,2 ⎟ = ⎜ Γ1(1,2) ⎟ ,
⎜ 0 1 0  0  0  0  0  0  0 ⎟  ⎜ σ²ω,3 ⎟  ⎜ Γ1(2,3) ⎟
⎜ 0 0 1  0  0  0  0  0  0 ⎟  ⎜ σ²ε,1 ⎟  ⎜ Γ0(3,1) ⎟
⎜ 0 0 0  0  0  0 −1  0  0 ⎟  ⎜ σ²ε,2 ⎟  ⎜ Γ1(1,1) ⎟
⎜ 0 0 0  0  0  0  0 −1  0 ⎟  ⎝ σ²ε,3 ⎠  ⎜ Γ1(2,2) ⎟
⎝ 0 0 0  0  0  0  0  0 −1 ⎠              ⎝ Γ1(3,3) ⎠

where the vector on the right hand side consists of the different nonzero elements of
Γ0 and Γ1. The matrix on the left hand side has full column rank, resulting in
identifiability. The system is easily solved for σ²ε,s, σ²ω,s and σ²η,s, respectively.
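The full column rank, and hence the unique recovery of the variances from the moments, can be confirmed numerically; a small sketch of ours:

```python
import numpy as np

# Coefficient matrix of the S = 3 moment system; columns ordered as
# (s2_eta1, s2_eta2, s2_eta3, s2_om1, s2_om2, s2_om3, s2_eps1, s2_eps2, s2_eps3).
M = np.array([[1, 1, 1,  0,  1,  1,  2,  0,  0],
              [1, 1, 1,  1,  0,  1,  0,  2,  0],
              [1, 1, 1,  1,  1,  0,  0,  0,  2],
              [1, 0, 1, -1,  0,  0,  0,  0,  0],
              [1, 1, 0,  0, -1,  0,  0,  0,  0],
              [0, 1, 1,  0,  0, -1,  0,  0,  0],
              [1, 0, 0,  0,  0,  0,  0,  0,  0],
              [0, 1, 0,  0,  0,  0,  0,  0,  0],
              [0, 0, 1,  0,  0,  0,  0,  0,  0],
              [0, 0, 0,  0,  0,  0, -1,  0,  0],
              [0, 0, 0,  0,  0,  0,  0, -1,  0],
              [0, 0, 0,  0,  0,  0,  0,  0, -1]], dtype=float)

print(np.linalg.matrix_rank(M))       # 9: full column rank, all variances identified
theta = np.arange(1.0, 10.0)          # arbitrary test values for the nine variances
rhs = M @ theta                       # implied moment vector
recovered, *_ = np.linalg.lstsq(M, rhs, rcond=None)
print(np.allclose(recovered, theta))  # True: the variances are recovered uniquely
```

Since the system has 12 equations in 9 unknowns with full column rank, any 9 independent rows already pin the parameters down; the least-squares solve above is just a convenient way to invert the mapping.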

2.G Periodic correlations

Denote ys,t∗ as the observation for period s and year t∗ such that yt ≡ ys,t∗ , where
t = S(t∗ − 1) + s for t = 1, . . . , n∗S, t∗ = 1, . . . , n∗ and s = 1, 2, . . . , S. Sample periodic


correlations have been defined by, i.a., McLeod (1994). Consider the separate series in
{yt∗} where

{yt∗} = (y1,t∗   y2,t∗   · · ·   yS,t∗)′        (2.51)

and standardise the series ys,t∗ separately by subtracting the periodic means and by
dividing by the periodic standard deviations to get {yt∗}, where

{yt∗} = (y1,t∗   y2,t∗   · · ·   yS,t∗)′        (2.52)

for s = 1, . . . , S, t∗ = 1, 2, . . . , where t∗ is the index in years. If we associate y1,t∗ with
yt (series y at time t), y2,t∗ with yt+1 (series y at time t + 1), . . . , and yS,t∗ with yt+S−1
(series y at time t + S − 1), we can also view the standardised series as

{yt∗} = (yt   yt+1   · · ·   yt+S−1)′,   for t = 1, S + 1, 2S + 1, . . . .        (2.53)

Next, consider all the covariances between the standardised subseries, {yt∗}, and their
lags {yt∗−j}, where

{yt∗−j} = (y1,t∗−j   y2,t∗−j   · · ·   yS,t∗−j)′ = (yt−jS   yt+1−jS   · · ·   yt+S−1−jS)′,   for t = 1, S + 1, 2S + 1, . . . .

For S = 2 the periodic correlations γi,s, i = 1, 2, . . . , s = 1, . . . , S, are selected from the
multivariate correlation matrices of yt∗ as follows:

E[ (y1,t∗, y2,t∗)′ (y1,t∗, y2,t∗) ] = ( 1       γ1,−1
                                       γ2,1    1 ),

E[ (y1,t∗, y2,t∗)′ (y1,t∗−j, y2,t∗−j) ] = ( γ1,2j      γ1,2j+1
                                            γ2,2j+1    γ2,2j ),   j = 1, 2, . . . .

We compute the sample periodic correlations by noting that the sample correlations ci,s,
i = 1, 2, . . . , (n∗ − 1)S, s = 1, . . . , S are associated with the population correlations γi,s
defined above for S = 2 and without degrees of freedom corrections. The computation

is therefore basic. For S > 2 the computation is analogous.
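A sketch of this computation in NumPy (our illustration; the function name and interface are hypothetical). Each seasonal subseries is demeaned and scaled by its own standard deviation, without degrees-of-freedom corrections, after which the sample periodic correlation c_{i,s} is the average product of the standardised value in season s with its value i periods earlier:

```python
import numpy as np

def periodic_correlations(y, S, max_lag):
    """Hypothetical sketch: sample periodic autocorrelations c_{i,s}
    for a series y of length n* x S, following Appendix 2.G."""
    n_star = len(y) // S
    X = np.asarray(y[: n_star * S], dtype=float).reshape(n_star, S)  # rows = years
    X = (X - X.mean(axis=0)) / X.std(axis=0)   # periodic standardisation (ddof = 0)
    flat = X.ravel()                           # back to calendar-time order
    c = np.zeros((max_lag, S))
    for s in range(S):                         # season s + 1
        idx = np.arange(s, n_star * S, S)      # calendar positions of season s + 1
        for i in range(1, max_lag + 1):
            keep = idx[idx - i >= 0]           # drop terms before the sample start
            c[i - 1, s] = np.mean(flat[keep] * flat[keep - i])
    return c
```

A non-periodic series yields columns of c that are roughly equal across seasons; marked differences between the columns are the periodic autocorrelations that motivate the PUC models of this chapter.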


Chapter 3

Multivariate PUC models

3.1 Introduction

An unobserved components (UC) time series model is built of a number of stochastic

linear processes that typically capture trend, seasonal, cycle and remaining stationary

dynamic features in an observed time series. The basic theory and methodology is

laid out in Harvey (1989). Each stochastic component can be represented as an au-

toregressive integrated moving average (ARIMA) process. We can therefore consider

the UC model as a special case of the RegComponents framework of Bell (2004). The

typical parameters that need to be estimated in a UC model are the variances of the

innovations driving the components and a selection of other coefficients associated with

the ARIMA processes. For a seasonal time series, the UC model usually contains a

stochastic seasonal component that is able to capture the time-varying seasonal effects.

However, other dynamic features in the time series may also be subject to seasonal ef-

fects. For example, in case of an economic time series, cyclical effects may be relatively

more apparent or have a bigger impact in a particular season (winter) compared to

another season (summer). In such cases, we can let the coefficients associated with a

particular component be dependent on the season. We define this extension of the UC

model as a periodic UC (PUC) model and it is explored in Koopman and Ooms (2002,

2006) and in Chapter 2. In this chapter we extend the PUC approach into a multivari-

ate periodic unobserved components (MPUC) framework where several observed time

series are modelled simultaneously.

An early multivariate periodic analysis based on a multivariate UC model with a

stochastic cycle component is carried out by Krane and Wascher (1999). They anal-

ysed time series of U.S. quarterly employment data in nine different industrial sectors

including construction, motor vehicle manufacturing, durable and non-durable goods,

retail trade, government and mining. The sum of these sectors equals the total non-farm

employment numbers published by the Bureau of Labor Statistics (BLS). Sectoral employment time series are key in tracking economic activity (recessions and expansions)

and therefore they are intensively analysed by policy decision makers. The motivation

for the analysis was to study whether coefficients associated with the economic cycle

component are subject to periodic variations. The study of Krane and Wascher (1999)

found significant evidence of periodic variations in the cycle coefficients for more than

half of the sectors and we take their results as one of the motivations for our empirical

study. Although our modelling framework is also based on a multivariate UC time series

model, there are differences between our frameworks. Krane and Wascher modelled the

time series in first differences while in our study we consider the time series in levels.

Our model is also more elaborate since we include more stochastic components and

more periodic coefficients. The extensions are considered with the aim to avoid model

misspecification that may lead to inaccurate conclusions about the periodic nature of

the dynamic features in the multiple time series.

In our study we analyse seven employment sectors in accordance with the North

American Industry Classification System (NAICS), established in 1997. This dataset

for U.S. sectoral employment from the BLS is documented clearly and can be obtained

freely. We specifically examine whether our main findings hold for different sample

selections in our quarterly dataset that ranges from 1950 until 2009 and includes past

and more recent economic recession periods. In other words, how robust are the findings

when different subsamples of the dataset are considered? This study also shows that

we are able to fit MPUC models simultaneously without any univariate pre-analyses

of the data. The general model includes trend, season, cycle and irregular components

with variance matrices that can have different values for different quarters and that can

have different values for different subsamples of the data. In this general set-up we can

allow for periodicity and for the Great Moderation that refers to a reduction of overall

volatility in economic time series after the early 1980s. We also formally test for each

component whether the additional parameters for periodicity and the Great Moderation

are statistically significant. In summary, our empirical study is motivated by Krane and

Wascher (1999), our considered multivariate periodic model is comprehensive and the

parameters are estimated simultaneously for all equations and all components.

An observed seasonal time series with a sample autocorrelation function that changes

with the season is referred to as a periodic time series. To enable the identification of

these dynamic characteristics in a time series, Gladysev (1961) and Tiao and Grupe

(1980) have formally defined periodic autocorrelations using a stationary vector rep-

resentation of the periodic univariate time series. Once periodic properties of a time

series are detected, the time series analyst can consider time series models that allow

for these periodic correlations. Periodic time series models originate with Hannan

(1955) in the context of geophysics and environmental empirical studies. However,


periodic dynamic regression models for economic time series have been applied much

earlier, namely since the 1930s; see the review in Mendershausen (1937). A range of

periodic analyses for environmental and economic time series have appeared ever since

with the classical references given by Wallis (1978), Plosser (1979), Ghysels (1988,

1991), Barsky and Miron (1989), and Canova and Ghysels (1994). More specifically,

Osborn and Smith (1989) introduced the periodic time series framework in dynamic

macroeconomic models. A model-based periodic time series analysis becomes effective

when appropriate methods and algorithms are developed for estimation and diagnostic

checking. For this purpose, Ghysels and Osborn (2001) and Franses and Paap (2004)

have discussed a wide spectrum of periodic models with a focus on their econometric

implications. Maximum likelihood estimation methods for periodic autoregressive mov-

ing average models have been discussed by Vecchia (1985), Li and Hui (1988), Jimenez

et al. (1989) and Lund and Basawa (2000) while Anderson and Meerschaert (2005)

have provided asymptotic theory for efficient moment based estimation. Furthermore,

many environmental and economic studies have given empirical evidence that time se-

ries models require periodically changing parameters; see, for example, Osborn (1988),

Osborn and Smith (1989), Bloomfield et al. (1994), Ghysels and Osborn (2001) and

Franses and Paap (2004).

Periodic extensions of the UC class of time series models have been explored in a

variety of studies. Proietti (2004) considered a UC model with the trend component

modelled as a weighted average of separate independent random walks for each season.

Penzer and Tripodis (2007) studied the seasonal component with a periodic variance.

Koopman and Ooms (2002, 2006) explored different periodic specifications of the UC

model. In the context of RegComponent UC models Bell (2004) considered the effect of

seasonal heteroskedasticity in the irregular component on seasonal adjustment, see also

Findley (2005). Different types of periodicity in a UC model imply different optimal

seasonal adjustment filters. It is of general interest to investigate how we can identify

different types of periodicity both in theory and in practice.

The remainder of this chapter is organised as follows. Section 3.2 introduces

a general class of MPUC time series models. In an Appendix we provide more details

of how the models can be formulated in state space form. Section 3.3 provides the

details of our dataset of sectoral U.S. employment quarterly time series. We present

our results of an extensive empirical analysis of U.S. sectoral employment in Section 3.4.

The estimates of coefficients and of unobserved components (trend, seasonal and cycle)

are presented together with test statistics for hypotheses where coefficients are not

periodic under the null. Section 3.5 provides a detailed comparison of the results and

conclusions from a similar empirical study by Krane and Wascher (1999). Section 3.6

concludes.


3.2 Multivariate periodic unobserved components

models

We define yt as the N×1 vector of quarterly time series observations which are modelled

by a multivariate unobserved components (MUC) time series model consisting of N × 1

vectors µt for trend, γt for seasonal, Ψt for cycle and εt for irregular. The resulting

MUC model can simply be represented by

yt = µt + γt + Ψt + εt, (3.1)

and we have n quarterly observations for yt, that is t = 1, . . . , n. The trend component

is specified as in a multivariate local linear trend model, that is

µt+1 = µt + βt + ηt, ηt ∼ NID(0,Ση,t), (3.2)

βt+1 = βt + ζt, ζt ∼ NID(0,Σζ,t), (3.3)

where βt represents the N × 1 vector of growth or slope components for the trend

vector µt. The normally distributed disturbances ηt and ζt drive the trend and slope

components. These disturbances are serially and mutually uncorrelated at all times,

have zero means and have time-varying variance matrices Ση,t and Σζ,t which are fixed

functions of time index t. In our study we have adopted this trend specification but

alternative formulations for the trend can also be considered. We should mention that

different combinations of values for the variances Ση,t and Σζ,t will also lead to different

dynamic properties, see the discussion in Harvey (1989, Chapter 2). For example, the

conditions Ση,t = Σζ,t = 0 imply the trend µt = µ1 + β1t, a vector of fixed linear trends

with constant µ1 and gradient β1, while the single condition Ση,t = 0 leads to a trend

component µt that is smoothly evolving through time.

The stochastic seasonal component vector γt can be specified in different ways. For

this study, we opt for the elementary time-varying dummy seasonal specification as

given by

S(L)γt+1 = ωt, ωt ∼ NID(0,Σω,t), (3.4)

where S(L) = 1 + L + L² + L³ is the seasonal sum operator for quarterly time series.

The normally distributed disturbance ωt drives the changes in the seasonal effect over

time and is serially and mutually uncorrelated (with all other disturbances and for all

time periods). In the limiting case Σω,t = 0, the quarterly effects are fixed over time

and are specified as a set of unknown fixed dummy coefficients that sum up to zero.

The cycle component is common to all time series in yt and therefore we specify

Ψt = Θtψt, (3.5)


where Θt is an N × 1 vector of fixed and unknown coefficients that are deterministic

functions of time index t. The stationary autoregressive (AR) process ψt is modelled as

(ψt+1, ψ∗t+1)′ = ρt ( cos λ     sin λ
                     −sin λ     cos λ ) (ψt, ψ∗t)′ + (κt, κ∗t)′,    (κt, κ∗t)′ ∼ NID(0, σ²κ,t I2),        (3.6)

with AR coefficient 0 < ρt < 1 and cycle frequency λ. The normally distributed

disturbances κt and κ∗t drive the autoregressive cyclical process. These disturbances

have the variance σ2κ,t as deterministic function of time index t. The parameters Θt and

σ2κ,t are not jointly identified because they both affect the scaling of the elements in Ψt.

The identification problem can be solved in different ways; we choose to restrict the

first element in Θt to be equal to unity. We therefore have

Θt = (1 , θ2,t , . . . , θN,t)′, (3.7)

where the element θi,t is treated as unknown for i = 2, . . . , N. The coefficients θ2,t, . . . , θN,t,
ρt and σ²κ,t are time-varying as fixed functions of time index t.
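To fix ideas, the stochastic cycle (3.6) is easy to simulate; the sketch below (ours, with arbitrary values, and with ρ and σκ held constant rather than periodic) generates a cycle with a period of 20 quarters:

```python
import numpy as np

# Minimal sketch: simulate the stochastic cycle (3.6) with constant rho and
# sigma_kappa; in the MPUC model these coefficients vary with the season.
rho, lam, sigma_kappa, n = 0.95, 2 * np.pi / 20, 0.1, 200  # period = 2*pi/lam = 20 quarters
R = rho * np.array([[np.cos(lam), np.sin(lam)],
                    [-np.sin(lam), np.cos(lam)]])           # damped rotation matrix
rng = np.random.default_rng(1)
psi = np.zeros(2)                                           # (psi_t, psi*_t)'
path = np.empty(n)
for t in range(n):
    path[t] = psi[0]                  # psi_t enters the model through Psi_t = Theta_t psi_t
    psi = R @ psi + sigma_kappa * rng.standard_normal(2)    # disturbances (kappa_t, kappa*_t)'
```

With 0 < ρ < 1 the process is stationary; ρ controls the persistence of the cycle and λ its central frequency.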

Finally, the irregular component is given by

εt ∼ NID(0,Σε,t), (3.8)

where the variance matrix Σε,t can also be time-varying as fixed function of time. All

disturbances ηt, ζt, ωt, κt, κ∗t and εt are serially and mutually uncorrelated at all times

t and s, and at all lags.

We have introduced a general class of multivariate UC time series models where the

variance matrices and some key coefficients of the model are time-varying. Even when

we consider time-invariant versions of such models, the modelling framework is still of

a general nature. More detailed discussions on properties, estimation and testing are

provided in Harvey and Koopman (1997). A particular interesting feature is the notion

that the variance matrices associated with the components may be of a lower rank.

This implies that particular components can be common to all time series in yt. In

our cycle specification for Ψt, we have a limiting case where a single stationary process

(with cyclical properties) is common to all individual time series in the vector yt.

3.2.1 Periodic specification

Assume that ϑt is a particular time-varying coefficient of the MUC model introduced

before. The MPUC model is the multivariate UC model where the coefficient ϑt obtains

only different values for different seasonal periods. For quarterly time series, ϑt would

take four different values, one for each season. The four different values can be restricted

such that, for example, ϑt is different for one season while it has the same value for the


other seasons. In all these settings, we refer to this specification as a periodic model

and effectively we have

ϑt ≡ ϑ(j), j = J(s, t), j ∈ {1, . . . , s}, (3.9)

for fixed coefficients ϑ(1), . . . , ϑ(s) where s is the seasonal length (for a quarterly time

series, s = 4) and the seasonal index function J(s, t) is typically given by

J(s, t) = 1 + ((t − 1) mod s).
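In code, the seasonal index function is a one-liner (our illustration):

```python
def J(s, t):
    """Seasonal index function: maps time index t = 1, 2, ... to a season in 1..s."""
    return 1 + (t - 1) % s

# For quarterly data (s = 4), t = 1..8 cycles through the four seasons:
print([J(4, t) for t in range(1, 9)])   # [1, 2, 3, 4, 1, 2, 3, 4]
```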

In the MPUC model, all time-varying coefficients will be specified as ϑt in (3.9) and

therefore are treated as periodic, except the cycle frequency λ in (3.6).

The number of coefficients in the MPUC model is high. The maximum number of

coefficients p for our MPUC model with number of seasons s and number of series N

is given by

p = cnp + s cp + s(N − 1) + (N(N + 1)/2) s (q − 1),

where cnp and cp are the number of non-periodic and periodic coefficients in the cycle

component, respectively, that is cnp = 1 and cp = 2, and q is the number of variances

in the univariate version of the model, that is q = 5. Some examples for the number of

coefficients are given by

  s    N    q    p
  2    2    5    31
 12    2    5    181
  4    7    5    481

In the last example, we require a bit more than 17 years of quarterly observations to
identify all parameters (7 × 4 × 17 = 476 < 481). In our empirical study, we have 60 years of

observations and therefore a sufficient number of degrees of freedom remain.
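The coefficient count is mechanical and easy to reproduce; a small sketch of ours (the function name is hypothetical), with cnp = 1 and cp = 2 as in the text:

```python
def n_coeffs(s, N, q, c_np=1, c_p=2):
    """Maximum number of coefficients p in the MPUC model:
    p = c_np + s*c_p + s*(N - 1) + (N*(N + 1)/2) * s * (q - 1)."""
    return c_np + s * c_p + s * (N - 1) + (N * (N + 1) // 2) * s * (q - 1)

print(n_coeffs(2, 2, 5), n_coeffs(12, 2, 5), n_coeffs(4, 7, 5))   # 31 181 481
```

The last term dominates: each of the q − 1 unrestricted variance matrices contributes N(N + 1)/2 free elements per season, which is why the restrictions discussed below are needed in practice.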

The number of coefficients in our MPUC model is large nevertheless. To enable

a feasible empirical analysis, we will simplify the MPUC model specification consider-

ably in the main study of this project. We consider the following restrictions on the

coefficients.

Firstly, we restrict the three variance matrices associated with the trend, seasonal

and irregular components to be diagonal. In this way, the number of coefficients reduces

to

p = cnp + s (cp − 1) + N s q.

The consequence of these diagonal variance matrices is that the individual time series

in yt depend on idiosyncratic factors for trend, seasonal and irregular. If the common

cycle component is not present in the model, the individual time series can be analysed


separately by a univariate UC model. The presence of the common cycle component

in the model requires the simultaneous, multivariate analysis that we pursue in our

empirical study.

Secondly, we can take the trend component (3.2) as a smooth stochastic function of

time which can be established by restricting Ση,t = 0 for all t and Σζ,t diagonal. The

number of coefficients is reduced as a consequence since q reduces from 5 to 4. In our

periodic modelling framework, other options are also available to reduce the number

of coefficients. For example, we may allow a separate coefficient value for the fourth quarter while the other three quarters share a single common value. Such considerations are discussed in our

empirical study of Section 3.4.
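The reduced count can be verified in the same way. The helper below is our own sketch of the formula; setting q = 4 (smooth trend) with N = 7 and s = 4 reproduces the 117 coefficients of the baseline model used in Section 3.4.

```python
# Coefficient count under diagonal variance matrices, as in the text:
# each diagonal variance matrix contributes N coefficients per season.
def mpuc_restricted_coefficients(s, N, q, c_np=1, c_p=2):
    return c_np + s * (c_p - 1) + N * s * q

print(mpuc_restricted_coefficients(s=4, N=7, q=4))  # 117
```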

3.2.2 Estimation, testing and signal extraction

The unknown periodic coefficients will be estimated by the method of maximum like-

lihood. The MPUC model can be represented in state space form. The details of the

MPUC state space form are given in the Appendix of this chapter. Once the model is

in state space form, we adopt the Kalman filter for the evaluation of the log-likelihood

function of the model; see Durbin and Koopman (2001) for a complete treatment of a

state space time series analysis. A quasi-Newton method is used for the numerical opti-

mization of the log-likelihood function with respect to the coefficients; see for example

Fletcher (1987).
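The log-likelihood evaluation by the Kalman filter and its quasi-Newton maximisation can be illustrated for the simplest special case, a univariate local level model with simulated data. This is only a sketch of the prediction-error decomposition, not the Ox/SsfPack implementation used in the thesis; all names and numbers below are our own.

```python
import numpy as np
from scipy.optimize import minimize

def loglik_local_level(params, y):
    """Gaussian log-likelihood of y_t = mu_t + eps_t, mu_{t+1} = mu_t + eta_t,
    evaluated by the Kalman filter (prediction-error decomposition)."""
    var_eps, var_eta = np.exp(params)      # log-variances keep variances positive
    a, p = y[0], 1e7                       # (nearly) diffuse initialisation
    ll = 0.0
    for t in range(1, len(y)):
        p = p + var_eta                    # prediction step
        f = p + var_eps                    # variance of the one-step-ahead error
        v = y[t] - a                       # one-step-ahead prediction error
        ll -= 0.5 * (np.log(2.0 * np.pi * f) + v * v / f)
        k = p / f                          # Kalman gain
        a, p = a + k * v, p * (1.0 - k)    # update step
    return ll

rng = np.random.default_rng(42)
level = np.cumsum(rng.normal(0.0, 0.2, 300))     # simulated random-walk level
y = level + rng.normal(0.0, 0.5, 300)

# Quasi-Newton (L-BFGS) maximisation of the log-likelihood.
res = minimize(lambda th: -loglik_local_level(th, y),
               x0=np.log([0.3 ** 2, 0.1 ** 2]), method="L-BFGS-B")
sigma_eps, sigma_eta = np.sqrt(np.exp(res.x))
```

The same prediction-error decomposition applies to the MPUC model once it is written in its (much larger) state space form.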

In an empirical study, it is not likely that all coefficients in our MPUC model are pe-

riodic. We aim to find the appropriate periodic model specification via likelihood-based

hypothesis tests. We will test the null hypothesis of “no periodicity” for individual

coefficients and for groups of coefficients using tests such as the likelihood ratio, La-

grange multiplier and Wald. For the single periodic coefficient ϑt in (3.9), the null and

alternative hypotheses are given by

H0 : ϑ(1) = ϑ(2) = · · · = ϑ(s), H1 : ϑ(1) ≠ ϑ(2) ≠ · · · ≠ ϑ(s), (3.10)

where we have s−1 restrictions under the null. The likelihood-based test statistics have

an asymptotic χ2 distribution with s− 1 degrees of freedom. The likelihood functions

of the models under the null and under the alternative hypotheses are properly defined.

In case of joint hypotheses involving variance matrix coefficients, we assume that the

variance matrix involved has full rank or at least has the same rank under the null

and the alternative hypotheses. Furthermore, all variances are strictly positive. We

prefer to adopt the general-to-specific strategy in the process of finding the appropriate

periodic model specification.
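For quarterly data (s = 4), the tests of (3.10) have s − 1 = 3 restrictions. The critical values used later in this chapter can be reproduced as follows (a sketch, assuming scipy is available):

```python
from scipy.stats import chi2

s = 4                                # quarterly data
df = s - 1                           # number of restrictions under H0 in (3.10)
print(round(chi2.ppf(0.95, df), 2))  # 7.81, the 5% critical value
print(round(chi2.ppf(0.90, df), 2))  # 6.25, the 10% critical value
```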


Once a correct model specification is determined, we may want to visually inspect

estimates of the trend, seasonal and cycle components, to gain further insights into the

dynamic properties of the time series. We obtain the component estimates at each time

t recursively via signal extraction methods that are typically carried out by the Kalman

filter and the associated smoothing algorithms in a model-based state space time series

analysis. The component estimates have optimal statistical properties. For example,

under linear Gaussian conditions, the estimates are minimum mean square estimates;

see Durbin and Koopman (2001, Chapter 4) for a discussion on Kalman filtering and

smoothing methods. Further applications of signal extraction are the detrending and seasonal adjustment of the time series.
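The signal extraction step can be sketched for the same simple special case, the local level model: a forward Kalman filter pass followed by a backward fixed-interval smoothing pass. This is our own minimal illustration, not the SsfPack routines used for the empirical results.

```python
import numpy as np

def local_level_filter_smoother(y, var_eps, var_eta):
    """Kalman filter and fixed-interval (RTS) smoother for the local level
    model; returns filtered/smoothed level estimates and their variances."""
    n = len(y)
    a_pred = np.zeros(n); p_pred = np.zeros(n)   # predicted state, variance
    a_filt = np.zeros(n); p_filt = np.zeros(n)   # filtered state, variance
    a_pred[0], p_pred[0] = y[0], 1e7             # (nearly) diffuse start
    for t in range(n):
        f = p_pred[t] + var_eps
        k = p_pred[t] / f                        # Kalman gain
        a_filt[t] = a_pred[t] + k * (y[t] - a_pred[t])
        p_filt[t] = p_pred[t] * (1.0 - k)
        if t + 1 < n:                            # one-step-ahead prediction
            a_pred[t + 1] = a_filt[t]
            p_pred[t + 1] = p_filt[t] + var_eta
    a_sm = a_filt.copy(); p_sm = p_filt.copy()
    for t in range(n - 2, -1, -1):               # backward smoothing recursion
        g = p_filt[t] / p_pred[t + 1]
        a_sm[t] = a_filt[t] + g * (a_sm[t + 1] - a_pred[t + 1])
        p_sm[t] = p_filt[t] + g * g * (p_sm[t + 1] - p_pred[t + 1])
    return a_filt, p_filt, a_sm, p_sm

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(0.0, 0.2, 200)) + rng.normal(0.0, 0.5, 200)
a_filt, p_filt, a_sm, p_sm = local_level_filter_smoother(y, 0.25, 0.04)
```

The smoothed estimate uses all observations and therefore has a variance no larger than the filtered one, which reflects the minimum mean square error property referred to above.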

All necessary computations for our empirical study are carried out by SsfPack 3.0

using the programming environment Ox; see Koopman et al. (1999, 2008) and Doornik

(2009).

3.3 Data description: U.S. employment series

Our empirical analysis of Section 3.4 is based on the U.S. non-farm payroll employment

database of the Bureau of Labour Statistics (BLS). The database consists of monthly

observations on employment, hours, and earnings. The payroll employment data measure the number of employees in non-farm sectors. The survey includes about 140,000 businesses and government agencies, which cover approximately 440,000 individual work sites that are drawn from a sampling frame of roughly 9 million unemployment

insurance tax accounts. The active current employment statistics sample frame includes

approximately one-third of all non-farm payroll workers∗. From the internal BLS web-

site†, we have collected monthly U.S. non-farm payroll employment time series together

with corresponding sub-series for different sectors of the U.S. economy. Many time

series are available from January 1940 onwards. However, several sub-series start later,

and some are not available until January 1990.

The basic structure of the non-farm employment time series is given by

Total non-farm = goods production + private service-providing + government.

The goods production employment consists of the durable (e.g. wood, metals, electronics and cars) and non-durable (e.g. food, textiles, paper products and plastics) goods manufacturing sectors, construction businesses (related to building, e.g. houses, offices, bridges and roads), and natural resources exploitation businesses (e.g. logging, mining, oil and gas). The private service-providing sector has the largest number of employees. This

∗The historical statistics are available at http://www.bls.gov/ces/
†Data source: http://www.bls.gov/webapps/legacy/cesbtab1.htm


sector includes trade, transport, utility, publishing, entertainment, financial, legal, ed-

ucation, health, leisure and maintenance industries and many non-profit organisations.

The government sector consists of federal, state and local employment. With more

than 1.8 million civilian employees, the Federal Government (excluding the U.S. Postal

Service) is the largest single employer in the U.S.; it includes the Army, Navy, Air Force,

NASA, Census Bureau, Homeland Security, Social Security, Justice and Treasury. The

state and local governments also include educational workers for state universities and

public schools.

For our empirical study, we have selected seven sub-sectors of employees on non-farm

payrolls which we refer to as:

D durable goods manufacturing

ND non-durable goods manufacturing

NR natural resources

C construction businesses

TTU trade, transport and utility industries

OS other service-providing industries

G all government workers

(3.11)

These seven sectors add up to total non-farm employment. In the related study of Krane

and Wascher (1999), nine sectors are selected. First, they have separated the sector

motor vehicles from sector D. From the BLS website motor vehicle employment is not

available for the months before January 1990 and therefore we exclude this particular

sector. Second, we have pooled all government related employment together into the

one sector G. The original time series are provided by BLS in a monthly frequency. We

follow Krane and Wascher (1999) and have cumulated the monthly series into quarterly

series by taking the average of the three months in each quarter.
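The monthly-to-quarterly aggregation is a plain three-month average. A minimal sketch (the helper name is our own):

```python
import numpy as np

def monthly_to_quarterly(monthly):
    """Average each consecutive block of three months into one quarter."""
    monthly = np.asarray(monthly, dtype=float)
    return monthly.reshape(-1, 3).mean(axis=1)

print(monthly_to_quarterly([1, 2, 3, 4, 5, 6]))  # [2. 5.]
```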

Figure 3.1 presents our resulting seven time series after the employment totals are

transformed into logarithms. All series start in 1950, quarter 1, and finish in 2009,

quarter 4, which includes the first two years of the most recent financial crisis. When we

take a look at the data, the historical employment series are clearly upward trending.

The employment sectors D and ND show declines after the 1990s. The NR sector

appears to have a different trend behaviour. Cyclical behaviour in the time series is

most pronounced for the D and C sectors. The study of Krane and Wascher (1999) was

based on nine quarterly employment series that span from 1953 until 1989. It will be interesting to see how the results and conclusions change in an analysis that considers the more recent sample.



Figure 3.1: Quarterly U.S. non-farm employment sub-series in level and in logarithms from

1950:Q1 to 2009:Q4. Source: U.S. Bureau of Labour Statistics. Seven series of log employ-

ment: D = durable goods manufacturing; ND = non-durable goods manufacturing; NR =

natural resources industries; C = construction businesses; TTU = trade/transport/utility

industries; OS = other service providing industries; G = aggregated government sector.


3.4 Empirical results for U.S. employment sectors

The seven quarterly employment series introduced in Section 3.3 are analysed using

the MPUC time series model of Section 3.2. In particular, we consider model (3.1) for

which

• the trend µt in (3.2) is modelled as a smooth process with Ση,t = 0 and individual

trends are idiosyncratic, that is, Σζ,t is diagonal;

• the seasonal component in (3.4) and the irregular in (3.8) are also idiosyncratic

and therefore their corresponding variance matrices Σω,t and Σε,t are diagonal;

• the common cycle component is given by (3.6) and its loading factors are given

by (3.7);

• each time-varying periodic coefficient is specified as in (3.9).

Since we have N = 7 and s = 4, it follows that for this periodic model specification we

have 117 coefficients to estimate from a total of 1,680 quarterly observations (60 years

of data).

In addition to this model specification, we also take account of the Great Modera-

tion in the business cycle dynamics which is well documented in the empirical macroe-

conomic literature; see, for example, McConnell and Perez-Quiros (2000), Stock and

Watson (2002) and Sensier and van Dijk (2004). To account for the Great Moderation

of the business cycle, we let the cycle variance σ2κ,t in (3.6) have one value before 1983 and another value from 1983:Q1 onwards. We refer to this model with a break in

the cycle variance as the “single switch in cycle variance”. The financial crisis may have

stopped the Great Moderation period and therefore we also consider models where the

cycle variance from 2008 onwards returns to its pre-1983 value. We refer to

this model as the “two switches in cycle variance” and we notice that it does not require

additional coefficients. The introduction of new values for the periodic cycle variance

after 2007 may be desirable but the number of observations after 2007 is limited. The

estimation of new cycle variances for a period as short as 8 quarters may not lead to

reliable results. The cycle variance with two switches is therefore specified as

σ2κ,t = σ2κ,I,s for t in the years 1950 up to 1982 and 2008 up to 2009;
σ2κ,t = σ2κ,II,s for t in the years 1983 up to 2007,

for s = 1, . . . , 4. It follows that 1983:Q1 and 2008:Q1 are the break points.
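The regime assignment can be written as a tiny helper (the function name is hypothetical; it encodes the break points 1983:Q1 and 2008:Q1 stated above, so whole calendar years suffice):

```python
def cycle_variance_regime(year):
    """Regime II during the Great Moderation (1983-2007), regime I otherwise."""
    return "II" if 1983 <= year <= 2007 else "I"

print([cycle_variance_regime(y) for y in (1982, 1983, 2007, 2008)])
# ['I', 'II', 'II', 'I']
```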

Table 3.1 presents the maximum likelihood estimates of the cycle variance in the

MPUC model with periodic coefficients for all components. The results are presented

for the model without cycle variance moderation and with moderation (single or two


switches). The number of coefficients is reported together with the log-likelihood value at the maximum likelihood parameter estimates. The associated Akaike and Schwarz Bayesian information criteria are also presented in Table 3.1. All results indicate

that we prefer to adopt the model that accounts for the Great Moderation between

1983 and 2007. In all models considered below we have included the cycle variance

moderation with two switches.

Table 3.1: Estimation results for full periodic models, with and without the Great Moderation

                                 No switch   Single switch   Two switches
cycle variance   σ2κ,I,1            0.0112          0.0121         0.0116
                 σ2κ,I,2            0.0079          0.0101         0.0098
                 σ2κ,I,3            0.0128          0.0165         0.0173
                 σ2κ,I,4            0.0146          0.0158         0.0167
moderation       σ2κ,II,1                –          0.0052         0.0028
                 σ2κ,II,2                –          0.0033         0.0027
                 σ2κ,II,3                –          0.0040         0.0038
                 σ2κ,II,4                –          0.0096         0.0027
Estimated log-likelihood value     5678.69         5702.84        5722.22
Akaike information (AIC)         -11123.38       -11163.68      -11202.44
Akaike corrected (AICc)          -11105.70       -11144.73      -11183.49
Schwarz Bayesian (BIC)           -10488.47       -10507.07      -10545.83
Number of coefficients                 117             121            121

Notes: These results are for our MPUC time series model. No switch = periodic cycle variances σ2κ,t are the same for the full sample; Single switch = periodic cycle variances change after 1982; Two switches = periodic cycle variances are different after 1982 and before 2008. The number of observations is 1680 (60 years). The value in bold signals the preferred value.
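The reported criteria follow the usual definitions AIC = 2k − 2ℓ, AICc = AIC + 2k(k + 1)/(n − k − 1) and BIC = k log n − 2ℓ. The sketch below (our own helper) reproduces the two-switches column of Table 3.1 from its log-likelihood, k = 121 and n = 1680:

```python
import math

def info_criteria(loglik, k, n):
    """Akaike, corrected Akaike and Schwarz Bayesian information criteria."""
    aic = 2 * k - 2 * loglik
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)
    bic = k * math.log(n) - 2 * loglik
    return aic, aicc, bic

aic, aicc, bic = info_criteria(5722.22, 121, 1680)
print(round(aic, 2), round(aicc, 2), round(bic, 2))
# -11202.44 -11183.49 -10545.83
```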

3.4.1 Model specification within the class of MPUC models

Next we investigate whether all coefficients require a periodic specification in our MPUC

model. In most cases, we do not consider individual coefficients but we concentrate on

groups of coefficients. For example, we investigate whether the 7× 7 diagonal variance

matrix Σω,t for the seasonal component is periodic (28 coefficients) or not (7 coefficients).

We formulate a joint null hypothesis for a group of coefficients, similar to (3.10) for an

individual coefficient. The likelihood-ratio statistic is used to test the hypothesis. In

addition, we compute Akaike and Bayesian information criteria to decide the final model


selection. We pursue a general-to-specific approach to finding the appropriate model

specification. The number of combinations of periodic and non-periodic coefficients is

large. We therefore have limited ourselves to the models that are reported in Table

3.2. In our considered models, the common cycle is associated with the employment

time series for durable goods manufacturing (D) since the first loading in Θt in (3.6) is

restricted to one and the order of the series in the 7× 1 vector yt is given in (3.11); the

first series in yt is D.

The estimation results for our range of models are presented in Table 3.2. We have

been able to estimate the full periodic Model 1 with 121 coefficients, the resulting

loglikelihood value is reported. Since the estimated irregular variances in the diagonal

matrix Σε,t are the smallest estimated variances in Model 1 and they are not very

different for the four quarters, we have imposed the necessary restrictions for the non-periodic hypothesis H0 : Σε(1) = · · · = Σε(4) and have labelled it as Model 2. The

number of coefficients in Model 2 reduces from 121 to 100. The maximum likelihood

procedure is repeated successfully and we obtained a slightly lower loglikelihood value.

The p-value of the corresponding χ2 likelihood ratio test with 21 degrees of freedom

is 0.123, which indicates that the drop in loglikelihood is not significant and that the

null hypothesis of equal variances in the irregular component cannot be rejected. The

information criteria confirm our preference for Model 2 over Model 1.
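The Model 1 versus Model 2 comparison can be reproduced directly from the quoted log-likelihoods and coefficient counts (a sketch, assuming scipy is available):

```python
from scipy.stats import chi2

ll_model1, ll_model2 = 5722.22, 5707.90   # log-likelihoods quoted in the text
df = 121 - 100                            # 21 restrictions on the irregular variances
lr_stat = 2.0 * (ll_model1 - ll_model2)   # 28.64
p_value = chi2.sf(lr_stat, df)
print(round(p_value, 3))                  # about 0.123, as reported in the text
```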

In Table 3.3 we present the estimated coefficients related to the common cycle

component in Model 2 together with the estimated standard errors in parentheses. We

also present the likelihood ratio tests for the individual coefficient hypothesis (3.10)

and their associated p-values. The persistence of the cycle component is measured by

the discounting factor ρ and we find that its estimate is high for all quarters. The

corresponding p-value suggests that the different ρ values are not significantly different

and therefore not periodic. The estimated cycle loadings for the different sectors all have

positive values except those for the Government sector (G). The cycle has the highest

impact on D but is also strongly present in the sectors NR, C and ND. The Government

employment appears to be anti-cyclical (negative loadings). However, the loadings are

not significantly different from zero except for quarter 4. Furthermore, the likelihood

ratio test indicates that the estimated loadings for G are not significantly different for

different quarters. From these results we may conclude that during a recession period

the number of available jobs in the private sectors decreases while employment increases

somewhat in Government, and vice versa. We can also conclude that no evidence is

found for periodic coefficients related to the business cycle dynamics of the time series.

We have learned from Table 3.3 that all cycle coefficients are individually not peri-

odic. To investigate whether the cycle coefficients are jointly periodic, we have included

the estimation results for Models 3, 4, 5 and 6 in Table 3.2. In comparison with Model 2,


Table 3.2: Estimation results for MPUC model with two switches: general to specific

                      M1        M2        M3        M4        M5        M6        M7a       M7b       M8
Trend σ2ζ,t           P         P         P         P         P         P         P         NP        NP
Seasonal σ2ω,t        P         P         P         P         P         P         NP        P         NP
Irregular σ2ε,t       P         NP        NP        NP        NP        NP        NP        NP        NP
Cycle σ2κ,I           P         P         P         P         NP        NP        NP        NP        NP
Moderation σ2κ,II     P         P         P         P         P         NP        NP        NP        NP
Loadings Θt           P         P         NP        NP        NP        NP        NP        NP        NP
Discounting ρt        P         P         P         NP        NP        NP        NP        NP        NP
Frequency λ           NP        NP        NP        NP        NP        NP        NP        NP        NP
Number of coeffs      121       100       82        79        76        73        52        52        31
log-likelihood      5722.22   5707.90   5696.01   5694.01   5691.18   5691.13   5662.61   5666.19   5636.58
Akaike criterion  -11202.44 -11215.80 -11228.02 -11230.02 -11230.36 -11236.26 -11221.22 -11228.38 -11211.16
Akaike corrected  -11183.49 -11203.01 -11219.50 -11222.12 -11223.06 -11229.53 -11217.83 -11224.99 -11209.96
Schwarz criterion -10545.83 -10673.15 -10783.04 -10801.32 -10817.94 -10840.12 -10939.04 -10946.20 -11042.94

Notes: This table presents estimation results for a selection of MPUC models for which a selection of coefficients is periodic: P = periodic coefficients; NP = non-periodic coefficients. Mj denotes Model j. The estimation sample is from 1950 to 2009 and contains 1680 quarterly observations.


Table 3.3: Estimation results for the cycle coefficients of Model 2 together with their

standard errors in parentheses.

                 Q1        Q2        Q3        Q4      χ2 test   p-value
ρ              0.9431    0.9517    0.9387    0.9976      4.34     0.227
             (0.0314)  (0.0333)  (0.0384)  (0.0375)
θD                  1         1         1         1         –         –
θND            0.4120    0.4057    0.3864    0.3787      5.88     0.118
             (0.0294)  (0.0285)  (0.0294)  (0.0277)
θNR            0.5495    0.5866    0.6381    0.5520      4.24     0.237
             (0.0976)  (0.1080)  (0.0942)  (0.1007)
θC             0.5290    0.5787    0.5504    0.5003      4.26     0.235
             (0.0740)  (0.0676)  (0.0672)  (0.0675)
θTTU           0.2834    0.2749    0.2671    0.2724      2.92     0.404
             (0.0173)  (0.0167)  (0.0166)  (0.0183)
θOS            0.2308    0.2248    0.2119    0.2197      6.14     0.105
             (0.0139)  (0.0131)  (0.0141)  (0.0133)
θG            -0.0432   -0.0261   -0.0265   -0.0576      5.94     0.115
             (0.0210)  (0.0223)  (0.0200)  (0.0227)
σκ,I           0.0114    0.0101    0.0164    0.0167      5.50     0.139
             (0.0019)  (0.0018)  (0.0028)  (0.0026)
σκ,II          0.0028    0.0031    0.0036    0.0024      0.62     0.892
             (0.0011)  (0.0012)  (0.0012)  (0.0014)
Period in years  4.9087
              (0.2442)

Notes: The null hypotheses for the reported χ2 tests are given by H0 : ρ(1) = · · · = ρ(4) and H0 : Θj(1) = · · · = Θj(4) for j = 2, . . . , 7. Each test statistic is asymptotically χ2 distributed with 3 degrees of freedom.


Model 3 has no periodic loading coefficients for the cycle; in comparison with Model 3,

Model 4 has no periodic discounting factor ρ; in comparison with Model 4, Model 5

has no periodic cycle variance; in comparison with Model 5, Model 6 has no periodic

cycle moderation. In effect, Model 6 has only periodic coefficients for the trend and

seasonal variances with a total of 73 coefficients. When we track the loglikelihood

values of the subsequent models, and implicitly the likelihood ratio test statistics, we

find no evidence of periodic coefficients that are related to the cycle component. After

maximum likelihood estimation of parameters, the loglikelihood value is 5691.13 which

is not significantly different at the 5% level from log-likelihood values for the models

2, 3, 4 and 5. Also the information criteria AIC, AICc and BIC all point to the more

parsimonious Model 6.

Finally, we have a closer look at the periodic coefficients for the trend and seasonal

components. Model 7a is Model 6 with the seasonal variances kept constant for differ-

ent quarters while Model 7b is Model 6 with the trend variances kept constant. The

log-likelihood values for both Models 7a and 7b, after estimation, reduce significantly

according to the 5% critical value of χ2 distribution with 21 degrees of freedom. It

indicates that both the trend and seasonal components have periodic coefficients. To

confirm this finding further, we have also considered the multivariate UC model with-

out any periodic coefficient, Model 8. For this model, the loglikelihood value, after

parameter estimation, has also clearly and significantly reduced. Therefore, we prefer

to work with Model 6. This MPUC model also has the lowest AIC and AICc values as

reported in Table 3.2. However, the Schwarz Bayesian information criterion points to

Model 8. We therefore investigate the estimation results for Model 6 more closely.
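The general-to-specific comparisons discussed above can be reproduced from the log-likelihoods and coefficient counts reported in Table 3.2; note that the counts of 52 coefficients for Models 7a and 7b are implied by the reported Akaike criterion values. A sketch, assuming scipy is available:

```python
from scipy.stats import chi2

# (log-likelihood, number of coefficients), as reported in Table 3.2
models = {"2": (5707.90, 100), "6": (5691.13, 73), "7a": (5662.61, 52),
          "7b": (5666.19, 52), "8": (5636.58, 31)}

def lr_rejects(unrestricted, restricted, alpha=0.05):
    """Likelihood ratio test of the restricted against the unrestricted model."""
    (ll_u, k_u), (ll_r, k_r) = models[unrestricted], models[restricted]
    stat = 2.0 * (ll_u - ll_r)
    return stat > chi2.ppf(1.0 - alpha, k_u - k_r)

print(lr_rejects("2", "6"))    # False: cycle coefficients jointly non-periodic
print(lr_rejects("6", "7a"))   # True: periodic seasonal variances matter
print(lr_rejects("6", "7b"))   # True: periodic trend variances matter
print(lr_rejects("6", "8"))    # True: the fully non-periodic Model 8 is rejected
```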

Table 3.4 presents the estimated periodic variances for the trend and seasonal com-

ponents of Model 6. We find that for certain sectors in the economy, trend and seasonal

variances are significantly different for different quarters. The likelihood ratio test

statistic for the hypothesis (3.10) points to significant trend periodicity for the sectors

ND, NR and G; significant seasonal periodicity is found for the ND, C and TTU sectors.

The other sectors have variances that are not significantly different for different quar-

ters. The non-durable production sector appears to be subject to periodic coefficients

most strongly. We expect it is caused by the presence of processed food and agricultural

industries in the sector ND. The reason for periodic trend variances for the Government

sector may be due to the seasonal work for the census surveys and the popular scheme

of “9-month-job” contracts that are granted by the state and local governments for the

employment of public school teachers and faculty members of state universities. For

completeness, we report in Table 3.5 the coefficient estimates for the common cycle

and the irregular component. These results confirm our earlier findings: positive cycle

loading estimates for all series except for the Government sector and small estimates for


the irregular variances. As we have presented coefficient estimates that are very similar

under different model specifications, we regard our findings as robust to possible model

misspecification.

Given the results presented in Table 3.4, we can reduce the number of coefficients

in Model 6 by enforcing periodic coefficients for a selection of sectors only. However,

we prefer to keep the main structure of our model simple. We therefore take Model 6

as our preferred model. It implies that the multiple time series of employment figures

for seven sectors in the U.S. economy are decomposed by a multivariate unobserved

components time series model with idiosyncratic trend, seasonal and irregular effects, a

single common cycle with its variance adjusted for the Great Moderation, and periodic

variances for the disturbances driving the trend and seasonal components.

3.4.2 UC decomposition in the final model

The main motivation of this study is to investigate the dynamic properties of U.S. em-

ployment and the common cyclical behaviour in the time series. We have concluded

that Model 6 is the most appropriate specification within the general class of MPUC

models introduced in Section 3.2. To gain further insights into the dynamic properties

of the series, we present graphs of the estimated components. These estimates and their

confidence intervals can be constructed from Kalman filter and smoother methods.

Trend extraction

Figure 3.2 presents the estimated trend level (µi,t) from Model 6. Since the trend

component is specified without the level disturbance, the slope represents the growth

rate of the trend function, that is βi,t = ∆µi,t. This specification also leads to smooth trend estimates, as is apparent from Figure 3.2. None of the estimated trends in Figure 3.2 are exactly the

same or appear to be common. The estimated trends of TTU, OS and G appear to

have similar upward sloping trends although differences are visible. The trends in the

manufacturing sectors (D and ND) have been moving downwards since the late 1990s

after a more stable development since the 1970s. The ND trend has been affected by the

low employment numbers during the recession period of the mid-1970s. The pattern of

the NR trend is somewhat atypical compared to the trends for the other sectors. Since

employment in the energy sector is highly dependent on the oil market and the oil price,

its trend seems more dictated by specific energy market conditions rather than general

economic conditions. The NR trend appears to follow smoothly the main movements

of oil prices. The estimated trend in the construction sector appears to pick up various

business cycle features. It may indicate that the business cycle component in C is not

similar to the imposed common business cycle or that it is not coincident. The cycle


Table 3.4: Estimated standard deviations for the trend and seasonal variances in the

MPUC Model 6

Trend       Q1        Q2        Q3        Q4      χ2 test
σζ,D      0.0028    0.0059    0.0028    0.0061      4.38
        (0.0019)  (0.0010)  (0.0018)  (0.0011)
σζ,ND     0.0039    0.0035    0.0030    0.0054     12.06
        (0.0006)  (0.0006)  (0.0007)  (0.0007)
σζ,NR     0.0166    0.0000    0.0145    0.0086      8.22
        (0.0027)  (0.0156)  (0.0023)  (0.0037)
σζ,C      0.0091    0.0079    0.0046    0.0102      4.58
        (0.0017)  (0.0014)  (0.0021)  (0.0017)
σζ,TTU    0.0018    0.0000    0.0018    0.0012      5.86
        (0.0003)  (0.0009)  (0.0003)  (0.0006)
σζ,OS     0.0010    0.0014    0.0007    0.0015      0.04
        (0.0004)  (0.0003)  (0.0005)  (0.0003)
σζ,G      0.0037    0.0000    0.0031    0.0019     11.08
        (0.0005)  (0.0015)  (0.0005)  (0.0008)

Seasonal    Q1        Q2        Q3        Q4      χ2 test
σω,D      0.0000    0.0003    0.0000    0.0000      1.32
        (0.0003)  (0.0002)  (0.0003)  (0.0002)
σω,ND     0.0009    0.0000    0.0006    0.0004     12.64
        (0.0002)  (0.0002)  (0.0002)  (0.0002)
σω,NR     0.0020    0.0012    0.0009    0.0060      5.04
        (0.0009)  (0.0009)  (0.0014)  (0.0023)
σω,C      0.0000    0.0047    0.0048    0.0039     13.52
        (0.0016)  (0.0006)  (0.0006)  (0.0008)
σω,TTU    0.0000    0.0010    0.0008    0.0008     13.24
        (0.0004)  (0.0002)  (0.0002)  (0.0001)
σω,OS     0.0003    0.0007    0.0005    0.0003      5.38
        (0.0001)  (0.0001)  (0.0001)  (0.0001)
σω,G      0.0013    0.0013    0.0008    0.0003      6.67
        (0.0003)  (0.0003)  (0.0002)  (0.0008)

Notes: The estimated standard deviations are reported with their asymptotic standard errors in paren-

theses below. The critical value for the χ2 tests with 3 degrees of freedom is 7.81 at the 5% significance

level and 6.25 at the 10% significance level. Significant test statistics at 5% are given in bold.


Table 3.5: Estimates of non-periodic components from Model 6: cycle and irregular

            Cycle                        Irregular
          est.par     s.e.             est.par     s.e.
ρ          0.9526   0.0139   σε,D       0.0025   0.0004
θND        0.3988   0.0286   σε,ND      0.0005   0.0007
θNR        0.6063   0.0944   σε,NR      0.0079   0.0010
θC         0.5449   0.0677   σε,C       0.0000   0.0022
θTTU       0.2773   0.0161   σε,TTU     0.0007   0.0002
θOS        0.2269   0.0129   σε,OS      0.0005   0.0002
θG        -0.0336   0.0202   σε,G       0.0014   0.0004
σκ,I       0.0135   0.0010
σκ,II      0.0029   0.0005
Period in years  4.8115   0.2234

in construction may be leading other sectors in the economy as it is partly associated

with investments (a leading indicator). From the data and estimated

trend plots we further learn that the C series is the most seasonal.

For private services series (TTU and OS), the estimated trend is slowly increasing

while a cycle component is not clearly present. Service sectors such as schools and

hospitals appear to be affected by recessions only weakly. Similarly, the employment in

government (G) has shown a steadily increasing trend without much cyclical variation

in the time series.

Seasonal effects

Figure 3.3 presents the estimated seasonal components for each time series. The sea-

sonal patterns and their impact are different for each series. It is interesting to view

the changes in the seasonal patterns over the years. The seasonal effects for the em-

ployment series ND and C diminish in the more recent years while for the NR, TTU,

OS, and G series the seasonal effects become somewhat stronger. A structural break in

the seasonality of the TTU and OS series appears at the beginning of the 1990s. The

estimated seasonality in the D series is stable in general; the pattern changes mostly

for quarter 3, the summer holiday months. The seasonal component has its highest

impact for employment in the construction sector C, the effect explains overall 10% of

the variation in the time series; the second largest impact is for the government sector

G with 2.5%.



Figure 3.2: Data and trend estimates computed by the Kalman filter and smoother based on

Model 6. The decompositions are given for each employment series: (i) D = durable goods

manufacturing; (ii) ND = non-durable goods manufacturing; (iii) NR = natural resources

industries; (iv) C = construction businesses; (v) TTU = trade/transport/utility industries;

(vi) OS = other service providing industries; (vii) G = aggregated government sector.



Figure 3.3: Seasonal component estimates, computed by the Kalman filter and smoother

based on Model 6, for each employment series: (i) D = durable goods manufacturing; (ii)

ND = non-durable goods manufacturing; (iii) NR = natural resources industries; (iv) C =

construction businesses; (v) TTU = trade/transport/utility industries; (vi) OS = other service

providing industries; (vii) G = aggregated government sector.


Common cycle estimate

The estimated common cycle for the seven series analysed by the MPUC Model 6

is presented in Figure 3.4(i). The estimated cycle clearly identifies the well-known

recessions in the U.S. economy. The effect of the Great Moderation is also clearly

visible while the depth during the recent financial recession in 2008 is pronounced. The

depth of the employment cycle during the financial crisis reaches the lowest levels (in

logs) of the recessions in 1954, 1958 and 1983. The estimated values of changes in

the cycle can be interpreted as the percentage cyclical change in employment. We can

therefore learn that at the depth of the 2008 recession, the impact of the business cycle

on employment (non-structural) was a reduction of almost 9%. The estimated length

of the cycle is 4.8 years with a standard deviation of roughly three months. This length is typical for economic cyclical changes.
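In the trigonometric cycle specification, the cycle period and the frequency λ in the rotation matrix are linked by λ = 2π/period. A small check of this conversion for the estimated 4.8-year period, assuming quarterly data:

```python
import math

# A stochastic cycle with frequency lam completes one full revolution
# every 2*pi / lam observations; with quarterly data, the estimated
# 4.8-year period implies the frequency below.
period_quarters = 4.8 * 4
lam = 2 * math.pi / period_quarters

print(round(lam, 4))                    # 0.3272 radians per quarter
print(round(2 * math.pi / lam / 4, 1))  # back to a 4.8-year period
```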

We compare the estimated common cycle with an estimate of the cycle from a uni-

variate UC model with the same components – trend, season, cycle and irregular –

and with a cycle variance adjustment for the Great Moderation, see the discussion in

Section 3.2. The univariate model is adopted to extract the cycle estimate for total

U.S. employment. The estimated univariate cycle is presented in Figure 3.4(ii). The

pattern of this cycle is very similar to the estimated common cycle from the MPUC

Model 6. However, the amplitude of the estimated univariate cycle is different; it is

half the amplitude of the estimated common cycle. For example, the depth of the 2008

recession due to the univariate cycle is estimated as almost 2%. To emphasize the

differences, we present both estimated cycles in Figure 3.4(iii). These differences illus-

trate that the impact on employment of the business cycle is clearly different amongst

different sectors in the economy.

3.4.3 Residual diagnostics

The one-step ahead prediction residuals from the MPUC Model 6 after estimation

should be serially uncorrelated when the model is correctly specified. We measure the

serial correlation in the seven (standardised) residual series by the sample autocorrela-

tion function which is presented in Figure 3.5 for each residual series. We observe weak

residual cyclicality in all seven series, but its presence is not significant according to the asymptotic error bands based on 95% confidence levels. However, the weak but persistent appearance of cyclicality in the residuals may indicate that idiosyncratic cyclical dynamics are present in the time series. For example, in the employment

for durable goods manufacturing (D) we may extend the analysis with an idiosyncratic

cycle component.

The first-order serial correlation appears significant in the residuals for D while the



Figure 3.4: The estimated common cycle in comparison with the univariate cycle estimate

for total U.S. employment: (i) The estimated common cycle from the MPUC Model 6; (ii) the

estimated cycle from a univariate UC model for total U.S. employment; (iii) both estimated

cycles in one plot.


second-order serial correlation appears significant for the residual series in ND and G.

We therefore may need to investigate the dynamic properties for these series in more

detail. However, for the purpose of this study these results are satisfactory. We would like to emphasize that our current model has a basic time series decomposition structure.

Residual diagnostic statistics for tests against normality and heteroskedasticity are

also computed and investigated. These statistics are computed for all seven residual

series. Overall these statistics were satisfactory although the normality assumption for

the ND, C and TTU sectors can be rejected based on their skewness and kurtosis. Such

rejections are most likely caused by the existence of outlying observations in the original

time series.
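The normality check based on skewness and kurtosis corresponds to a Jarque-Bera type statistic. A minimal sketch, with a simulated series standing in for one of the standardised residual series:

```python
import numpy as np

def jarque_bera(x):
    """Jarque-Bera statistic n/6 * (S^2 + (K - 3)^2 / 4), where S is the
    sample skewness and K the sample kurtosis; chi2(2) under normality."""
    x = np.asarray(x, dtype=float)
    n = x.size
    z = x - x.mean()
    s2 = np.mean(z**2)
    skew = np.mean(z**3) / s2**1.5
    kurt = np.mean(z**4) / s2**2
    return n / 6.0 * (skew**2 + (kurt - 3.0)**2 / 4.0)

rng = np.random.default_rng(0)
resid = rng.standard_normal(240)  # stand-in for a standardised residual series

# Values above the chi2(2) 5% critical value of about 5.99 reject
# normality, as happens for the ND, C and TTU sectors in the text.
print(round(jarque_bera(resid), 2))
```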


Figure 3.5: The sample autocorrelation functions for the seven standardized residual series from MPUC Model 6 with asymptotic error bands based on 95% confidence levels (±2/√240).
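The error bands in this figure follow from the standard asymptotic approximation for sample autocorrelations of white noise, ±2/√n with n = 240. A small sketch, with a simulated series standing in for a standardised residual series:

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelations r_1, ..., r_max_lag of a series x."""
    z = np.asarray(x, dtype=float)
    z = z - z.mean()
    denom = np.sum(z**2)
    return np.array([np.sum(z[k:] * z[:-k]) / denom
                     for k in range(1, max_lag + 1)])

n = 240
band = 2.0 / np.sqrt(n)           # asymptotic 95% error band
rng = np.random.default_rng(1)
resid = rng.standard_normal(n)    # stand-in for a residual series
r = sample_acf(resid, 30)

print(round(band, 3))                  # 0.129, the band width in Figure 3.5
print(int(np.sum(np.abs(r) > band)))   # lags that would be flagged
```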

3.5 Comparison with the Krane and Wascher study

One of the motivations of our empirical study in Section 3.4 is to study the empirical

findings of Krane and Wascher (1999) in more detail and for a new and extended

dataset of sectoral U.S. employment time series. The Krane and Wascher (KW) study

is based on a multivariate UC time series model with periodic coefficients, similar to the


model used in our study. However, the KW results are obtained by adopting a different

model specification (not discussed in Section 3.2), using different time series (in log-

differences, employment growth) for a different set of sectors (nine sectors instead of

our seven sectors) and for a different sample (quarterly time series from 1953 to 1989).

Although the structure of their decomposition model is similar, the details of their study

are different. For example, in comparison with the KW study, we have considered a

larger set of periodic coefficients in the study of Section 3.4 and we have estimated the

coefficients simultaneously without relying on ad-hoc pre-analyses.

In this section we reconsider the KW analysis and compare our main findings with

those of KW. For this purpose we adopt a MPUC model specification that is close to the

model used in KW; it contains an idiosyncratic trend component without periodic coef-

ficients, idiosyncratic seasonal and irregular components without periodic coefficients,

and a common cycle component but with periodic coefficients. This model specification

is labelled as MPUC Model KW and has not been considered in Section 3.4.

In our analysis we consider seven sectors instead of the nine sectors that are used by

KW. This difference can be explained for the following reasons. First, employment in

the sector of “motor vehicle manufacturing industry” is considered by KW but is not

provided as a complete time series by the BLS. The North American Industry Classifi-

cation System (NAICS) has included the “motor vehicle” sector in the overall sector of

“total durable goods manufacturing industry”. We therefore have not considered this

sector separately in our analysis. Second, we have pooled the government employment

data into one sector instead of three separate sectors. After a separate pre-analysis of

the three government employment series (federal, state and local), we have concluded

that the dynamic features of the three series have been very close to each other. There-

fore we have taken this group of employment series together and have analysed this

sector jointly with the other six sectors.

Our main findings as reported and discussed in Section 3.4 differ from those of

KW. The most pronounced difference is that KW found significant statistical evidence

of periodic coefficients for the common cycle component. It implies an interaction

between seasonal and cyclical features in the data. In our study, we have not found

such interactions. By starting with MPUC Model 1 where all coefficients in the model

are periodic (except the cycle length), we have not found statistical evidence of periodic

coefficients that are related with the common cycle component. We have only found

that the trend and the seasonal variances are periodic for employment in five sectors of

the economy.

The possible reasons for the different findings may be explained as follows. Firstly,

our model is specified differently and therefore possible periodic effects in U.S. employ-

ment series are being processed differently during parameter estimation. The MPUC


Model 1 has allowed for periodicity in the trend and seasonal components while KW

has no periodicity in these components. The only periodic component in the KW model

is the cycle component. In our specification, the cycle component turns out to be non-

periodic since all the periodic properties in the time series appear to be sufficiently

captured by the periodic trend and seasonal components. The periodic properties in

the data can only be captured by the KW model through the cycle component. Sec-

ondly, the Great Moderation for the cycle variance was not taken into account in KW.

Empirical studies on the statistical evidence of the Great Moderation such as those in

Kim and Nelson (1999), McConnell and Perez-Quiros (2000), Stock and Watson (2002),

Sensier and van Dijk (2004), and Kim et al. (2004) have appeared in the economic lit-

erature some years after the research of KW. We have shown in our study that the

specification and timing of the cycle variance moderation phenomenon have a major

impact on the estimation of coefficients.

When we consider the MPUC Model KW, we do not find statistical evidence of

periodic coefficients for the cycle component using the dataset from Section 3.3. Before

drawing any conclusions on the different findings between KW and our study, we re-estimate the coefficients for MPUC Model KW using the data sample of the KW study, which runs from 1953:Q1 to 1989:Q4. The estimation results for

this sample appear to indicate that the periodic cycle coefficients are distinct from

each other. In Table 3.6 we present the values of the χ2 tests, with three degrees of

freedom, related to the null-hypothesis (3.10) for the individual cycle coefficients. The

test values from the KW study, as reported by Krane and Wascher (1999), and those

from our MPUC Model KW are presented. Although the values of the test statistics

from the two studies are different, similar conclusions can be taken. The cycle loadings

for the sectors TTU and G are not periodic but all other estimated cycle coefficients are

significantly periodic at the 10% significance level. We therefore cannot reject the findings of

KW. However, our earlier conclusion remains since when we consider a more elaborate

periodic model, such as our preferred MPUC Model 6, the cycle coefficients are not

estimated as being significantly periodic, even in this specific data sample.

As our empirical findings appear to be sensitive to different samples of our dataset,

we repeat our analyses for the MPUC Models 6 and KW for different samples. We

re-estimate all coefficients for each model using data samples that all start in 1950:Q1

but end in 1989:Q4 for the first sample, in 1990:Q4 for the second sample, etcetera,

up to the 21st sample that ends in 2009:Q4. We first consider our final Model 6 and

investigate whether the trend variances are periodic in all samples. For each sample,

we compute the combined test statistic for the null hypothesis that all trend variances

in Model 6 are not periodic. This χ2 test has 21 degrees of freedom since we have

seven trend variances. Figure 3.6(i) presents the values of these test statistics for each


Table 3.6: Periodic tests for the cycle coefficients

                 KW       MPUC Model KW
              χ2 test        χ2 test
    ρ          15.86           7.71
    θND        22.13          10.62
    θNR        16.35           9.36
    θC          8.67           7.66
    θTTU        2.92           0.58
    θOS         8.88           6.82
    θG          2.58           3.22

Notes: This table presents test statistics for the null hypothesis (3.10) that coefficients are not periodic.

The KW test statistics are computed by Krane and Wascher (1999) while those for MPUC Model KW

are computed using the data sample as considered by Krane and Wascher (1999), that is from 1953:Q1

to 1989:Q4. The critical value for the χ2 tests with 3 degrees of freedom is 7.81 at the 5% significance

level and 6.25 at the 10% significance level.

sample that is indicated by the year in which the sample ends. The critical value

for the χ2 tests with 21 degrees of freedom is 33.0 at the 5% significance level and

is indicated by the vertical line. The results show that the null hypothesis of a non-periodic trend component is clearly rejected for all samples except the first two.

We therefore prefer to have periodic trend variances in our MPUC model specification.

The computations are repeated for the null hypothesis where the seasonal variances are

not periodic. These test statistics are reported in Figure 3.6(ii) and the conclusion is

even stronger. In each sample, we prefer to have seasonal variances that are different

for different quarters in the MPUC Model 6.
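The χ2 critical values quoted in this section (7.81 and 6.25 for three degrees of freedom in the notes to Table 3.6) can be reproduced directly from the χ2 distribution; a minimal check, assuming SciPy is available:

```python
from scipy.stats import chi2

# Critical values for the chi-squared tests with 3 degrees of freedom,
# as quoted in the notes to Table 3.6.
print(round(chi2.ppf(0.95, df=3), 2))  # 7.81 at the 5% significance level
print(round(chi2.ppf(0.90, df=3), 2))  # 6.25 at the 10% significance level
```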

We also compute the joint periodic χ2 test for the cycle coefficients in the MPUC

Model 2 which is equivalent to MPUC Model 6 only with the cycle coefficients periodic.

Here we only test the seven coefficients in Table 3.6 on periodicity and compute the

joint χ2 test that has 21 degrees of freedom. The computations are repeated for all

subsamples and the results are presented in Figure 3.6(iii). It is clear that in all samples we find no evidence of periodic cycle coefficients in the MPUC Model 2.

We repeat these computations for the MPUC Model KW where only the cycle

coefficients are periodic. The joint χ2 tests for this case are presented in Figure 3.6(iv).

Here the test statistics are closer to the 5% critical value for rejecting the null hypothesis that the cycle coefficients are not periodic. However, the evidence is

much less convincing compared to the periodic tests for the trend and seasonal variances

in MPUC Model 6. Also, in terms of information criteria, such as those of Akaike and


Schwarz, the preference is clearly for Model 6 in which we do not have periodic cycle

coefficients.


Figure 3.6: The joint χ2 tests for different sets of coefficients, for different MPUC models and for different samples. In all panels the tests are presented as a series of bars for samples starting in 1950:Q1 and ending in quarter 4 of the year indicated on the x-axis. Panel (i) presents the test values for the joint hypothesis with a null in which the trend variances are not periodic in MPUC Model 6; panel (ii) is for a null where the seasonal variances are not periodic in MPUC Model 6; panel (iii) is for a null where the cycle coefficients are not periodic in MPUC Model 2; panel (iv) is for a null where the cycle coefficients are not periodic in MPUC Model KW.

The final part of our comparison with KW consists of an assessment of the fore-

casting performance of the different MPUC models. Our aim here is not primarily to

produce the most accurate forecasts; rather, the question we address here is whether

the main dynamics in the data are captured sufficiently well by our MPUC models in

comparison with the approximated KW model. Therefore we do not implement model

modifications that may improve forecast accuracy (such as relaxing the common cy-

cle restriction). We consider four models in this forecasting study, namely the MPUC

Model 1 (complete multivariate periodic model), 6 (our final multivariate model), 8

(multivariate non-periodic model) as described in Table 3.2 and the MPUC Model

KW. All models account for the Great Moderation with respect to the cycle variance.


To evaluate the out-of-sample forecast performances for these four models, we com-

pare their one-step-ahead forecasts over a period of ten years. The forecasting design

is as follows. We estimate the coefficients for the four models by maximum likelihood

using the data set from 1950:Q1 up to 1999:Q4, use the Kalman filter to produce one-

step-ahead forecast, and compare it with the historical realisation. Subsequently we

expand the data set with this realisation (keeping the first observation 1950:Q1 fixed),

re-estimate the model coefficients, and produce a new one-step-ahead forecast. We

repeat this procedure until 2009:Q4.
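The expanding-window design just described can be sketched as follows. The MPUC models themselves are estimated by maximum likelihood and forecast with the Kalman filter; as a stand-in, the sketch below refits a simple AR(1) by least squares at every step on a simulated series, so all numbers are illustrative.

```python
import numpy as np

def one_step_forecasts(y, first_forecast):
    """Expanding-window one-step-ahead forecasts from a stand-in AR(1).

    The study re-estimates the MPUC model at every step; here an AR(1)
    coefficient refitted by least squares on the expanding sample plays
    that role.
    """
    preds = []
    for t in range(first_forecast, len(y)):
        past = y[:t]                                    # data up to time t-1
        phi = past[1:] @ past[:-1] / (past[:-1] @ past[:-1])
        preds.append(phi * past[-1])                    # forecast of y[t]
    return np.array(preds)

rng = np.random.default_rng(7)
y = 10 + np.cumsum(0.01 * rng.standard_normal(240))  # stand-in log series
pred = one_step_forecasts(y, first_forecast=200)     # hold out 40 quarters
actual = y[200:]

rmse = np.sqrt(np.mean((actual - pred) ** 2))
mape = np.mean(np.abs((actual - pred) / actual))
print(pred.size)   # 40 one-step-ahead forecasts
print(rmse > 0.0, mape > 0.0)
```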

The forecast results are reported in Table 3.7 and they show that the MPUC Model

KW performs the best according to Root Mean Squared Error (RMSE) and the Mean

Absolute Percentage Error (MAPE) criteria. However, the differences in MAPE and

RMSE between the models are small. Using the forecast accuracy test of non-nested

models by Diebold and Mariano (1995), we generally find insignificant differences be-

tween the forecasts generated by our MPUC model and MPUC Model KW, see the

bottom part of Table 3.7. The notation Si,KW denotes the Diebold-Mariano test applied to MPUC Model i and MPUC Model KW, for i = 1, 6, 8. We have found only one case (employment in the government sector) where the test shows a significant difference.
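For reference, the Diebold and Mariano (1995) statistic for equal one-step-ahead MSE can be computed as below. In this sketch the loss differential is treated as serially uncorrelated, the usual simplification for one-step-ahead forecasts, and the error series are simulated stand-ins rather than the actual forecast errors.

```python
import numpy as np

def diebold_mariano(e_i, e_j):
    """DM statistic for equal MSE of two forecast-error series.

    Asymptotically N(0, 1); positive values indicate that model i
    forecasts less accurately than model j. Serial correlation in the
    loss differential is ignored, as for one-step-ahead forecasts.
    """
    d = np.asarray(e_i, float) ** 2 - np.asarray(e_j, float) ** 2
    return d.mean() / np.sqrt(d.var(ddof=1) / d.size)

rng = np.random.default_rng(3)
e_kw = 0.010 * rng.standard_normal(40)        # stand-in errors, model KW
e_6 = e_kw + 0.005 * rng.standard_normal(40)  # stand-in errors, model 6

s = diebold_mariano(e_6, e_kw)
print(round(s, 2))  # compare against the 5% critical values of +/- 1.96
```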

In Figure 3.7 we plot the one-step-ahead forecasts for period 2000:Q1 until 2009:Q4

for the final MPUC Model 6 and the MPUC Model KW. We notice that the first eight and a half years are well predicted, but starting in 2008:Q4 the forecasts cannot keep up with the declining employment in the labour market. This is not very surprising,

as 2008:Q4 marked the start of the most acute phase of the recent financial crisis. All

private sectors were severely affected, with employment in the services sectors (TTU

and OS) and construction (C) showing the steepest declines. The only sector that was

not affected by the crisis was Government employment. There are indications that the

declines in employment levelled off near the end of 2009 in almost all sectors except for

construction businesses.

3.6 Summary and conclusion

We have discussed a multivariate extension of a class of periodic unobserved compo-

nents time series models and have shown that the parameters driving the stochastic

components of the model can be estimated by the method of maximum likelihood. The

empirical study concerns a dataset of seven U.S. sectoral employment time series. We

have shown that the seven time series of employment can be effectively modeled by a

MPUC model with idiosyncratic components for trend, seasonal and irregular together

with a common cyclical component. The common cycle can be interpreted as the busi-

ness cycle that affects the U.S. macro-economy. A particular feature of the U.S. business


Table 3.7: Accuracy test for one-step-ahead out-of-sample forecasting between MPUC models from 2000:Q1 to 2009:Q4 for the employment sub-series.

RMSE
    Sectors   Model 1   Model 6   Model 8   Model KW
    D         0.01230   0.01214   0.01218   0.01140
    ND        0.00576   0.00584   0.00606   0.00597
    NR        0.01776   0.01799   0.01747   0.01681
    C         0.01211   0.01224   0.01217   0.01200
    TTU       0.00504   0.00485   0.00477   0.00445
    OS        0.00420   0.00414   0.00398   0.00379
    G         0.00458   0.00469   0.00475   0.00440

MAPE
    Sectors   Model 1   Model 6   Model 8   Model KW
    D         0.00093   0.00091   0.00093   0.00082
    ND        0.00048   0.00048   0.00048   0.00047
    NR        0.00188   0.00192   0.00189   0.00178
    C         0.00103   0.00102   0.00102   0.00100
    TTU       0.00038   0.00037   0.00037   0.00034
    OS        0.00027   0.00026   0.00026   0.00025
    G         0.00033   0.00034   0.00033   0.00031

Diebold-Mariano
    Sectors    S1,KW     S6,KW     S8,KW
    D          0.5890    0.5966    0.6570
    ND        -0.2944   -0.2215    0.1753
    NR         0.6353    0.7425    0.6057
    C          0.1874    0.3854    0.4048
    TTU        1.6940    1.3324    1.5148
    OS         1.1086    0.9611    0.6640
    G          0.9596    1.7752    2.3513

Notes: The data sample used for parameter estimation runs from 1950:Q1 until 1999:Q4. Models 1, 6 and 8 are our MPUC models as described in Table 3.2. Model KW is our approximation of Krane and Wascher (1999). Si,j is the Diebold-Mariano (DM) test for equal one-step-ahead MSE forecast accuracy of two non-nested models, where asymptotically each Si,j ∼ N(0, 1). The critical value at the 5% significance level for the DM test is ±1.96. A significant positive value of Si,j indicates that forecasts generated by model i are less accurate than the forecasts from model j. Si,KW is the DM test applied to model i and model KW, for i = 1, 6, 8. RMSE = Root Mean Square Error. MAPE = Mean Absolute Percentage Error. The lowest RMSE and MAPE across the four models and significant DM-test values are given in boldface.



Figure 3.7: One-step-ahead out-of-sample forecasts for 2000:Q1 up to 2009:Q4 using MPUC

Model 6 and MPUC Model KW.


cycle is the Great Moderation in which the shocks that drive the cycle have moderated

in the period from the early 1980s in comparison with the earlier period. We have

allowed for this phenomenon by having a different (smaller) cycle variance for the mod-

eration period. We have decided to end the Great Moderation period at the beginning

of the financial crisis in 2007 quarter 4.

By adopting this MPUC modeling framework, we have looked for a more sophis-

ticated model specification by adopting the general-to-specific modeling strategy. It

means that we started by having a MPUC model with all coefficients being periodic.

Such a specification contains many coefficients in a multivariate model context but we

have shown that we are able to estimate all coefficients by the method of maximum like-

lihood. A crucial ingredient of this successful implementation is the ability to compute

the exact score function analytically for each coefficient.

The general-to-specific modeling strategy has led us to conclude that an appropriate

model specification for this important dataset is the MPUC model with only periodic

variances for the disturbances that drive the seven idiosyncratic trend and seasonal

components. We have not found significant evidence of periodic coefficients for the cycle

component. This finding is in contrast with the thorough study of Krane and Wascher (1999): once we allow for periodic variances in the trend and seasonal components, the cycle coefficients do not appear to be significantly periodic. We have established

periodic effects in the cycle coefficients when the trend and seasonal variances are not

specified as periodic. The statistical evidence is however weak. A model with periodicity

only in the cycle is close to the model specification considered by Krane and Wascher

(1999). However, the model with periodic trend and seasonal components finds more

support in the data.

Although we have revisited the study of Krane and Wascher (1999) and presented

this as a main motivation of this study, we also regard our study as an illustration that a

multivariate periodic unobserved components model can be considered as a viable model

for analysing multiple seasonal time series. The advances in computer technology and

the efficient software implementation of state space time series methods have enabled

us to use extensive model-based frameworks in the practice of economic time series

analysis.


3.7 Appendix

MPUC model in state space form

In this Appendix we show how the MPUC model of Section 3.2 can be formulated as a state space model. We adopt the state space formulation of de Jong (1991), that is

    α_{t+1} = T_t α_t + H_t ε_t,    α_1 ∼ N(a, P),    t = 1, ..., n,    (3.12)

    y_t = Z_t α_t + G_t ε_t,    ε_t ∼ NID(0, I),    (3.13)

where y_t is the N × 1 observation vector, α_t is the m × 1 state vector, and the system matrices T_t, H_t, Z_t and G_t can be time-varying to allow for periodicity. The initial state vector is α_1 with mean vector a and variance matrix P. The transition matrix T_t is given by

    T_t = [ T_level     0_{2N×3N}   0_{2N×2}
            0_{3N×2N}   T_season    0_{3N×2}
            0_{2×2N}    0_{2×3N}    T_cycle ]    (3.14)

where the sub-matrices are structured as follows:

    T_level = [ I_N       I_N
                0_{N×N}   I_N ],

    T_season = [ -I_N      -I_N      -I_N
                  I_N       0_{N×N}   0_{N×N}
                  0_{N×N}   I_N       0_{N×N} ],

    T_cycle = ρ_t [  cos λ   sin λ
                    -sin λ   cos λ ].    (3.15)

The measurement matrix Z_t is given by

    Z_t = ( I_N   0_{N×N}   I_N   0_{N×N}   0_{N×N}   Θ_t )    (3.16)

where Θ_t is defined in (3.7), and the state vector is given by

    α_t = ( μ_t', β_t', γ_t', γ_{t-1}', γ_{t-2}', ψ_t, ψ_t* )'.

The initialisation of the state vector is diffuse for the non-stationary components. For the periodic cycle component we use the initialisation as explained in Koopman et al. (2009). The variance matrix of the state disturbances, H_t H_t' = Var(H_t ε_t), is given by

    H_t H_t' = diag( Σ_{η,t}, Σ_{ζ,t}, Σ_{ω,t}, 0_{N×N}, 0_{N×N}, σ²_{κ,t} I_2 ),    (3.17)

and the variance matrix of the measurement disturbances is given by G_t G_t' = Σ_{ε,t}. Note that the unknown coefficients Θ_t, ρ_t, σ²_{κ,t}, Σ_{η,t}, Σ_{ζ,t}, Σ_{ω,t} and Σ_{ε,t} are time-varying as fixed functions of the time index t.
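The block structure of the transition matrix (3.14)-(3.15) can be assembled mechanically. A minimal NumPy sketch for the quarterly case follows; the damping factor ρ and frequency λ are treated here as fixed scalars, whereas in the MPUC model ρ_t may change with the season, and the illustrative value 0.97 for ρ is an assumption rather than an estimate from the chapter.

```python
import numpy as np

def transition_matrix(N, rho, lam):
    """Transition matrix T_t of (3.14) for N series with a local linear
    trend, a quarterly dummy seasonal and one common cycle; the state
    dimension is 5N + 2."""
    I = np.eye(N)
    Z = np.zeros((N, N))

    # Local linear trend block (level and slope), first block of (3.15).
    T_level = np.block([[I, I],
                        [Z, I]])
    # Quarterly dummy seasonal block, second block of (3.15).
    T_season = np.block([[-I, -I, -I],
                         [ I,  Z,  Z],
                         [ Z,  I,  Z]])
    # Common cycle: damped rotation with frequency lam, third block of (3.15).
    T_cycle = rho * np.array([[ np.cos(lam), np.sin(lam)],
                              [-np.sin(lam), np.cos(lam)]])

    m = 5 * N + 2
    T = np.zeros((m, m))
    T[:2 * N, :2 * N] = T_level
    T[2 * N:5 * N, 2 * N:5 * N] = T_season
    T[5 * N:, 5 * N:] = T_cycle
    return T

# Seven employment series and the 4.8-year (19.2-quarter) cycle estimate.
T = transition_matrix(N=7, rho=0.97, lam=2 * np.pi / 19.2)
print(T.shape)  # (37, 37)
```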


Chapter 4

Periodic SARIMA models

This chapter is based on Hindrayanto, Koopman, and Ooms (2010).

4.1 Introduction

Seasonal time series with sample autocorrelation functions that change with the season

are referred to as periodic time series. To enable the identification of such dynamic char-

acteristics in a time series, Gladysev (1961) and Tiao and Grupe (1980) have formally

defined periodic autocorrelations using a stationary vector representation of periodic

univariate time series. Once periodic properties of a time series are detected, the time

series analyst can consider time series models that allow for these periodic correlations.

A model-based periodic time series analysis becomes effective when appropriate meth-

ods and algorithms are developed for estimation and diagnostic checking. This is the

primary aim of this chapter.

The dynamic properties of a particular seasonal time series model are governed by

parameters that are usually assumed fixed throughout a given time period. In the con-

text of autoregressive moving average (ARMA) models, the parameters associated with

the autoregressive (AR) and moving average (MA) lag polynomials are usually assumed

fixed; see, for example, Brockwell and Davis (1993). In the context of unobserved com-

ponents (UC) time series models, the parameters driving the stochastic processes for

the components are usually fixed as well; see, for example, Harvey (1989) and Kitagawa

and Gersch (1996). In case these parameters are allowed to be deterministic functions

of the season index, the model becomes part of the class of periodic linear time series

models.

Various developments on periodic time series are given in the statistics and econo-

metrics literature. Maximum likelihood estimation methods for periodic ARMA models

have been discussed by Vecchia (1985), Li and Hui (1988), Jimenez et al. (1989) and

Lund and Basawa (2000). Furthermore, many environmental and economic studies


have given empirical evidence that time series models require periodically changing

parameters; see, for example, Osborn (1988), Osborn and Smith (1989), Bloomfield

et al. (1994), Ghysels and Osborn (2001) and Franses and Paap (2004). In Chapter 2

we have developed feasible maximum likelihood procedures for a class of periodic unob-

served components (PUC) time series models with stochastic trend, cycle and seasonal

components.

In this chapter we present a convenient time-varying (univariate) state space rep-

resentation of periodic seasonal autoregressive integrated moving average (PSARIMA)

time series models to enable exact maximum likelihood estimation. The time-invariant

(multivariate) representation is only used for identification analysis. We focus on the analysis of non-stationary time series without a priori differencing. The initial conditions

for the non-stationary parts of the models are treated by an exact initial Kalman filter.

For the PSARIMA model, we adopt a modified Kalman filter to compute the initial

variance-covariance matrix of the stationary part of the time series. Once the initiali-

sations are treated appropriately, exact maximum likelihood estimation can be carried

out using the standard Kalman filter. Standard software tools are available for our

solution. The development of new software is not required.

The remainder of the chapter is organised as follows. In Section 4.2 we discuss

general PSARIMA models in detail and their state space formulations. In Section 4.3

we give details on the initialisation of PSARIMA models. In Section 4.4 we apply both

PSARIMA and PUC models (from Chapter 2) to the monthly postwar U.S. unemployment series and compare the results with their non-periodic counterparts. Section 4.5

concludes and gives ideas for further research. The appendix contains the state space

representation of a specific PSARIMA model.

4.2 Periodic SARIMA models in state space representation

Maximum likelihood estimation of annual stationary periodic ARMA models has been

discussed by Vecchia (1985) (using the conditional likelihood), Li and Hui (1988) (ex-

tending the exact likelihood of Ansley (1979)), Jimenez et al. (1989) (using a state

space method for the exact likelihood), Lund and Basawa (2000) and others. In this

section we discuss the PSARIMA model and its state space representation in order

to construct the exact likelihood function.

There are different state-space representations of PSARIMA models. This chapter

focuses on the univariate time-varying state space form for PSARIMA(p, d, q)(P,D,Q)S

models. The multivariate time-invariant state space representation following the idea

of Gladysev (1961) is not attractive from a computational point of view since the


dimension of the state-space matrices can be large for modest orders of p, q and S.

We show our state space analysis for a general periodic version of the SARIMA model.

In this model, the observations yt follow a PSARIMA(p, d, q)(P,D,Q)S process given

by

φ_{p,s}(L) Φ_{P,s}(L^S) (1 − L)^d (1 − L^S)^D y_t = θ_{q,s}(L) Θ_{Q,s}(L^S) ε_t,   ε_t ∼ NID(0, σ²_{ε,s}),   (4.1)

for t = SD + d + 1, . . . , n and s = 1, . . . , S. We define L as the lag operator such that L^i y_t = y_{t−i}. The periodic lag polynomials are defined as follows,

φ_{p,s}(L) = 1 − φ_{1,s} L − · · · − φ_{p,s} L^p,   (4.2)

Φ_{P,s}(L^S) = 1 − Φ_{1,s} L^S − · · · − Φ_{P,s} L^{SP},   (4.3)

θ_{q,s}(L) = 1 + θ_{1,s} L + · · · + θ_{q,s} L^q,   (4.4)

Θ_{Q,s}(L^S) = 1 + Θ_{1,s} L^S + · · · + Θ_{Q,s} L^{SQ},   (4.5)

where φ_{i,s}, Φ_{i,s}, θ_{i,s}, Θ_{i,s} and σ²_{ε,s} are coefficients that vary deterministically across the S different periods of the year. The seasonal length is S, with S = 4 for quarterly data and S = 12 for monthly data. In this chapter, the differencing order for the 'levels', d, takes values 0, 1, or 2, while the differencing order for the 'seasonals', D, takes values 0 or 1. The non-periodic SARIMA model is retrieved by restricting the parameters to be equal for all seasons. We denote the differenced time series

y*_t = (1 − L)^d (1 − L^S)^D y_t,   (4.6)

where y*_t is a periodic ARMA(p*, q*) process with p* = p + SP and q* = q + SQ.
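To illustrate the combined AR order p* = p + SP, the non-seasonal and seasonal AR polynomials can be multiplied as coefficient sequences. The short sketch below uses hypothetical coefficient values for a single season s and confirms the order of the product polynomial:

```python
import numpy as np

def regular_poly(coeffs):
    """Coefficients of 1 - c1*L - ... - cp*L^p, ascending powers of L."""
    return np.concatenate(([1.0], -np.asarray(coeffs, dtype=float)))

def seasonal_poly(coeffs, S):
    """Coefficients of 1 - C1*L^S - ... - CP*L^(S*P), ascending powers of L."""
    out = np.zeros(S * len(coeffs) + 1)
    out[0] = 1.0
    for i, c in enumerate(coeffs, start=1):
        out[S * i] = -c
    return out

S = 12
phi = regular_poly([0.5, -0.2])   # p = 2, hypothetical values
Phi = seasonal_poly([0.3], S)     # P = 1, hypothetical value
combined = np.convolve(phi, Phi)  # product phi_{p,s}(L) * Phi_{P,s}(L^S)
print("combined AR order:", len(combined) - 1)   # p* = p + S*P = 14
```

The same convolution applied to the MA polynomials gives order q* = q + SQ.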

Next we represent the PSARIMA model (4.1) directly in state space form for yt.

Depending on the orders of d and D, we can describe y_t in terms of y_{t−i}, (1 − L)^d y_{t−i} and y*_t. For example, if d = D = 1, we would have

y*_t = (1 − L)(1 − L^S) y_t = (1 − L − L^S + L^{S+1}) y_t,

and after some minor re-arrangements, we get

y_t = y_{t−1} + (1 − L) y_{t−S} + y*_t.   (4.7)

The stationary part associated with y*_t can be treated similarly to the periodic ARMA state-space approach of Jimenez et al. (1989). The expression for y_t as in (4.7) is incorporated directly in the state space framework. As a result, our analysis for y_t starts at t = 1 instead of t = SD + d + 1. The non-stationary variables such as y_{t−i} and (1 − L)^d y_{t−i} are placed in the state vector and treated by diffuse initialisations following Ansley and Kohn (1985), de Jong (1991), Koopman (1997), Aston and Koopman (2006) and Koopman et al. (2008).
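The re-arrangement leading to (4.7) can be checked numerically; the sketch below applies the two difference filters to an arbitrary simulated series and verifies the identity:

```python
import numpy as np

rng = np.random.default_rng(0)
S = 12
y = rng.standard_normal(100).cumsum()     # arbitrary series, purely illustrative

d1 = y[1:] - y[:-1]                       # (1 - L) y_t
ystar = d1[S:] - d1[:-S]                  # y*_t = (1 - L)(1 - L^S) y_t

# check (4.7): y_t = y_{t-1} + (1 - L) y_{t-S} + y*_t
t = np.arange(S + 1, len(y))
rhs = y[t - 1] + (y[t - S] - y[t - S - 1]) + ystar
print(np.allclose(y[t], rhs))             # True
```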


The elements of the state space form equations (2.35)-(2.36) are described as follows.

Again, depending on the orders of d and D, the state vector α_t differs for each case. We define α*_t as the state vector for d = D = 0 so that y*_t = y_t. In this case, we have

α*_t = ( y*_t,
  φ_{2,s} y*_{t−1} + · · · + φ_{p,s}Φ_{P,s} y*_{t−p*+1} + θ_{1,s} ε_t + · · · + θ_{q,s}Θ_{Q,s} ε_{t−q*+1},
  φ_{3,s} y*_{t−1} + · · · + φ_{p,s}Φ_{P,s} y*_{t−p*+2} + θ_{2,s} ε_t + · · · + θ_{q,s}Θ_{Q,s} ε_{t−q*+2},
  . . . ,
  φ_{p,s}Φ_{P,s} y*_{t−1} + θ_{q,s}Θ_{Q,s} ε_t )′,   (4.8)

with the dimension of α*_t equal to m = max(p*, q* + 1). For other combinations of d and D, we have the following state vectors:

D = 0, d = 1:  α_t = (y_{t−1}, α*_t)′,   (4.9)
D = 0, d = 2:  α_t = (y_{t−1}, (1 − L)y_{t−1}, α*_t)′,   (4.10)
D = 1, d = 0:  α_t = (y_{t−1}, . . . , y_{t−S}, α*_t)′,   (4.11)
D = 1, d = 1:  α_t = (y_{t−1}, (1 − L)y_{t−1}, . . . , (1 − L)y_{t−S}, α*_t)′,   (4.12)
D = 1, d = 2:  α_t = (y_{t−1}, (1 − L)y_{t−1}, (1 − L)²y_{t−1}, . . . , (1 − L)²y_{t−S}, α*_t)′,   (4.13)

where the term y*_t in the state vector α*_t changes according to the orders of d and D, but the structure of α*_t stays the same. The MA parameters are included in the disturbance vector, which is given by

H_t ε_t = ( 0_{1×(SD+d)}, ε_{t+1}, θ_{1,s} ε_{t+1}, . . . , θ_{m−1,s} ε_{t+1} )′.   (4.14)

The transition matrix T_t has dimension (SD + d + m) × (SD + d + m) and Z_t is a row vector of dimension 1 × (SD + d + m). For seasonal models with D = 1, the T_t and Z_t matrices can be defined via the matrix

( T_t )     [ 1  1  0_{1×(S−1)}  1  1          0  0  . . .  0 ]   d = 2
( Z_t )  =  [ 0  1  0_{1×(S−1)}  1  1          0  0  . . .  0 ]   d = 1
            [ 0  0  0_{1×(S−1)}  1  1          0  0  . . .  0 ]   d = 0
            [ 0  0  I_{S−1}      0  0          0  0  . . .  0 ]
            [ 0  0  0_{1×(S−1)}  0  φ_{1,s}    1  0  . . .  0 ]
            [ 0  0  0_{1×(S−1)}  0  φ_{2,s}    0  1  . . .  0 ]
            [ ⋮                     ⋮                ⋱         ]
            [ 0  0  0_{1×(S−1)}  0  φ_{m−1,s}  0  0  . . .  1 ]
            [ 0  0  0_{1×(S−1)}  0  φ_{m,s}    0  0  . . .  0 ]
            [ 1  1  0_{1×(S−1)}  1  1          0  0  . . .  0 ]        (4.15)

where Z_t is the last row of the matrix in (4.15) and T_t consists of the remaining rows. The r × r identity matrix is denoted by I_r and an r × c matrix of zeros is denoted by 0_{r×c}. For d = 2, T_t and Z_t are given by the full matrix (4.15), while for d = 1 and d = 0 they are given by the block matrices starting from the second row and column and the third row and column, respectively. The variance matrix of the state disturbances, H_tH′_t = Var(H_t ε_t), is given by

H_t H′_t = ( 0_{(SD+d)×(SD+d)}   0_{(SD+d)×m}
             0_{m×(SD+d)}        H*_t H*′_t ),   (4.16)

where the m × m stationary part of this variance matrix is given by H*_t H*′_t. Since the measurement equation has no error term, G_tG′_t = 0. Finally, the mean of the initial state vector is given by a = E(α_1) = 0_{(SD+d+m)×1} and the corresponding variance matrix is given by

P = ( κ I_{SD+d}     0_{(SD+d)×m}
      0_{m×(SD+d)}   P† ),   with κ → ∞,   (4.17)

where the m × m matrix P† is the unconditional variance matrix for the stationary part of the state vector. The variable κ represents the diffuse initialisation for the non-stationary part of the state; see the discussion in Koopman (1997).

For models with D = 0, the Tt and Zt matrices can be defined by the following

matrix:

( T_t )     [ 1  1  1          0  0  . . .  0 ]   d = 2
( Z_t )  =  [ 0  1  1          0  0  . . .  0 ]   d = 1
            [ 0  0  φ_{1,s}    1  0  . . .  0 ]   d = 0
            [ 0  0  φ_{2,s}    0  1  . . .  0 ]
            [ ⋮       ⋮               ⋱       ]
            [ 0  0  φ_{m−1,s}  0  0  . . .  1 ]
            [ 0  0  φ_{m,s}    0  0  . . .  0 ]
            [ 1  1  1          0  0  . . .  0 ]        (4.18)

where Z_t is the last row of the above matrix and T_t consists of the remaining rows. The variance matrix of the state disturbances, H_tH′_t, is then given by (4.16) with D = 0. Similar to the previous case, the measurement equation has no error term, G_tG′_t = 0. The mean of the initial state vector is given by a = E(α_1) = 0_{(d+m)×1} and the variance matrix of the initial state is given by (4.17) with D = 0.

4.3 Initialisation of periodic SARIMA models

Next we need to compute the m × m (unconditional) variance matrix P† for the stationary elements in the initial state vector. Note that P† is the lower sub-matrix of P defined in (4.17). The standard but inefficient method of solving for P†


is by using the time-invariant first-order vector autoregressive representation following Gladysev (1961) and Tiao and Grupe (1980). Computational details of solving for P† for non-periodic stationary ARMA models in state space form are given by, for example, Harvey (1989). To show that solving for P† via the time-invariant multivariate method is not memory-efficient, we give an example of a periodic MA(1) model for y*_t. In this case the multivariate time-invariant state space representation has a state vector of length 2S and is given by

α*_{t*} = ( y*_{1,t*}, . . . , y*_{S,t*}, θ_{1,2} ε_{1,t*}, . . . , θ_{1,S} ε_{S−1,t*}, θ_{1,1} ε_{S,t*} )′,

where y*_{i,t*} indicates the observation at season i of year t*. The initial variance matrix P† of α*_1 is the solution of

(I − T ⊗ T) vec(P†) = vec(HH′),   (4.19)

where I is the identity matrix, T is the transition matrix and HH′ is the variance-covariance matrix of the state disturbances of the time-invariant state space representation. In general, solving equation (4.19) requires excessive memory as the length of the state vector for a periodic MA(q) model is (q + 1)S. Therefore, solving for P† requires the inversion of the matrix (I − T ⊗ T), which has dimension (q + 1)²S² × (q + 1)²S². The inverse computations become computationally inefficient for high values of q or S.
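For a small time-invariant example, the vec-based solution of an equation of the form (4.19) can be checked against brute-force iteration. The transition matrix and disturbance loading below are hypothetical:

```python
import numpy as np

# Hypothetical small stationary transition and disturbance loading.
T = np.array([[0.5, 0.2],
              [0.1, 0.3]])
H = np.array([[1.0],
              [0.5]])
Q = H @ H.T

# Direct vec solution of (I - T kron T) vec(P) = vec(HH'), column-major vec.
n = T.shape[0]
A = np.eye(n * n) - np.kron(T, T)
P = np.linalg.solve(A, Q.reshape(-1, order="F")).reshape(n, n, order="F")

# Brute-force check: iterate P <- T P T' + HH' until it settles.
P_iter = np.zeros((n, n))
for _ in range(500):
    P_iter = T @ P_iter @ T.T + Q
print(np.allclose(P, P_iter))     # True
```

For this 2 × 2 state the direct solve is trivial; the memory issue arises only when the state dimension (q + 1)S grows.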

We suggest an alternative and more general method for the computation of the

necessary sub-matrix P† of P using the Kalman filter without additional analytical

work and without any additional programming effort. First we construct the time-

varying state space matrices Tt, Zt, Ht and Gt as in (4.15)-(4.16) and we take arbitrary

values for the mean vector and variance matrix of the initial state vector α1; for example,

we can take a = 0 and P = I. We define Pt as the (unconditional) variance matrix of

the state vector αt with the same partitioning as in (4.17) so that P †t is also implicitly

defined. Then we apply the Kalman filter that is used for the computation of (2.39)

but here we apply it to a long series of missing observations. The Kalman filter for this

series of missing observations effectively carries out the computations for solving the

set of S matrix equations

P_{j+1} = T_j P_j T′_j + H_j H′_j,   j = 1, 2, . . . ,   (4.20)

simultaneously with respect to the matrices P†_{k+1}, . . . , P†_{k+S} for j = k, . . . , k + S − 1, with P†_k ≡ P†_{k+S} and for any k > S. The solution is obtained recursively as part of the Kalman filter for missing values. These recursive computations continue until convergence, that is, when P†_{k+1} ≈ P†_{k−S+1} for a large value of k. We use the Frobenius

norm to determine numerical convergence. More specifically, let D∗ be the difference


matrix between two matrices and let d*_{ij} be the (i, j) entry of D*, with i, j = 1, . . . , m. Numerical convergence is then reached when

‖D*‖ = ‖P†_{k+1} − P†_{k−S+1}‖ = ( Σ_{i=1}^{m} Σ_{j=1}^{m} |d*_{ij}|² )^{1/2} < δ,   (4.21)

with δ very small, say 10⁻⁹. Since the variance matrices P_j in (4.20) are associated with

missing values, the actual index j is not relevant but the seasonal index associated with

j is obviously of key importance. Therefore, after convergence, we make sure to take

P† equal to P†_k for an index k that satisfies (4.21) and that corresponds to the period of the first observation in the data set. This solution for P† is equivalent to the solution method implied by (4.19), which is only feasible for parsimonious periodic models

without seasonal lags. The recursive method is related to solving Riccati equations, see

Anderson and Moore (1979, p.39, (1.10)-(1.12)). Varga (2008) recently surveyed and

developed extensions to numerically stable solution methods for more general discrete-

time periodic Riccati equations.
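The recursions (4.20) together with the stopping rule (4.21) can be sketched for a hypothetical periodic AR(1) with a scalar state, where the Frobenius norm reduces to an absolute difference:

```python
import numpy as np

# Hypothetical periodic AR(1): alpha_{t+1} = phi_s alpha_t + eps_t, a 1x1 state.
S = 4
phi = np.array([0.5, -0.3, 0.8, 0.2])    # hypothetical periodic AR coefficients
sig2 = np.array([1.0, 0.5, 2.0, 1.5])    # hypothetical periodic variances
delta = 1e-9                             # Frobenius tolerance as in (4.21)

hist = [1.0]                             # arbitrary start, as with P = I
k = 0
while True:
    s = k % S
    hist.append(phi[s] ** 2 * hist[k] + sig2[s])  # P_{j+1} = T_j P_j T_j' + H_j H_j'
    k += 1
    # compare with the value one full seasonal cycle earlier, as in (4.21)
    if k > S and abs(hist[k] - hist[k - S]) < delta:
        break
print(f"converged after {k} recursions")
```

After convergence, continuing the recursion over one further seasonal cycle returns (numerically) the same value, which is the defining property of the periodic solution P†.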

To adopt the method for exact maximum likelihood estimation of non-stationary

models, the non-stationary part of P has to be initialised using the diffuse variable κ.

We have implemented this approach using the exact initialisation methods of Koopman

(1997) which are part of the procedures in SsfPack 3.0, see Koopman et al. (2008).

These initialisation methods are also implemented in the time series package RATS, see

Doan (2004).

When we return to our example for the periodic MA(1) model, the 2×1 state vector

of the univariate time-varying state-space framework is given by

α*_t = ( y*_t, θ_{1,s} ε_t )′,

for any season s. The 2 × 2 initial variance matrix P† can be calculated analytically, but using the Kalman filter we need only 5 iterations to solve for P† in less than one second. More complex periodic models require a larger number of iterations, but given modern computing speed, our method remains highly feasible.
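For this periodic MA(1) example the analytic answer is Var(y*_t) = σ²_{ε,s} + θ²_{1,s} σ²_{ε,s−1}, and a few iterations of the variance recursion reproduce it exactly. The sketch below uses hypothetical θ_{1,s} and σ²_{ε,s} values, with seasons numbered from 0 for convenience:

```python
import numpy as np

S = 4
theta = np.array([0.4, -0.6, 0.9, 0.2])   # hypothetical theta_{1,s}
sig2 = np.array([1.0, 0.5, 2.0, 1.5])     # hypothetical sigma^2_{eps,s}

# State alpha_t = (y*_t, theta_{1,s(t+1)} eps_t)' with T = [[0,1],[0,0]];
# the season of period t is s(t) = t mod S here.
T = np.array([[0.0, 1.0],
              [0.0, 0.0]])
P = np.eye(2)                              # arbitrary initial variance
for t in range(3 * S):                     # a few cycles is ample
    s_next = (t + 1) % S                   # season of eps_{t+1}
    th = theta[(t + 2) % S]                # theta multiplying eps_{t+1} in the state
    Q = sig2[s_next] * np.array([[1.0, th],
                                 [th, th * th]])
    P = T @ P @ T.T + Q

s = (3 * S) % S                            # season of the last period reached
expected = sig2[s] + theta[s] ** 2 * sig2[(s - 1) % S]
print(np.isclose(P[0, 0], expected))       # True
```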

4.4 Empirical illustration: U.S. unemployment

4.4.1 Data analysis

In this section we analyse a long monthly time series of U.S. unemployment data using

both PSARIMA and PUC time series models. The data consists of seasonally unad-

justed monthly U.S. unemployment levels (in thousands of persons) for age 16 years or


over. The estimation sample, from January 1948 until December 2007, has 720 obser-

vations. This series is published by the U.S. Bureau of Labor Statistics (BLS) and is

obtained via http://www.bls.gov/webapps/legacy/cpsatab11.htm.


Figure 4.1: U.S. unemployment series (in logs, seasonally unadjusted), 1948.1−2007.12. Top

row: Monthly time series (left) and annual time series for each month of the year (right)

for s = 1, . . . , 12. Middle and bottom rows: Periodic autocorrelations of ∆∆12yt (monthly

changes in the annual growth rates of U.S. unemployment) for lags of 1 to 120 months, for

s = 3, 6, 9, 12.

Figure 4.1 presents time series plots of log monthly unemployment. The top left

panel shows the data month by month, while the top right panel presents the data in

multivariate form, year by year for each month. The monthly unemployment series is

slowly trending upwards and contains a pronounced cyclical pattern. The annual series

are smoother than the monthly series and clearly have common dynamics.

The middle and bottom rows of Figure 4.1 show a selection of the periodic sam-

ple autocorrelations of annual changes in U.S. unemployment growth rates, ∆∆12yt,

where yt is the log of U.S. unemployment series at time t. An accurate definition of

periodic autocorrelations is given by McLeod (1994). These sample autocorrelation

coefficients are clearly periodic. They differ significantly from month to month. For

example, for March (middle left panel), there is a pronounced short cyclical movement

of approximately 24 months starting with 7 positive autocorrelations, while we see a


much longer cyclical movement for June (middle right panel), starting with 14 negative

autocorrelations. There is also a significant difference between March and June for the

seasonal autocorrelation at lag 12. The periodicity in the autocorrelation structure is

our main motivation to analyse log U.S. unemployment series using periodic time series

models. Other recent studies that analysed the periodic interaction of cyclical changes

and seasonal patterns in other macro-economic time series are for example Van Dijk

et al. (2003) and Matas-Mir and Osborn (2004).
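A simplified sketch of a periodic sample autocorrelation (the precise definition is given by McLeod (1994)): for season s, correlate the observations falling in that season with observations k periods earlier. Applied to white noise, as below, the coefficients are close to zero at all lags, in contrast to the clearly periodic pattern of the unemployment series:

```python
import numpy as np

def periodic_acf(x, S, s, lags):
    """Sample autocorrelation between x_t (for t in season s, s = 1..S,
    with t = 0 in season 1) and x_{t-k}; a simplified sketch of the
    definition in McLeod (1994)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    t = np.arange(len(x))
    out = []
    for k in lags:
        keep = (t % S == s - 1) & (t >= k)
        out.append(np.sum(x[t[keep]] * x[t[keep] - k]) / (keep.sum() * x.var()))
    return np.array(out)

rng = np.random.default_rng(1)
x = rng.standard_normal(720)               # white noise: no periodic structure
r = periodic_acf(x, S=12, s=3, lags=range(1, 25))
print(r.shape)                             # (24,)
```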

4.4.2 Periodic SARIMA model for U.S. unemployment series

In order to select a suitable SARIMA model form, we initially applied a seasonal differ-

ence filter to the observations. The resulting series does not show a clearly discernible

trend, see Figure 4.2. However, a cycle could be readily perceived, which motivated

the inclusion of an AR(2) term. To take account of some remaining serial correlation,

we included low order MA terms in both the seasonal and non-seasonal polynomials.

This results in a SARIMA(2, 0, 1)(0, 1, 1)S with time-varying mean, which appears to

be satisfactory in fitting the log of monthly U.S. unemployment series.


Figure 4.2: Seasonally differenced log unemployment series (∆12yt) and its ACF.

Let the seasonally differenced time series y*_t = Δ_S y_t be a PSARIMA(2, 0, 1)(0, 0, 1)_S process, which we write as

y*_t = β_s + φ_{1,s}(y*_{t−1} − β_s) + φ_{2,s}(y*_{t−2} − β_s) + ε_t + θ_{1,s} ε_{t−1} + Θ_{1,s} ε_{t−S},   (4.22)

for t = S + 1, . . . , n and s = 1, . . . , S, with ε_t ∼ NID(0, σ²_{ε,s}), where β_s, φ_{1,s}, φ_{2,s}, θ_{1,s}, Θ_{1,s} and σ²_{ε,s} are coefficients that vary deterministically across the S different periods of the year. Note that in model (4.22) we omit the cross term θ_{1,s}Θ_{1,s} ε_{t−(S+1)} of the formal SARIMA(2, 0, 1)(0, 0, 1)_S model. Further, we rewrite model (4.22) in terms of the levels, y_t, as follows:

y_t = β_s + y_{t−S} + y†_t,   (4.23)

where y†_t = y*_t − β_s = Δ_S y_t − β_s = y_t − y_{t−S} − β_s, and we cast the PSARIMA model (4.22) directly in state space form for y_t with t = 1, . . . , n. The state space matrices are

given in the Appendix. As the PSARIMA model has only one variance component, we allow for different values of σ²_{ε,s} in two subsamples, namely σ²_{ε,τ,s} with

τ = I for t in the period from 1948.1 to 1981.12,
τ = II for t in the period from 1982.1 to 2007.12,

where 1982.1 is the breakpoint. With S = 12 we have 84 unknown parameters, β_s, φ_{1,s}, φ_{2,s}, θ_{1,s}, Θ_{1,s}, σ_{ε,I,s} and σ_{ε,II,s} for s = 1, 2, . . . , 12, including the variance moderation term as described in Chapter 2.
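Model (4.22)-(4.23) can be sketched as a simulation: draw y*_t from the periodic ARMA recursion and rebuild the levels via y_t = y_{t−S} + y*_t. All parameter values below are hypothetical, chosen only to produce a well-behaved path, not estimates:

```python
import numpy as np

rng = np.random.default_rng(2)
S, n = 12, 720
beta = rng.normal(0.0, 0.02, S)        # hypothetical periodic growth rates
phi1 = rng.uniform(0.3, 0.6, S)        # hypothetical AR, MA and std values
phi2 = rng.uniform(-0.2, 0.1, S)
th1 = rng.uniform(-0.3, 0.3, S)
Th1 = rng.uniform(-0.5, 0.0, S)
sig = rng.uniform(0.02, 0.05, S)

eps = np.array([rng.normal(0.0, sig[t % S]) for t in range(n)])
ystar = np.zeros(n)
for t in range(S, n):
    s = t % S
    ystar[t] = (beta[s]
                + phi1[s] * (ystar[t - 1] - beta[s])
                + phi2[s] * (ystar[t - 2] - beta[s])
                + eps[t] + th1[s] * eps[t - 1] + Th1[s] * eps[t - S])

# rebuild levels via (4.23): y_t = y_{t-S} + y*_t, since y*_t = Delta_S y_t
y = np.full(n, 8.0)                    # arbitrary starting level for year one
for t in range(S, n):
    y[t] = y[t - S] + ystar[t]
print(np.isfinite(y).all())            # True
```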

Given the state space form for y_t in the Appendix, maximum likelihood estimation of

the parameters φ1,s, φ2,s, θ1,s,Θ1,s, σε,I,s and σε,II,s is based on the same procedures as

discussed in Sections 4.2 and 4.3. Note however that our periodic SARIMA model does

not provide estimates of separate components. Since the periodic growth rates, βs,

are included in the state vector, they are effectively marginalised out of the likelihood

function. Although the remaining number of 72 parameters is large, it should be em-

phasised that we have 60 years of monthly data. Empirically, it turns out to be quite

feasible to estimate this model.

Figure 4.3 gives a graphical presentation of the parameter estimates of the PSARIMA

model for the U.S. unemployment series with two-standard-error bands. We see significant fluctuations in the parameter estimates across the different months of the year, which suggests that a non-periodic SARIMA specification is implausibly restrictive.

4.4.3 Periodic UC model for U.S. unemployment series

We have shown that U.S. unemployment series is subject to periodic dynamics and

therefore we continue our analysis by considering the PUC model explained in Chapter 2,

consisting of trend, season and cycle components. The disturbance variances of the



Figure 4.3: Estimated parameters for PSARIMA model as in equations (4.22)-(4.23) for log

U.S. unemployment series with s = 1, . . . , 12. Top: φ1,s, φ2,s. Middle: θ1,s, Θ1,s. Bottom:

σε,1948−1981,s, σε,1982−2007,s. All estimated parameters are plotted with ± 2 standard errors.

trend, season, cycle and irregular components are all periodic. The PUC model for the

empirical section is given by

y_t = μ_t + γ_t + ψ_t + ε_t,   t = 1, . . . , n,   (4.24)

where the trend (μ_t) and seasonal (γ_t) components are defined as

μ_{t+1} = μ_t + β_t + η_t,   η_t ∼ NID(0, σ²_{η,s}),
β_{t+1} = β_t + ζ_t,   ζ_t ∼ NID(0, σ²_{ζ,s}),
S_S(L) γ_{t+1} = ω_t,   ω_t ∼ NID(0, σ²_{ω,s}),   (4.25)

where S_S(L) is the summation operator defined as S_S(L) = 1 + L + · · · + L^{S−1}. The

cycle component (ψ_t) is defined as

( ψ_{t+1}  )         (  cos λ   sin λ ) ( ψ_t  )   ( κ_t  )
( ψ⁺_{t+1} )  = ρ_s  ( −sin λ   cos λ ) ( ψ⁺_t ) + ( κ⁺_t ),   κ_t, κ⁺_t ∼ NID(0, σ²_{κ,τ,s}),   (4.26)

with variances σ²_{κ,I,s} from January 1948 to December 1981 and σ²_{κ,II,s} from January 1982 onwards, following Sensier and van Dijk (2004). A restriction on the damping terms, 0 < ∏_{s=1}^{S} ρ_s < 1, is imposed to ensure that the stochastic process ψ_t is stationary. Further, we assume that the irregular term (ε_t) has variances σ²_{ε,s} and that all disturbances are mutually uncorrelated sequences from a Gaussian density, for s = 1, . . . , S.
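The PUC model (4.24)-(4.26) can likewise be sketched as a simulation. All parameter values below are hypothetical; the trend and irregular disturbance variances are set to zero (as in the estimated model), and the variance moderation is omitted for brevity:

```python
import numpy as np

rng = np.random.default_rng(3)
S, n = 12, 720
lam = 2 * np.pi / 60                    # hypothetical cycle frequency (5-year cycle)
rho = rng.uniform(0.90, 0.99, S)        # hypothetical periodic damping terms rho_s
sig_om, sig_kp = 0.005, 0.02            # hypothetical disturbance stds

C = np.array([[np.cos(lam), np.sin(lam)],
              [-np.sin(lam), np.cos(lam)]])
mu, slope = 8.0, 0.001                  # smooth trend: eta and zeta variances at 0
gam = list(rng.normal(0.0, 0.1, S - 1)) # last S-1 seasonal states
psi = np.array([0.1, 0.0])              # cycle state (psi_t, psi+_t)'

y = np.zeros(n)
for t in range(n):
    s = t % S
    g = -sum(gam) + rng.normal(0.0, sig_om)   # S_S(L) gamma_{t+1} = omega_t
    y[t] = mu + g + psi[0]                    # irregular variance set to 0 here
    gam = gam[1:] + [g]
    mu += slope                               # trend recursion, zero disturbances
    psi = rho[s] * (C @ psi) + rng.normal(0.0, sig_kp, 2)
print(np.isfinite(y).all())                   # True
```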


The above model is cast into a state space form containing 7S + 1 unknown parameters: σ_{η,s}, σ_{ζ,s}, σ_{ω,s}, σ_{κ,I,s}, σ_{κ,II,s}, σ_{ε,s}, ρ_s and λ for s = 1, . . . , S. The parameters are estimated by numerically maximising the exact log-likelihood. In our case with S = 12, we initially estimate 85 parameters. Since the parameters σ_{η,s}, σ_{ζ,s} and σ_{ε,s} are mostly estimated as nearly zero, we fix them at zero for all seasons except for σ_{ζ,9}, and the number of free parameters then becomes 50. The estimated components, associated parameters, and their interpretation are similar to those presented in Chapter 2.

4.4.4 Estimation and forecast results

In our analysis of monthly postwar U.S. unemployment series, we discover periodicity

in most parameters, including the seasonal ones. Figure 4.1 clearly indicates periodicity of the autocorrelation function. To fit the monthly U.S. unemployment series, we

use the PSARIMA model as in (4.22)-(4.23) and the (periodic) UC model as in (4.24)-

(4.26). For the non-periodic SARIMA model we fit the time series with several SARIMA

models starting with AR order one until 13. We choose the model with smallest value

for Akaike information criterion (AIC) and AIC with finite sample correction (AICc)

which corresponds to an unrestricted SARIMA(3,0,1)(0,1,1)12 as our final non-periodic

SARIMA model.

Table 4.1 reports the log-likelihood value at the parameter estimate, AIC, AICc,

and some diagnostic tests for the one-step-ahead prediction errors. The Akaike criteria

favour the periodic models over their non-periodic counterparts, but the corrected Akaike criterion favours the non-periodic SARIMA over the PSARIMA model. By introducing the variance moderation in 1982, we eliminate heteroskedasticity and non-normality problems in the prediction errors, as can be concluded from

the N and H statistics: the normality N statistic is based on the sample skewness and

kurtosis as described in Bowman and Shenton (1975); the heteroskedasticity H statistic

is the classical test of Goldfeld and Quandt (1965). The reported nominal p-values are

indicative as they do not account for the variance moderation.
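The AIC and AICc values reported in Table 4.1 can be reproduced from the reported logL and p using AIC = −2 logL + 2p and AICc = AIC + 2p(p + 1)/(n − p − 1) with n = 720; differences in the last digit can arise because the reported logL values are rounded:

```python
# Reproducing the AIC and AICc entries of Table 4.1 from logL and p (n = 720).
def aic(logL, p):
    return -2.0 * logL + 2.0 * p

def aicc(logL, p, n=720):
    return aic(logL, p) + 2.0 * p * (p + 1) / (n - p - 1)

models = {  # name: (logL, p) as reported in Table 4.1
    "SARIMA": (1157.59, 8),
    "PSARIMA": (1226.52, 72),
    "UC": (1182.88, 6),
    "PUC": (1238.23, 50),
}
for name, (logL, p) in models.items():
    print(f"{name:8s} AIC = {aic(logL, p):9.2f}   AICc = {aicc(logL, p):9.2f}")
```

The large correction term for the PSARIMA model (72 parameters) is exactly what reverses the AIC ranking under the AICc.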

Comparing the results of SARIMA and UC models in general is a non-standard

exercise. Since the models are not nested, we cannot directly compare the log-likelihood

values for the estimated parameters. Taking the AIC(c) as an indicative goodness-of-

fit criterion, it appears that the PUC model outperforms the non-periodic UC model

since it has the lowest AIC(c) values, but the PSARIMA model is not superior to

its non-periodic version, see Table 4.1. However, the differences between SARIMA

model and non-parsimonious PSARIMA model are not statistically significant at normal

significance levels, both measured using in-sample fit as in Table 4.1 (which could favour

extensive models) and using out-of-sample forecasting accuracy as in Table 4.2 (which

could favour parsimonious models). Note that all four models presented in this chapter


Table 4.1: Estimation results for models with variance moderation in 1982

             SARIMA            PSARIMA            UC          PUC
             (3,0,1)(0,1,1)_S  (2,0,1)(0,1,1)†_S
logL          1157.59           1226.52           1182.88     1238.23
AIC          -2299.18          -2309.04          -2353.76    -2376.46
AICc         -2298.98          -2292.79          -2353.64    -2368.83
N             2.6769            5.3150            2.5656      4.6877
(p-value)    (0.262)           (0.070)           (0.277)     (0.096)
H             0.9117            0.9088            0.9585      0.9923
(p-value)    (0.381)           (0.364)           (0.687)     (0.941)
p             8                 72                6           50‡
n*S           720               720               720         720

Notes: Data sample used for estimation is from January 1948 until December 2007. PSARIMA = periodic SARIMA; PUC = periodic UC. †: the cross term θ_{1,s}Θ_{1,s}ε_{t−(S+1)} of the PSARIMA model is omitted. logL = log-likelihood. AIC = Akaike Information Criterion. AICc = AIC with finite-sample correction. N is a normality test on the prediction errors, asymptotically χ²(2); H is a heteroskedasticity test on the prediction errors, asymptotically F(n*S/2, n*S/2); p is the number of parameters. ‡: the number of parameters in the PUC model is calculated as 49 + 12 (cycle variance moderation parameters σ_{κ,II,s}) − 11 (zeros in σ_{ζ,s}). n*S is the number of data points with n* = 60 and S = 12. The lowest AIC and AICc between the periodic and non-periodic versions of both model types are printed in boldface.


have satisfactory residual diagnostics.

Based on the parameter estimates up to December 2000, the PUC model has the

best one-step-ahead out-of-sample forecasting performance from January 2001 until

December 2008 according to the overall RMSE. Using the forecast accuracy test for non-nested models of Diebold and Mariano (1995), we find insignificant differences between

the forecasts generated by the (P)UC and (P)SARIMA models, see Table 4.2. Here S_{1,3} denotes the Diebold-Mariano (DM) test applied to the UC and SARIMA models, S_{1,4} to the UC and PSARIMA models, S_{2,3} to the PUC and SARIMA models, S_{2,4} to the PUC and PSARIMA models, and S_{3,4} to the SARIMA and PSARIMA models. None of the overall tests with the DM

procedure is significant. As the size of the Diebold-Mariano test might be difficult to

control, see e.g. Harvey et al. (1998) and McElroy and Findley (2010), and the forecast

error variance estimates are only based on 8 years, the results have to be interpreted

with care. Overall, the differences seem insignificant.
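The DM statistic can be sketched as follows. This is a simplified iid version of the loss-differential test, without the HAC variance correction typically used in practice, and the forecast-error series are hypothetical:

```python
import numpy as np

def dm_stat(e1, e2):
    """Diebold-Mariano statistic for equal one-step-ahead MSE of two
    forecast-error series; iid loss differential, no HAC correction
    (a simplification of the original test)."""
    d = np.asarray(e1) ** 2 - np.asarray(e2) ** 2
    return d.mean() / np.sqrt(d.var(ddof=1) / len(d))

rng = np.random.default_rng(4)
n = 96                                   # 8 years of monthly forecast errors
e_a = rng.normal(0.0, 0.034, n)          # hypothetical errors, model A
e_b = rng.normal(0.0, 0.040, n)          # hypothetical errors, model B
print(round(float(dm_stat(e_a, e_b)), 4))  # compare against the 5% bound +/- 1.96
```

A statistic inside ±1.96 fails to reject equal forecast accuracy at the 5% level, which is the situation for all overall comparisons in Table 4.2.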

The optimal model is likely to be a more parsimonious PSARIMA specification

which could also be estimated using the algorithm of Section 4.2. More parsimonious

PSARIMA models can be formulated in different ways: for example, models with parameters restricted to be equal across subsets of months can be specified, see Penzer and Tripodis (2007) for an application to economic time series. Alternatively, SARIMA models with periodic parameters modelled as smooth functions of the month of the year can be considered, see Anderson and Vecchia (1993) for an application to geophysical time series.

These examples of parsimonious PSARIMA model specifications fall outside the scope

of this chapter. The empirical example in this chapter illustrates the applicability of our

technique, which enables comparisons of parsimonious specifications against the

unrestricted periodic model using exact maximum likelihood, even when the number of

periods is comparatively large.

4.5 Summary and conclusion

The primary aim of this chapter is to develop estimation methods for SARIMA models

with periodically varying parameters that do not require a priori differencing of the

data. We propose convenient state space representations of the PSARIMA models to

facilitate model identification, specification and exact maximum likelihood estimation of

the periodic parameters. A key development in this chapter is our method for computing

the variance-covariance matrix of the initial set of observations, which is required for

exact maximum likelihood estimation. We illustrate the use of PSARIMA models to

fit and forecast a monthly post-war U.S. unemployment time series.

Page 101: Periodic Seasonal Time Series Models with applications to U.S. … · 2 CHAPTER 1. INTRODUCTION with the di erent components. Our contribution for periodic ARMA models lies in a convenient

Table 4.2: Accuracy test for one-step-ahead out-of-sample forecasting between periodic and non-periodic models for January 2001 until December 2008

                         RMSE                            Diebold-Mariano test
               UC      PUC    SARIMA  PSARIMA    S1,3     S1,4     S2,3     S2,4     S3,4
               (1)     (2)     (3)     (4)
Jan           0.0480  0.0470  0.0430  0.0353    1.1791   1.5167   0.7377   0.9767   1.0831
Feb           0.0263  0.0335  0.0250  0.0421    0.2927  -1.3769   1.6954  -0.7631  -1.3484
Mar           0.0338  0.0281  0.0351  0.0364   -0.6578  -0.3568  -1.0104  -0.7127  -0.1862
Apr           0.0298  0.0291  0.0317  0.0583   -0.3425  -2.0051  -0.3235  -2.3171  -2.0815
May           0.0449  0.0481  0.0419  0.0525    0.8274  -1.1569   1.5526  -0.6800  -1.1287
Jun           0.0288  0.0390  0.0203  0.0329    1.0426  -0.5327   1.6305   0.6638  -1.2623
Jul           0.0243  0.0329  0.0223  0.0284    1.4962  -1.0690   1.5464   0.5131  -1.5135
Aug           0.0371  0.0315  0.0349  0.0389    0.4956  -0.2854  -0.4668  -0.7877  -1.4211
Sep           0.0254  0.0278  0.0227  0.0254    1.3941  -0.0099   0.7632   0.4095  -0.6869
Oct           0.0369  0.0239  0.0403  0.0403   -1.8396  -0.9174  -1.6922  -1.6799  -0.0058
Nov           0.0271  0.0216  0.0246  0.0321    1.5308  -0.7691  -0.4413  -1.6388  -1.1336
Dec           0.0402  0.0338  0.0419  0.0465   -0.3039  -0.7531  -1.7605  -1.6374  -1.3017
Overall test  0.0344  0.0330  0.0339  0.0402    0.3593  -0.6304   0.1309  -0.6240  -0.8367

Notes: Data sample used for parameter estimation is from January 1948 until December 2000. Si,j are the Diebold-Mariano (DM) tests for equal one-step-ahead MSE forecast accuracy of two non-nested models, where asymptotically each Si,j ~ N(0,1). The critical value at the 5% significance level for the DM test is given at ±1.96. A significant negative (positive) value for Si,j indicates that forecasts generated by the first model are more (less) accurate than the forecasts from the second model. S1,3 is the DM test applied to the UC and SARIMA models, S1,4 is the DM test applied to the UC and PSARIMA models, S2,3 is the DM test applied to the PUC and SARIMA models, S2,4 is the DM test applied to the PUC and PSARIMA models, and S3,4 is the DM test applied to the SARIMA and PSARIMA models. None of the overall tests with the DM procedure is significant. RMSE = Root Mean Square Error. Lowest RMSE for the four models and significant values for the DM test are given in bold face.
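The Si,j statistics in the table can be reproduced from two series of one-step-ahead forecast errors; a minimal sketch of the h = 1 Diebold-Mariano statistic (using the sample variance of the loss differential as the long-run variance estimate, which suffices at horizon one, and toy errors in place of the actual forecast errors):

```python
import numpy as np

def dm_test(e1, e2):
    """Diebold-Mariano statistic for equal one-step-ahead MSE of two
    non-nested forecasting models.  For horizon h = 1 the loss
    differential d_t = e1_t**2 - e2_t**2 needs no autocorrelation
    correction, so its long-run variance is estimated by the sample
    variance.  Negative values favour model 1 (cf. the table notes)."""
    d = np.asarray(e1) ** 2 - np.asarray(e2) ** 2
    return d.mean() / np.sqrt(d.var(ddof=1) / d.size)

# toy forecast errors over 96 months (2001-2008); model 1 is the more
# accurate one here, so the statistic comes out negative
rng = np.random.default_rng(0)
e1 = rng.normal(scale=0.5, size=96)
e2 = rng.normal(scale=1.0, size=96)
print(dm_test(e1, e2) < 0)   # True for this draw
```

The sign convention matches the table notes: squaring the errors of the first model first makes a negative statistic evidence in favour of the first model.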


94 CHAPTER 4. PERIODIC SARIMA MODELS

4.6 Appendix

Periodic SARIMA(2,0,1)(0,1,1)_S

For s = 1, . . . , S, the state vector αt of the PSARIMA(2,0,1)(0,1,1)_S model is defined as

\[
\alpha_t = \big(\beta_1 \,\ldots\, \beta_S \;\; y_{t-1} \,\ldots\, y_{t-S} \;\; y_t^{\dagger} \;\; \phi_{2,s} y_{t-1}^{\dagger} + \theta_{1,s}\varepsilon_t + \Theta_{1,s}\varepsilon_{t-S+1} \;\; \Theta_{1,s}\varepsilon_{t-S+2} \,\ldots\, \Theta_{1,s}\varepsilon_t\big)',
\]

with corresponding disturbance vector given by

\[
H_t \varepsilon_t = \big( 0_{1\times 2S} \;\; \varepsilon_{t+1} \;\; \theta_{1,s}\varepsilon_{t+1} \;\; 0_{1\times(S-2)} \;\; \Theta_{1,s}\varepsilon_{t+1} \big)'.
\]

The transition matrix T_t is therefore (3S + 1) × (3S + 1), and T_t and Z_t are given by

\[
T_t =
\begin{bmatrix}
T_a & 0_{S\times(2S+1)} \\
T_b & T_c \\
0_{(S+1)\times S} & T_d
\end{bmatrix},
\qquad
Z_t = \text{first row of } \begin{bmatrix} T_b & T_c \end{bmatrix},
\tag{4.27}
\]

\[
T_a =
\begin{bmatrix}
0 & I_{S-1} \\
1 & 0_{1\times(S-1)}
\end{bmatrix},
\qquad
T_c =
\begin{bmatrix}
0_{1\times(S-1)} & 1 & 1 & 0_{1\times S} \\
I_{S-1} & 0_{(S-1)\times 1} & 0_{(S-1)\times 1} & 0_{(S-1)\times S}
\end{bmatrix},
\]

\[
T_b =
\begin{bmatrix}
1 & 0_{1\times(S-1)} \\
0_{(S-1)\times 1} & 0_{(S-1)\times(S-1)}
\end{bmatrix},
\qquad
T_d =
\begin{bmatrix}
0_{2\times S} & \begin{matrix}\phi_{1,s}\\ \phi_{2,s}\end{matrix} & I_2 & 0_{2\times(S-2)} \\
0_{(S-2)\times S} & 0_{(S-2)\times 1} & 0_{(S-2)\times 2} & I_{S-2} \\
0_{1\times S} & 0 & 0_{1\times 2} & 0_{1\times(S-2)}
\end{bmatrix},
\]

where 0_{r×c} denotes a zero matrix with r rows and c columns and where I_r denotes an identity matrix of dimension r.

The variance-covariance matrix of the state disturbances is given by

\[
H_t H_t' =
\begin{bmatrix}
0 & 0_{1\times 2S} & 0_{1\times(S+1)} \\
0_{2S\times 1} & 0_{2S\times 2S} & 0_{2S\times(S+1)} \\
0_{(S+1)\times 1} & 0_{(S+1)\times 2S} & H_t^{*} H_t^{*\prime}
\end{bmatrix},
\tag{4.28}
\]

where

\[
H_t^{*} H_t^{*\prime} =
\begin{bmatrix}
\sigma^2_{\varepsilon,s} & \theta_{1,s}\sigma^2_{\varepsilon,s} & 0_{1\times(S-2)} & \Theta_{1,s}\sigma^2_{\varepsilon,s} \\
\theta_{1,s}\sigma^2_{\varepsilon,s} & \theta_{1,s}^2\sigma^2_{\varepsilon,s} & 0_{1\times(S-2)} & \theta_{1,s}\Theta_{1,s}\sigma^2_{\varepsilon,s} \\
0_{(S-2)\times 1} & 0_{(S-2)\times 1} & 0_{(S-2)\times(S-2)} & 0_{(S-2)\times 1} \\
\Theta_{1,s}\sigma^2_{\varepsilon,s} & \theta_{1,s}\Theta_{1,s}\sigma^2_{\varepsilon,s} & 0_{1\times(S-2)} & \Theta_{1,s}^2\sigma^2_{\varepsilon,s}
\end{bmatrix},
\tag{4.29}
\]

whereas G_t = 0. Finally, a = E(α_1) = 0_{(3S+1)×1} and the variance matrix of the initial state is given by

\[
P =
\begin{bmatrix}
\kappa I_{2S} & 0_{2S\times(S+1)} \\
0_{(S+1)\times 2S} & P^{\dagger}
\end{bmatrix},
\tag{4.30}
\]

where the initial variance matrices of the growth rates βt and the levels yt are diffuse: κ → ∞. The variance matrix of the stationary elements of the state, P†, can be constructed by a pre-run of the Kalman filter as explained in Section 4.3.
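As a concreteness check on the partitioning in (4.27), the blocks can be assembled with numpy for a small number of seasons; a sketch for S = 4 with arbitrary illustrative AR coefficients (this is not the thesis code, only a shape check of the block structure):

```python
import numpy as np

S = 4                      # number of seasons (illustrative; S = 12 in the text)
phi1, phi2 = 0.5, 0.3      # arbitrary values for phi_{1,s}, phi_{2,s}

Ta = np.zeros((S, S))      # rotates the periodic growth rates beta_1..beta_S
Ta[:-1, 1:] = np.eye(S - 1)
Ta[-1, 0] = 1.0

Tb = np.zeros((S, S))      # carries the current level into the lagged-level block
Tb[0, 0] = 1.0

Tc = np.zeros((S, 2 * S + 1))
Tc[0, S - 1] = 1.0         # picks up y_{t-S}
Tc[0, S] = 1.0             # picks up y_t^dagger
Tc[1:, :S - 1] = np.eye(S - 1)

Td = np.zeros((S + 1, 2 * S + 1))
Td[0, S], Td[1, S] = phi1, phi2        # AR recursion for y^dagger
Td[:2, S + 1:S + 3] = np.eye(2)
Td[2:S, S + 3:] = np.eye(S - 2)        # shifts the seasonal MA terms

Tt = np.block([[Ta, np.zeros((S, 2 * S + 1))],
               [Tb, Tc],
               [np.zeros((S + 1, S)), Td]])
Zt = np.hstack([Tb, Tc])[0]            # Z_t = first row of [T_b  T_c]
print(Tt.shape, Zt.shape)              # (13, 13) (13,)
```

The assembled matrix has the stated dimension (3S + 1) × (3S + 1), and Zt selects y_{t−1}, y_{t−S} and y†_t, consistent with the level recursion.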


Chapter 5

Frequency-specific trigonometric

seasonality

This chapter is based on Hindrayanto, Aston, Koopman and Ooms (2010). We thank

David F. Findley, William R. Bell and Tucker S. McElroy from the U.S. Census Bureau

and John A.D. Aston from University of Warwick (UK) for their generous help and

guidance.

5.1 Introduction

The Airline model popularised by Box and Jenkins (1970) and the basic structural

model (BSM) popularised by Harvey (1989) are amongst the most widely used models

for seasonal adjustment. Their popularity can be attributed to their simplicity and

accuracy for a wide range of seasonal time series. However, the simplicity of both models

inevitably implies that there will be a substantial number of practical cases where either

model is inadequate. In this chapter, we consider a similar generalisation for both

seasonal specifications. We aim to investigate the dynamic properties of the frequency

specific basic structural model (FS-BSM) and to relate the model to a frequency specific

version of the Airline model developed recently at the U.S. Census Bureau, as in Aston

et al. (2007).

The BSM belongs to the class of unobserved component (UC) time series models that

decompose time series into trend, seasonal and irregular components. Here we focus on

the trigonometric seasonal component representation. To address some of the criticism

that the BSM is too restrictive to fit seasonal time series adequately, we modify the BSM

to be less restrictive in the specification of the seasonal component. Instead of having a

single seasonal variance for all frequencies, we let the time-varying trigonometric terms

associated with different seasonal frequencies have different variances. Therefore we

develop the FS-BSMs that are more flexible than the standard BSM while still capable


of producing component estimates. The extended set of parameters can be estimated

by using maximum likelihood procedures based on the Kalman filter.

To illustrate that less restrictive models can be needed to fit seasonal time series, we

plot two out of 75 monthly strongly seasonal time series provided by the U.S. Census

Bureau in Figure 5.1. The first series with code X41140 is one of the Foreign Trade series

which corresponds to the Final Export of Musical Instruments from January 1989 to

November 2001. The second series has code U37AVS which corresponds to Shipments of

Household Furniture and Kitchen Cabinet from January 1992 to September 2001. This

series comes from the Census Bureau’s monthly Manufacturers’ Shipments, Inventories

and Orders Survey. We difference both time series yt into ∆∆12yt = yt− yt−1− yt−12 +

yt−13 after taking the natural logarithm. We also plot the autocorrelation functions

(ACF) of the differenced series. We observe that for X41140 the only significant autocorrelations are at lags 1, 11, 12 and 13, while for U37AVS lags 2, 3, 5 and 6 are also significant.

Comparing the characteristics of these two time series, we deduce that a frequency

specific (FS) model is needed to fit U37AVS while a basic (non-FS) model may suffice

for X41140. Details of the estimates and test results for the two time series are given

in Section 5.4.
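The transformation and diagnostics used here take only a few lines; a sketch of the ∆∆12 log-differencing and the sample ACF on a hypothetical seasonal series (the Census series themselves are not reproduced here):

```python
import numpy as np

def dd12(y):
    """(1 - L)(1 - L^12) applied to log-levels:
    log y_t - log y_{t-1} - log y_{t-12} + log y_{t-13}."""
    x = np.log(np.asarray(y, dtype=float))
    d1 = x[1:] - x[:-1]           # (1 - L)
    return d1[12:] - d1[:-12]     # then (1 - L^12)

def sample_acf(x, nlags=25):
    """Sample autocorrelations at lags 0..nlags (biased estimator)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    c0 = np.dot(x, x) / x.size
    return np.array([np.dot(x[k:], x[:x.size - k]) / x.size / c0
                     for k in range(nlags + 1)])

# hypothetical monthly series with trend, seasonality and noise
rng = np.random.default_rng(0)
t = np.arange(155)                # 155 months, as for X41140 (1989.1-2001.11)
y = np.exp(8 + 0.002 * t + 0.3 * np.sin(2 * np.pi * t / 12)
           + 0.02 * rng.standard_normal(155))
z = dd12(y)
acf = sample_acf(z)
print(len(z), round(acf[0], 6))   # 142 1.0
```

Differencing with (1 − L)(1 − L^12) loses the first 13 observations, which is why the differenced series is 13 points shorter than the original.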

[Figure 5.1 here: log levels and ACF of the ∆∆12 log-differences for series X41140 (top) and U37AVS (bottom)]

Figure 5.1: Motivating example. Top: Final Export of Musical Instruments series (X41140); bottom: Manufacturers' Shipments of Household Furniture and Kitchen Cabinets series (U37AVS). The first column shows the series in natural logarithms and the second column depicts the ACF of the differenced series. X41140 does not require an FS model, while U37AVS does, as its ACF shows more periodicities.


The procedure of ARIMA model based seasonal adjustment dates back to the early

1980s, see Burman (1980) and Hillmer and Tiao (1982), but the automatic implemen-

tation in seasonal adjustment software was carried out more than a decade later, see

the documentation of SEATS from Gomez and Maravall (1996). Using this software,

the Airline model is frequently chosen to identify seasonal time series. In the search

for a useful alternative for the Airline model, Aston et al. (2007) introduce extensions

to the standard Airline model by means of decomposing and reparametrising the sea-

sonal moving average (MA) factor and partitioning the factors of different seasonal

frequencies into two groups, each with its own coefficient. Because of the dependency

of its parameters on the seasonal frequencies, the new model is called the frequency

specific Airline model (FS-AM). In the empirical part of their research, they show that

the FS models are preferred over the standard Airline model for 22 of the 75 selected U.S. Census Bureau economic indicator series that we also examine here. The simple Airline model was known to be an adequate model for these 75 time series compared to other non-FS SARIMA models. The comparison between the FS and non-FS Airline models is based on the Minimum Akaike Information Criterion (MAIC) and a modification of it, the so-called F-MAIC; see also Aston et al. (2004, 2007).

The remainder of this chapter is organised as follows. Section 5.2 provides details of

FS-BSM and its variations. Section 5.3 explores the dynamic properties of the grouped

FS-BSMs and relates them to those of the grouped FS-AMs. We also define a quadratic

distance metric to measure the difference in the dynamic properties of FS-AM versus

FS-BSM specifications. Further we carry out a simulation study to assess the goodness-

of-fit performance for both models. In Section 5.4 we investigate empirically whether

FS models lead to more similar decompositions compared to non-FS models. For this

purpose, we consider a U.S. Census Bureau database of seasonal time series that has

been analysed previously with the FS-AMs of Aston et al. (2007). It is shown that the

FS models are in general closer to each other than their non-FS variants. Conclusion is

given in Section 5.5 while appendices provide additional information on the derivations

of formulae.

5.2 Frequency specific basic structural model

In this section we give details of FS-BSM and its variations, starting from the most

general form of FS-BSM to the grouped FS-BSMs. Let the time series observation yt

at time t be modelled as the sum of the trend µt, the seasonal term γt and the irregular

disturbance εt. The FS-BSM is then given by,

\[
y_t = \mu_t + \gamma_t + \varepsilon_t, \qquad \varepsilon_t \sim NID(0, \sigma^2_{\varepsilon}), \tag{5.1}
\]


for t = 1, . . . , n, where the trend µt is specified as

\[
\mu_{t+1} = \mu_t + \beta_t + \eta_t, \qquad \eta_t \sim NID(0, \sigma^2_{\eta}), \tag{5.2}
\]
\[
\beta_{t+1} = \beta_t + \zeta_t, \qquad \zeta_t \sim NID(0, \sigma^2_{\zeta}), \tag{5.3}
\]

and the seasonal component γt follows the trigonometric specification as given by

\[
\gamma_t = \sum_{j=1}^{s/2} \gamma_{j,t}, \tag{5.4}
\]
\[
\begin{pmatrix} \gamma_{j,t+1} \\ \gamma^*_{j,t+1} \end{pmatrix}
=
\begin{pmatrix} \cos\lambda_j & \sin\lambda_j \\ -\sin\lambda_j & \cos\lambda_j \end{pmatrix}
\begin{pmatrix} \gamma_{j,t} \\ \gamma^*_{j,t} \end{pmatrix}
+
\begin{pmatrix} \omega_{j,t} \\ \omega^*_{j,t} \end{pmatrix},
\qquad
\begin{pmatrix} \omega_{j,t} \\ \omega^*_{j,t} \end{pmatrix}
\sim NID\left[ \begin{pmatrix} 0 \\ 0 \end{pmatrix}, \; \sigma^2_{\omega,j} I_2 \right], \tag{5.5}
\]

with λ_j = 2πj/s as the j-th seasonal frequency, j = 1, . . . , s/2, and I_2 the 2 × 2 identity matrix. We assume s is even to simplify notation. The extension to odd s does not add new insights. In monthly time series (s = 12), equation (5.5) implies a seasonal component with six different variances. Further it is assumed that all disturbances in the model are mutually and serially uncorrelated at all leads and lags. The standard BSM is obtained when all seasonal variances σ²_{ω,j} are equal, that is σ²_{ω,j} = σ²_ω for all j = 1, . . . , s/2. More details and properties of the BSM are given by Harvey (1989). Note that we have s = 12 in all applications in this chapter.
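The recursion (5.5) is easy to simulate directly; the sketch below generates a seasonal component with frequency-specific variances (the variance values mirror the FS-BSM DGP used later in Section 5.3.2; the function name is ours):

```python
import numpy as np

def simulate_seasonal(n, s=12, sigma_omega=None, seed=0):
    """Simulate the trigonometric seasonal component (5.4)-(5.5):
    one rotating pair (gamma_j, gamma_j*) per frequency
    lambda_j = 2*pi*j/s, each with its own disturbance standard
    deviation sigma_omega[j-1]."""
    rng = np.random.default_rng(seed)
    if sigma_omega is None:
        sigma_omega = [0.004] * (s // 2)   # equal variances: standard BSM
    gamma = np.zeros(n)
    for j in range(1, s // 2 + 1):
        lam = 2 * np.pi * j / s
        R = np.array([[np.cos(lam), np.sin(lam)],
                      [-np.sin(lam), np.cos(lam)]])
        state = np.zeros(2)
        for t in range(n):
            gamma[t] += state[0]                       # gamma_{j,t} enters y_t
            state = R @ state + rng.normal(scale=sigma_omega[j - 1], size=2)
    return gamma

# two-group example: frequencies 2, 3, 5 share one variance, the rest another
sig = [0.006, 0.002, 0.002, 0.006, 0.002, 0.006]
g = simulate_seasonal(360, sigma_omega=sig)
print(g.shape)   # (360,)
```

Setting all entries of `sigma_omega` equal recovers the standard BSM seasonal; unequal groups give the frequency specific variants discussed below.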

Together with the two parameters associated with the trend (level and slope dis-

turbance variances) and the variance of the irregular term, we have s/2 + 3 unknown

parameters for the unrestricted model. Thus, for monthly time series data, we have

nine parameters that need to be estimated. Instead of estimating all parameters, which

is inherently difficult in short time series as several variances can be estimated as zero,

we analyse grouped FS-BSMs where the seasonal variances are restricted in different

ways. In our grouped FS-BSMs, the seasonal variance parameters are divided into two

subsets where all variances in one group have the same value. This limits the number

of free seasonal parameters to two (one for each group), such that the total number of

parameters for monthly series is only five. At the same time, this approach extends

the number of submodels from one type of unrestricted FS-BSM with nine parameters

to 31 types of monthly FS-BSMs with five parameters, as shown in Table 5.6 of the

last appendix. The standard BSM can be seen as a degenerate FS-BSM with only four

parameters.
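The count of 31 two-group models can be checked by enumeration: a grouped FS-BSM corresponds to an unordered split of the six monthly seasonal frequencies into two non-empty groups, and a group together with its complement describes the same model:

```python
from itertools import combinations

freqs = range(1, 7)                     # the six seasonal frequencies for s = 12
splits = set()
for r in range(1, 6):                   # group sizes 1..5 (both groups non-empty)
    for group in combinations(freqs, r):
        comp = tuple(f for f in freqs if f not in group)
        # a split and its complement describe the same two-group model
        splits.add(frozenset({group, comp}))
print(len(splits))                      # 31 = 6 singles + 15 pairs + 10 triples
```

The three shapes of splits match the three model types introduced next: 6 models isolating a single frequency, 15 models isolating a pair, and 10 models splitting the frequencies three against three.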

We introduce the following notation to distinguish different types of FS-BSM for

monthly data. As there are s/2 frequency specific parameters in general, we consider

s/4 types of five-coefficient models with two groups of seasonal frequencies. This means


that we consider three types of two-group FS-BSMs for monthly data. Models of the

first type are denoted as FS-BSM(5, 2, {i/s}) where the first entry denotes the total

amount of parameters in the model, the second entry denotes the number of groups in

the seasonal frequencies, and the third entry denotes the single frequency i/s which is

separated from the remaining s/2− 1 frequencies with i = 1, . . . , s/2. For example, the

FS-BSM(5, 2, {3/12}) imposes

\[
\sigma^2_{\omega,3} = \sigma^2_{\omega,\mathrm{I}} \quad \text{and} \quad \sigma^2_{\omega,1} = \sigma^2_{\omega,2} = \sigma^2_{\omega,4} = \sigma^2_{\omega,5} = \sigma^2_{\omega,6} = \sigma^2_{\omega,\mathrm{II}},
\]

where σ²_{ω,I} and σ²_{ω,II} represent the variance parameters of the two different groups.

The second type of models is denoted as FS-BSM(5, 2, {i, j}/s), in which the pair of seasonal frequencies i/s and j/s forms one group and the remaining frequencies form the other group, where i < j and i, j = 1, . . . , s/2. The FS-BSM(5, 2, {1, 2}/12), for example, has the restrictions

\[
\sigma^2_{\omega,1} = \sigma^2_{\omega,2} = \sigma^2_{\omega,\mathrm{I}} \quad \text{and} \quad \sigma^2_{\omega,3} = \cdots = \sigma^2_{\omega,6} = \sigma^2_{\omega,\mathrm{II}}.
\]

The third type of models that we consider here, denoted as FS-BSM(5, 2, {i, j, k}/s), has the parameters associated with three seasonal frequencies i/s, j/s and k/s in one group and the remaining three frequencies in the other group, where i < j < k and i, j, k = 1, . . . , s/2. An example is the FS-BSM(5, 2, {1, 2, 3}/12), in which we impose the restrictions

\[
\sigma^2_{\omega,1} = \sigma^2_{\omega,2} = \sigma^2_{\omega,3} = \sigma^2_{\omega,\mathrm{I}} \quad \text{and} \quad \sigma^2_{\omega,4} = \sigma^2_{\omega,5} = \sigma^2_{\omega,6} = \sigma^2_{\omega,\mathrm{II}}.
\]

The state space representation for all FS-BSMs is a straightforward generalisation of the representation in Harvey (1989). The standard BSM is equivalent to FS-BSM(4, 1, {∅}) since there are four parameters in the model with a single variance for all seasonal frequencies, that is

\[
\sigma^2_{\omega,1} = \cdots = \sigma^2_{\omega,s/2} = \sigma^2_{\omega}.
\]

The full or unrestricted FS-BSM with six seasonal parameters for monthly data is the most general version and could be denoted as FS-BSM(9, 6, {1, 2, 3, 4, 5, 6}/12), since there are nine parameters in total and the six seasonal parameters each have their own frequencies. Maximum likelihood estimation of the parameters is based on the Kalman filter using the functions in the SsfPack library of Koopman et al. (2008).

5.3 Properties of FS-BSM and measuring distance

to FS-AM

This section gives details on the properties of FS-BSMs and measuring distance to FS-

AM. Subsection 5.3.1 derives the stationary form of yt and the seasonal component γt in


FS-BSM to compute the theoretical ACF of ∆∆syt and S(L)γt. Details on derivations

are given in Appendix 5.A. Subsection 5.3.2 discusses the theoretical and sample ACF

of the reduced FS-BSMs which are computed using stationary MA(s + 1)-form of FS-

BSM, see Appendix 5.B for mathematical formulation. In subsection 5.3.3 we introduce

a distance metric based on the MA(s + 1) coefficients of both FS models in order to

investigate how close FS-BSMs and FS-AMs are to each other. In subsection 5.3.4 we

carry out a small-scale simulation study to assess the goodness-of-fit performance for

both models using the distance metric as explained in the previous subsection.

5.3.1 Stationary form of FS-BSM

In this subsection we derive the stationary form of γt and yt in order to compute the

ACF of S(L)γt and ∆∆syt. We start with an alternative specification of the seasonal

component (5.4) which is given by

\[
\gamma_t = \sum_{j=1}^{s/2} \gamma_{j,t}
= \sum_{j=1}^{s/2-1} \frac{(1 - \cos\lambda_j L)\,\omega_{j,t-1} + (\sin\lambda_j L)\,\omega^*_{j,t-1}}{1 - 2\cos\lambda_j L + L^2}
+ \frac{\omega_{s/2,t-1}}{1 + L}, \tag{5.6}
\]

for s even; see Harvey (1989) for more details. Bell (1993) emphasised that the trigonometric specification (5.6) is a sum of ARIMA components. The numerator of (5.6) is an MA(1) process for each seasonal frequency, so that we can rewrite the above equation as

\[
\gamma_t = \sum_{j=1}^{s/2} \frac{(1 - \alpha_j L)\, w_{j,t}}{\delta_j(L)}, \tag{5.7}
\]

where w_{j,t} ∼ NID(0, σ²_j) and δ_j(L) = 1 − 2 cos λ_j L + L² for each seasonal frequency j = 1, . . . , s/2. Moreover, the above equation also implies that

\[
\delta_j(L)\, \gamma_{j,t} = (1 - \alpha_j L)\, w_{j,t}, \tag{5.8}
\]

for each individual seasonal component. The values of the MA(1) coefficients, αj, can be obtained straightforwardly and need not be estimated. The variances of w_{j,t} are given by σ²_j = 2σ²_{ω,j}/(1 + α²_j) for j = 1, . . . , s/2 − 1, and for j = s/2 the variance is given by σ²_{s/2} = σ²_{ω,s/2}, where σ²_{ω,j} denotes the innovation variance of the model for γ_{j,t} in (5.5); see Appendix 5.A for more details.
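The αj can indeed be obtained without estimation: matching the lag-0 and lag-1 autocovariances of the MA(1) numerator in (5.6) gives the quadratic α² − (2/cos λj)α + 1 = 0, of which the invertible root is selected. A sketch of this computation (the closed form below is our own derivation from the moment matching; the thesis tabulates the actual values in Table 5.5, cf. Bell (2004)):

```python
import numpy as np

def alpha_j(j, s=12):
    """MA(1) coefficient in delta_j(L) gamma_{j,t} = (1 - alpha_j L) w_{j,t},
    from matching the lag-0 and lag-1 autocovariances of the numerator of
    (5.6), selecting the invertible root (own derivation, for illustration)."""
    lam = 2 * np.pi * j / s
    c = np.cos(lam)
    if j == s // 2:
        return -1.0        # the (1 + L) factor at the Nyquist frequency
    if abs(c) < 1e-12:
        return 0.0         # lag-one autocovariance vanishes at lambda = pi/2
    return (1 - abs(np.sin(lam))) / c

print([round(alpha_j(j), 4) for j in range(1, 7)])
# [0.5774, 0.2679, 0.0, -0.2679, -0.5774, -1.0]
```

Each returned root satisfies the quadratic above and lies inside the unit interval, so the implied MA(1) factors are invertible.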

Bell (2004) reported the values of αj and the relationship between σ²_j and σ²_{ω,j} for monthly data with s = 12. Appendix 5.A also gives a full treatment of the analytical derivation of the αj and σ²_j for completeness. Furthermore, let

\[
\prod_{j=1}^{s/2} \delta_j(L) = S(L) = 1 + L + L^2 + \cdots + L^{s-1} \tag{5.9}
\]


be the seasonal ‘summation’ operator which makes S(L)γt stationary. Then we have the following equation

\[
S(L)\gamma_t = \sum_{j=1}^{s/2} S(L)\gamma_{j,t}
= \sum_{j=1}^{s/2} \frac{S(L)}{\delta_j(L)} (1 - \alpha_j L)\, w_{j,t}
= \sum_{j=1}^{s/2} \prod_{i \neq j} \delta_i(L)\, (1 - \alpha_j L)\, w_{j,t}, \tag{5.10}
\]

where the right-hand side is the sum of s/2 independent MA(s − 2) processes, since there is no power in the polynomial ∏_{i≠j} δ_i(L)(1 − α_j L) higher than s − 2 for each seasonal frequency j; S(L)γt is thus itself also an MA(s − 2) process, see also Harvey (1989, § 2.4.3).

The FS-BSMs and FS-AMs can be most conveniently related by expressing both models in their stationary forms. For the FS-BSM, we specify the trend µt as in (5.2)-(5.3) by

\[
\mu_t = \frac{\eta_{t-1}}{\Delta} + \frac{\zeta_{t-2}}{\Delta^2}, \tag{5.11}
\]

with ∆^m = (1 − L)^m. The stationary form of the seasonal component is given by (5.10). Substituting equations (5.10) and (5.11) into the FS-BSM (5.1) yields

\[
y_t = \frac{\eta_{t-1}}{\Delta} + \frac{\zeta_{t-2}}{\Delta^2}
+ \frac{\sum_{j=1}^{s/2} \prod_{i \neq j} \delta_i(L)\,(1 - \alpha_j L)\, w_{j,t}}{S(L)}
+ \varepsilon_t. \tag{5.12}
\]

The minimum order of differencing for yt is then given by

\[
\Delta^2 S(L) = (1 - L)^2 (1 + L + L^2 + \cdots + L^{s-1}) = (1 - L)(1 - L^s) = \Delta \Delta_s, \tag{5.13}
\]

with ∆_m = 1 − L^m, and the stationary reduced form of yt in the FS-BSM becomes

\[
\Delta \Delta_s y_t = \Delta_s \eta_{t-1} + S(L)\zeta_{t-2}
+ \Delta^2 \sum_{j=1}^{s/2} \prod_{i \neq j} \delta_i(L)\,(1 - \alpha_j L)\, w_{j,t}
+ \Delta \Delta_s \varepsilon_t, \tag{5.14}
\]

where the αj are known as exemplified in Table 5.5 in Appendix 5.A for s = 12.
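The polynomial identity in (5.13) can be verified mechanically by convolving coefficient vectors (ascending powers of L):

```python
import numpy as np

s = 12
S_poly = np.ones(s)                      # S(L) = 1 + L + ... + L^{s-1}
d = np.array([1.0, -1.0])                # (1 - L)

lhs = np.polymul(np.polymul(d, d), S_poly)   # (1 - L)^2 S(L)

ds = np.zeros(s + 1)                     # (1 - L^s)
ds[0], ds[-1] = 1.0, -1.0
rhs = np.polymul(d, ds)                  # (1 - L)(1 - L^s)

print(np.allclose(lhs, rhs))             # True
```

Both sides come out as the degree-13 polynomial 1 − L − L^12 + L^13, which is exactly the ∆∆12 operator applied to monthly data.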

Since the highest polynomial order in the above equation is s + 1, ∆∆syt in (5.14) is
an MA(s + 1) process with at most 3 + s/2 parameters to estimate when no

restrictions are imposed on the parameters of the FS-BSM. When the restriction that

all seasonal innovation variances, σ2ω,j, have the same value for j = 1, . . . , s/2 applies,

the number of parameters will be reduced to four (independent of s). In the case of a

grouped FS-BSM, there will be five unknown parameters where three are non-seasonal

and two are seasonal parameters.


5.3.2 Dynamic properties of a FS-BSM

As an alternative for grouped FS-BSMs we consider grouped FS-AMs as in Aston et al.

(2007). The unrestricted form for monthly series is given by

\[
\Delta \Delta_{12} y_t = (1 - aL - bL^2)\Big[(1 + c_6 L) \prod_{i=1}^{5} \big(1 - 2 c_i \cos(2\pi i/12) L + c_i^2 L^2\big)\Big] \varepsilon_t.
\]

We provide a summary of the FS-AMs in Appendix 5.C. The FS-AMs for monthly data allow an MA(13) representation for ∆∆12yt. The 4-coefficient FS-AMs, denoted 4−5−1{i}, 4−4−2{i, j} and 4−3−3{i, j, k}, have two parameters for the trend-cycle component and two parameters for the frequency specific seasonal component next to an overall variance parameter σ²_ε. They are comparable to the FS-BSMs with two

groups of seasonal variances in terms of parsimony. We compare two of these models in

a Monte Carlo study in Section 5.4. For an econometric approach to frequency specific

seasonal AR models, we refer to the paper of Hylleberg et al. (1990).

Figure 5.2 shows the characteristics of two simulated monthly time series with a

length of 30 years. The first series is based on the BSM and the second series is based

on FS-BSM(5,2,{2,3,5}/12). The data generating process (DGP) for the BSM is given

by the following parameters: ση = 0.02, σζ = 0.0002, σω = 0.004 and σε = 0.03, while

the DGP of FS-BSM(5,2,{2,3,5}/12) is given by ση = 0.02, σζ = 0.0002, σω,I = σω,2 =

σω,3 = σω,5 = 0.002, σω,II = σω,1 = σω,4 = σω,6 = 0.006 and σε = 0.03.

The sample ACF of ∆∆12yt (third column in Figure 5.2) indicates an MA(13) process,
as the autocorrelations at lags greater than 13 become insignificant. The theoretical ACF based

on the DGP (fourth column in the same figure) illustrates this fact. Appendix 5.B

gives details in the mathematical formulation to compute the theoretical autocovariance

function (ACVF) from where we can derive the ACF. Note that the ACVF of ∆∆12yt

in (5.14) for s = 12 equals the ACVF of a restricted MA(13) process. The non-zero

elements of the ACVF are given by 14 equations

γ(k) = Cov(∆∆12yt,∆∆12yt−k), (5.15)

for k = 0, . . . , 13. Once we estimate the parameters of FS-BSM, the ACVF can be

derived numerically by the method of Appendix 5.B. The 14 ACVF moment equa-

tions over-identify the five parameters of the model. Given the ACVF of the FS-BSM,

we obtain its reduced form MA(13) representation numerically using the method of

Tunnicliffe-Wilson (1969). Note that for the monthly FS-AMs, we obtain the MA(13)

coefficients simply by multiplying the lag polynomials in the model.
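That last remark can be illustrated directly: multiplying out the lag polynomials of the monthly FS-AM displayed in the previous subsection yields its MA(13) coefficients (the parameter values below are illustrative, not estimates):

```python
import numpy as np

def fs_am_ma13(a, b, c):
    """MA(13) lag-polynomial coefficients (ascending powers of L) of the
    frequency specific Airline model
    (1 - a L - b L^2)(1 + c6 L) prod_{i=1}^{5}(1 - 2 c_i cos(2 pi i/12) L + c_i^2 L^2);
    the list `c` holds c_1, ..., c_6."""
    poly = np.array([1.0, -a, -b])
    for i in range(1, 6):
        ci = c[i - 1]
        poly = np.convolve(poly, [1.0, -2.0 * ci * np.cos(2 * np.pi * i / 12),
                                  ci ** 2])
    return np.convolve(poly, [1.0, c[5]])   # the (1 + c6 L) factor

theta = fs_am_ma13(a=0.6, b=0.0, c=[0.7] * 6)   # illustrative values
print(theta.size, theta[0])                      # 14 1.0
```

Polynomial multiplication is just coefficient convolution, so the resulting vector has 14 entries (lags 0 through 13) with a leading coefficient of one.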

Looking at the sample ACF of the first simulated series (based on BSM), there are

significant peaks at lags 1, 10 (marginally), 12 and 13, while the remaining autocorrelations
are almost zero. The theoretical autocorrelations show peaks at lags 1, 11, 12 and 13.

This gives a link between the BSM and the Airline model as the latter is actually a


restricted MA(13) process with zero ACF at lags 2 through 10 and non-zero ACF at lags 1,

11, 12 and 13. For the second simulated series based on FS-BSM(5,2,{2,3,5}/12), the

sample and theoretical ACF are more complicated since some values between lag 2 and

11 are clearly significant, see the third and fourth columns on the bottom row of Figure

5.2. This shows that the reduced form FS-BSM(5,2,{2,3,5}/12) is a less restrictive

MA(13) process which gives the connection between FS-BSMs and FS-AMs in general.

How close these models are to each other will be investigated in the next subsection.

5.3.3 Distance between FS models using MA coefficients

It is well-known that MA processes are fully identified by their autocovariance functions

(ACVF) and we will use this fact to look for the similarities between FS-BSMs and the

FS-AMs. For the comparison between FS-BSM and FS-AM, we use the five-parameter case of both models as an example.

To compute how close these models are to each other, we define the following dis-

tance metric,

\[
D = \sqrt{\sum_{i=1}^{13} \big[\theta^*_i - \theta_i\big]^2}, \tag{5.16}
\]

where θ*_i are the estimated MA coefficients of the FS-AMs and θ_i are the MA coefficients resulting from the reduced form FS-BSMs, for i = 1, . . . , 13. Note that the MA(13) coefficients from the reduced form FS-BSM are computed numerically given the theoretical

ACVF and using the method of Tunnicliffe-Wilson (1969). When the value of D is

small, we regard the models to be close to each other since the MA(13) coefficients

from both models are then similar. Since the original Airline model and the standard

BSM are special cases of the FS-AMs and FS-BSMs respectively, the distance metric

D can also be applied straightforwardly to these models.

In addition to the distance metric D, we also define the relative distance metric,

\[
RD = \frac{D_{\mathrm{nonFS}} - D_{\mathrm{FS}}}{D_{\mathrm{nonFS}}}, \tag{5.17}
\]

where DnonFS is the distance metric between the MA(13) representation of the reduced

form BSM and the Airline model (non FS models), while DFS is the distance metric

for each type of FS models. A positive (negative) relative distance metric indicates

how much the similarity between the FS versions of the Airline model and the BSM

increases (decreases) compared to the similarity between their non-FS counterpart.
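Both metrics are straightforward to compute once the two MA(13) coefficient vectors are available; a sketch with illustrative coefficient vectors (not estimates from the thesis):

```python
import numpy as np

def distance_D(theta_star, theta):
    """Quadratic distance (5.16) between the MA(13) coefficients of an
    FS-AM (theta_star) and of the reduced-form FS-BSM (theta)."""
    theta_star = np.asarray(theta_star, dtype=float)
    theta = np.asarray(theta, dtype=float)
    assert theta_star.size == theta.size == 13
    return np.sqrt(np.sum((theta_star - theta) ** 2))

def relative_RD(d_nonfs, d_fs):
    """Relative distance (5.17): positive when the FS pair of models is
    closer together than the non-FS pair."""
    return (d_nonfs - d_fs) / d_nonfs

# illustrative coefficient vectors: the FS pair differs by a smaller
# perturbation than the non-FS pair, so RD comes out positive
rng = np.random.default_rng(1)
th = rng.normal(scale=0.1, size=13)
d_fs = distance_D(th, th + 0.01)
d_nonfs = distance_D(th, th + 0.03)
print(relative_RD(d_nonfs, d_fs) > 0)   # True
```

With constant coefficient offsets as above, D reduces to the offset times √13, which makes the sign of RD easy to predict by hand.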


[Figure 5.2 here: simulated series, ∆∆12 differences, sample ACF and theoretical ACF for the BSM (top row) and for the FS-BSM (bottom row)]

Figure 5.2: (Top row) Time series plots of yt and ∆∆12yt, where yt is a simulated monthly series based on the BSM over a long period (30 years), the sample ACF of ∆∆12yt, and the theoretical ACF, with the following DGP: ση = 0.02, σζ = 0.0002, σω = 0.004 and σε = 0.03. (Bottom row) Similar plots based on FS-BSM(5,2,{2,3,5}/12) with the following DGP: ση = 0.02, σζ = 0.0002, σω,I = σω,1 = σω,4 = σω,6 = 0.006, σω,II = σω,2 = σω,3 = σω,5 = 0.002 and σε = 0.03.


5.3.4 Monte-Carlo studies for frequency specific models

In this subsection, we report the results of a small-scale simulation study to obtain an

indication of the range of our distance metrics (5.16). The results are based on 1000

simulated time series with a length of 120 data points. First, we generate 1000 series

from a BSM with ση = 0.02, σζ = 0.0001, σω = 0.003 and σε = 0.03, and we fit

these series with both the BSM and the Airline model. Second, we generate another

1000 series from an FS-BSM(5,2,{1,4}/12) with ση = 0.02, σζ = 0.0001, σω,I = σω,1 =

σω,4 = 0.005, σω,II = σω,2 = σω,3 = σω,5 = σω,6 = 0.002 and σε = 0.03 and we fit these

series with the FS-BSM(5,2,{1,4}/12) and with the corresponding FS-AM 4-4-2(1,4).

Further we compute the MA(13) coefficients of the reduced form BSM and FS-BSM

and calculate the distance metric D for each time series.
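The first Monte Carlo design can be sketched as follows: simulate 1000 BSM paths of 120 observations from (5.1)-(5.5) with the stated standard deviations (our own simulation code, for illustration; the model-fitting step is omitted):

```python
import numpy as np

def simulate_bsm(n=120, s=12, sig_eta=0.02, sig_zeta=0.0001,
                 sig_omega=0.003, sig_eps=0.03, seed=0):
    """Simulate one series from the BSM (5.1)-(5.5) with equal seasonal
    variances, as in the first Monte Carlo design (n = 120 months)."""
    rng = np.random.default_rng(seed)
    mu, beta = 0.0, 0.0
    # one rotating pair (gamma_j, gamma_j*) per seasonal frequency
    states = [np.zeros(2) for _ in range(s // 2)]
    rots = [np.array([[np.cos(l), np.sin(l)], [-np.sin(l), np.cos(l)]])
            for l in (2 * np.pi * (j + 1) / s for j in range(s // 2))]
    y = np.empty(n)
    for t in range(n):
        gamma = sum(st[0] for st in states)
        y[t] = mu + gamma + rng.normal(scale=sig_eps)       # (5.1)
        mu += beta + rng.normal(scale=sig_eta)              # (5.2)
        beta += rng.normal(scale=sig_zeta)                  # (5.3)
        states = [R @ st + rng.normal(scale=sig_omega, size=2)
                  for R, st in zip(rots, states)]           # (5.5)
    return y

series = [simulate_bsm(seed=k) for k in range(1000)]
print(len(series), series[0].shape)    # 1000 (120,)
```

Replacing the scalar `sig_omega` by group-specific values yields the FS-BSM(5,2,{1,4}/12) design of the second experiment.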

Figure 5.3 presents the histogram of the distance metrics. It confirms our expecta-

tion that the FS models lie closer to each other than the non-FS models. The median

of the distance metric density in the FS models lies around 0.1 while for the non-FS

models it lies around 0.13. From both density plots, none of the distance metrics is

calculated as zero, which means that the UC type of models (BSM and FS-BSMs)

never display exactly the same fit as the ARIMA type of models (Airline model and

FS-AMs respectively), see also the evidence reported in Maravall (1985) and Harvey

(1989). However, the differences are often small. Moreover, in many cases the models

lead to similar time series decompositions into seasonal and non-seasonal component

estimates. Seasonal adjustment for FS-AMs is based on a canonical decomposition as

in Hillmer and Tiao (1982). Seasonal adjustment for FS-BSMs is based on minimum

mean squared error estimates of γt. We show examples of seasonal adjustment in our

empirical analysis.

5.4 Empirical results

In this section we report results of our empirical study into frequency specific seasonal

time series models for 75 monthly U.S. Census Bureau series. Time series plots on manufacturing and foreign trade were shown in Figure 5.1 in the Introduction as examples. We observe a strong presence of seasonal patterns in both series. The database of 75 time series consists of 36 monthly Manufacturers' Shipments, Inventories and Orders Survey series from January 1992 until September 2001, and 39 monthly Foreign

Trade series (Imports and Exports) from January 1989 through November 2001. The

database has been analysed previously with the Airline model and the FS-AMs by

Aston et al. (2007). In this chapter state space methods are adopted for parameter

estimation of related FS-BSMs by maximum likelihood. All calculations are carried out

by the object-oriented matrix programming environment Ox of Doornik (2006) and the


[Figure 5.3 here: histograms of the distance metric D for the non-FS pair (top) and the FS pair (bottom)]

Figure 5.3: Density and histogram of the distance metric D (defined in equation (5.16)), based on a simulation study with 1000 replications. (Top panel) Distance metric between the BSM and the Airline model for a BSM data generating process. (Bottom panel) Distance metric between FS-BSM(5,2,{1,4}/12) and FS-AM 4-4-2(1,4) for an FS-BSM data generating process. The dotted line is the estimated density as a smooth function through the histogram with a Gaussian kernel and an optimal bandwidth (set by OxMetrics; see Silverman (1986)).


SsfPack library of Koopman et al. (2008).

5.4.1 Two examples comparing BSM and FS-BSM

We first analyse in more detail time series U37AVS (shipment of household furniture

and kitchen cabinets) and X41140 (export of musical instruments) of the U.S. Census

database as presented in Figure 5.1. We estimate model (2.1)-(2.5) and add regression

variables to equation (2.1) to account for trading days, Easter effect and outlier effects.

The regression variables and the choice for logs versus no-logs are the same as in Aston

et al. (2007). When effects are not significant, the regression variables are removed.

We impose different frequency specific restrictions and select FS-BSM(5,2,{4}/12) for

series U37AVS.

We present the parameter estimates of log(U37AVS) from both BSM and FS-

BSM(5,2,{4}/12) in Table 5.1. We notice that the AIC(c) of FS-BSM(5, 2, {4}/12)

is lower than the AIC(c) resulting from the BSM and that the average of the estimated

seasonal disturbance variances for the FS-BSM is significantly higher than the esti-

mated seasonal variance for the BSM. The decomposition of the time series into trend,

seasonal and irregular is presented in Figure 5.4. The components are estimated by the

Kalman filter applied to the state space form of model (2.1)-(2.5).
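As a minimal illustration of the filtering machinery used here (not the full BSM of (2.1)-(2.5)), the sketch below runs the Kalman filter for a local level model, the simplest special case; the function name and hyperparameters are illustrative only.

```python
import numpy as np

def local_level_filter(y, q, r, a0=0.0, p0=1e7):
    """Kalman filter for the local level model:
        y_t = mu_t + eps_t,      eps_t ~ N(0, r)
        mu_{t+1} = mu_t + eta_t, eta_t ~ N(0, q).
    Returns the filtered estimates of mu_t; the full BSM uses the same
    recursions on a larger state vector (trend, slope, seasonal)."""
    a, p = a0, p0
    filtered = []
    for obs in y:
        v = obs - a          # one-step-ahead prediction error
        f = p + r            # prediction error variance
        k = p / f            # Kalman gain
        a = a + k * v        # filtered level
        p = p * (1.0 - k)    # filtered variance
        filtered.append(a)
        p = p + q            # predict next period (random-walk level)
    return np.array(filtered)

# with a constant series the filtered level converges to that constant
y_const = np.ones(50)
mu_hat = local_level_filter(y_const, q=0.01, r=1.0)
```

The diffuse-like initialisation `p0=1e7` makes the first observation essentially determine the initial level, mimicking exact diffuse initialisation of the state.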

We apply the parametric Cramer-von Mises (CvM) seasonality test of Harvey (2001)

and Busetti and Harvey (2003) to determine whether the estimated seasonal parame-

ters are significantly larger than zero. This test is used since the standard tests for σω,I = 0

or σω,II = 0 do not apply. The parametric test is based on transformations of the standardised one-step-ahead prediction errors of the estimated model, where the seasonal

variance is put to zero after estimation. Busetti and Harvey (2003) explained that for

monthly time series, the test statistics for the first five seasonal frequencies have two

degrees of freedom and the sixth frequency has only one degree of freedom. The total

degrees of freedom is therefore equal to 11. The critical values of the CvM distribution

for the different degrees of freedom are given in Harvey (2001). For the BSM, we take

the sum of individual test statistics for each frequency and compare it with the criti-

cal value of CvM distribution with 11 degrees of freedom. For the grouped FS-BSM,

the degrees of freedom depend on how many frequencies are placed in one group. In

the case of FS-BSM(5, 2, {4}/12), the CvM test statistic at seasonal frequency 4/12

has two degrees of freedom, while the remaining five frequencies ({1, 2, 3, 5, 6}/12) have

nine degrees of freedom.
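The degrees-of-freedom bookkeeping above is mechanical and can be sketched as follows (names are ours):

```python
# Degrees of freedom for the CvM seasonality test with monthly data:
# frequencies j = 1..5 contribute 2 df each, frequency j = 6 contributes 1 df.
df_per_freq = {j: (2 if j < 6 else 1) for j in range(1, 7)}

def group_df(group):
    """Sum the individual degrees of freedom over the frequencies in a group."""
    return sum(df_per_freq[j] for j in group)

total_df = group_df(range(1, 7))        # BSM: one shared variance -> 11 df
df_group_II = group_df([4])             # FS-BSM(5,2,{4}/12): tested group {4} -> 2 df
df_group_I = group_df([1, 2, 3, 5, 6])  # remaining frequencies -> 9 df
```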

From the results in Table 5.1, it is clear that log(U37AVS) requires a stochastic

seasonal component since the null hypothesis of a deterministic seasonal component is

rejected at 5% significance level according to the CvM seasonality test for both BSM

and FS-BSM(5, 2, {4}/12). The LR-test applied to the seasonal parameters also leads


to rejection of H0 : σ²ω,I = σ²ω,II, which advocates the use of the FS model. The estimated

trend variance for FS-BSM(5, 2, {4}/12) is higher than the estimated variance for the

BSM and the estimated irregular variance is lower for the FS-BSM than for the BSM. In

sum, all results indicate that FS-BSM(5, 2, {4}/12) gives a better fit than the standard

BSM for furniture shipment time series log(U37AVS).
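The LR statistic behind this rejection can be reproduced from the log-likelihoods reported in Table 5.1; the one degree of freedom reflects the single extra seasonal variance of the FS-BSM, and 3.841 is the standard χ²(1) critical value at the 5% level.

```python
# LR test of H0: sigma2_omega,I = sigma2_omega,II for log(U37AVS),
# using the log-likelihoods from Table 5.1 (top panel)
loglik_bsm, loglik_fs = 196.94, 201.72
lr_stat = 2.0 * (loglik_fs - loglik_bsm)    # likelihood ratio statistic
chi2_crit_1df_5pct = 3.841                  # chi-squared(1) 5% critical value
reject_h0 = lr_stat > chi2_crit_1df_5pct    # True -> favour the FS-BSM
```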

The estimated trend resulting from FS-BSM(5, 2, {4}/12) in Figure 5.4 is less smooth

than the estimated trend resulting from the BSM. The estimated seasonal component

for the FS-BSM is more erratic compared to the one for the BSM, see the same figure.

We also notice that the estimated irregular component for the FS-BSM is less volatile

than the one for the BSM. Some level of heteroskedasticity in both models can be

detected, but it is difficult to include this factor in the model given the modest length

of the time series.

The standard normality test and the Ljung-Box Q-statistic for the prediction errors

are reported in Table 5.2. From these diagnostics, it is clear that the FS-BSM fits

the series better than the BSM: no residual autocorrelation at any lag can be found in the standardised prediction errors produced by the FS-BSM. In contrast, the prediction errors produced by the BSM show traces of autocorrelation at the lower lags. For both models, the normality test is satisfactory.

Figure 5.5 presents the seasonally adjusted time series for log(U37AVS). The FS-

BSM produces smoother seasonally adjusted series than the BSM. This result is caused

by stronger extraction of the seasonal component for each frequency in the FS-BSM.

Since a smoother seasonally adjusted series is often preferred in practice, we regard the

FS-BSM for U37AVS series as an improvement in the model-based seasonal adjustment

method, despite the rougher trend estimate.
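Model-based seasonal adjustment simply subtracts the estimated seasonal component from the observed series, and the smoothness comparison above can be quantified, for instance, by the variance of first differences. The sketch below uses a synthetic decomposition; all names are illustrative, not the thesis code.

```python
import numpy as np

def seasonally_adjust(y, seasonal_hat):
    """Model-based seasonal adjustment: subtract the estimated
    seasonal component from the observed series."""
    return np.asarray(y) - np.asarray(seasonal_hat)

def roughness(x):
    """Simple smoothness measure: variance of first differences
    (smaller value = smoother series)."""
    return np.var(np.diff(x))

# toy illustration with a known decomposition
t = np.arange(120)
trend = 7.5 + 0.002 * t
seasonal = 0.1 * np.sin(2 * np.pi * t / 12)
y = trend + seasonal
sa = seasonally_adjust(y, seasonal)
```

With a perfectly extracted seasonal component the adjusted series equals the trend, which is of course smoother than the raw series.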

As our second example, we analyse the log export of musical instruments series,

log(X41140), to illustrate that FS-BSMs do not always lead to better results. We

consider the BSM and 31 types of FS-BSMs to model log(X41140) and it turns out

that the estimated log-likelihoods do not really improve for the 31 choices of FS-BSMs,

see bottom panel of Table 5.1. Since the log-likelihoods from both models are equal,

the AIC(c) of FS-BSMs will be higher than the AIC(c) of the BSM. This indicates that

the BSM is preferred over the FS-BSMs to fit this particular time series. The CvM

seasonality test shows that the series has a close to fixed seasonal pattern. Modelling

the series with any FS-BSM does not lead to a better result. The Ljung-Box Q-statistic

of the BSM shows no significant residual autocorrelation in the prediction errors, so we

conclude that modelling X41140 with the BSM with a nearly fixed seasonal component

is satisfactory.


Table 5.1: Estimated parameters of log furniture shipment series, log(U37AVS), using BSM and FS-BSM(5, 2, {4}/12) (top panel), and of log export musical instrument series, log(X41140), using BSM and FS-BSM(5, 2, {5, 6}/12) (bottom panel).

log(U37AVS):

  Model    Freq.      Est.par            S.E.      CvM      df
  BSM                 ση      0.0080     0.0043
                      σζ      0.0011     0.0006
                      σε      0.0241     0.0031
           {i}/12     σω      0.0009     0.0005    2.910*   11
           Log L 196.94    AIC -385.87    AICc -385.52
  FS-BSM              ση      0.0132     0.0035
                      σζ      0.0006     0.0006
                      σε      0.0154     0.0045
           {i}/12     σω,I    0.0008     0.0004    2.513*   9
           4/12       σω,II   0.0054     0.0016    1.032*   2
           Log L 201.72    AIC -393.44    AICc -392.90

log(X41140):

  Model    Freq.      Est.par             S.E.      CvM      df
  BSM                 ση      0.0262      0.0062
                      σζ      5.12e-07    0.0005
                      σε      0.0634      0.0052
           {i}/12     σω      1.81e-07    0.0009    1.7834   11
           Log L 145.44    AIC -282.88    AICc -282.61
  FS-BSM              ση      0.0262      0.0062
                      σζ      8.64e-08    0.0005
                      σε      0.0634      0.0052
           {i}/12     σω,I    3.45e-07    0.0012    1.2939   8
           {5,6}/12   σω,II   1.96e-07    0.0015    0.4895   3
           Log L 145.44    AIC -280.88    AICc -280.48

Notes: Maximum likelihood estimation of model (2.1)-(2.5) extended with regressors for trading days, outliers, and the Easter effect. Regression parameters are not included in AIC and AICc. Standard errors are obtained using a numerical estimate of the Hessian of the log-likelihood and the delta method. * means that H0 : σ²ω = 0 (for BSM) or H0 : σ²ω,i = 0 (for FS-BSM), with i = I, II, is rejected at the 5% significance level. For the BSM, all seasonal frequencies have the same variance, so the degrees of freedom in the CvM seasonality test equal the sum of the individual degrees of freedom. For the grouped FS-BSM, the degrees of freedom for each group equal the sum of the individual degrees of freedom within the tested group. Freq. = seasonal frequency; Est.par = estimated parameter; S.E. = estimated standard error; CvM = Cramer-von Mises test statistic; df = CvM degrees of freedom.


Figure 5.4: Time series decomposition of log furniture and kitchen cabinet shipment series, log(U37AVS), using BSM (top row) and FS-BSM(5,2,{4}/12) (bottom row). Panels show the trend, seasonal and irregular components for each model, 1992-2002.


Table 5.2: p-values from Ljung-Box Q-statistics and normality test of the residuals of log furniture shipment series, log(U37AVS), resulting from BSM and FS-BSM(5, 2, {4}/12).

LB Statistics BSM FS-BSM LB Statistics BSM FS-BSM

Q(5) 0.0027 ∗∗∗ - Q(15) 0.0473 ∗ 0.5748

Q(6) 0.0011 ∗∗∗ 0.0505 Q(16) 0.0453 ∗ 0.5318

Q(7) 0.0007 ∗∗∗ 0.1271 Q(17) 0.0566 0.4702

Q(8) 0.0015 ∗∗∗ 0.0738 Q(18) 0.0757 0.5247

Q(9) 0.0025 ∗∗∗ 0.1391 Q(19) 0.0981 0.5789

Q(10) 0.0051 ∗∗ 0.1934 Q(20) 0.1293 0.6445

Q(11) 0.0098 ∗∗ 0.2859 Q(21) 0.0572 0.2279

Q(12) 0.0147 ∗ 0.3890 Q(22) 0.0761 0.2777

Q(13) 0.0239 ∗ 0.4947 Q(23) 0.0970 0.3336

Q(14) 0.0379 ∗ 0.5196 Q(24) 0.0985 0.2841

Normality test 0.5494 0.6974

∗,∗∗ , and ∗∗∗ symbolise that the p-value is asymptotically significant at 5%, 1%, and

0.5% rejection levels respectively, with H0 of no autocorrelation in the residuals.



Figure 5.5: Estimated seasonal component of log furniture shipment series, log(U37AVS),

plotted separately for each month and the seasonally adjusted series. BSM’s estimates are

shown at the top row while the estimates from FS-BSM(5,2,{ 4 }/12) are presented at the

bottom row.


5.4.2 BSM vs FS-BSM in 75 time series

To obtain a full assessment of frequency specific models, we analyse 75 time series using

the standard BSM and the grouped FS-BSMs. For each time series, we fit 31 grouped

FS-BSMs and estimate the 5 parameters σ²η, σ²ζ, σ²ω,I, σ²ω,II, and σ²ε. Given

these estimates, we compute the MA(13) coefficients of the reduced form FS-BSMs and

the corresponding innovation variance. Next we compute the log-likelihood function of

the series based on these MA coefficients and the innovation variance. We then take the

model with the lowest corrected Akaike criterion function (AICc) as our final model. In

terms of numbers, 57 out of 75 time series (76%) are fitted better by at least one type

of FS-BSMs compared to the BSM according to the AIC, 55 series (73%) according to

the AICc and 35 series (46%) according to the LR-test with a significance level of 5%.

These percentages show the potential of the grouped FS-BSMs as an improvement to

the standard BSM.
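The AIC(c)-based selection can be sketched as follows, assuming the usual definitions AIC = -2 log L + 2k and AICc = AIC + 2k(k+1)/(n-k-1), with k the number of variance parameters (regression parameters excluded, as in the notes to Table 5.1) and n = 117 monthly observations for the U37AVS sample (January 1992 to September 2001); these conventions reproduce the criteria reported in Table 5.1.

```python
def aic(loglik, k):
    """Akaike information criterion."""
    return -2.0 * loglik + 2.0 * k

def aicc(loglik, k, n):
    """Corrected Akaike criterion used for model selection."""
    return aic(loglik, k) + 2.0 * k * (k + 1) / (n - k - 1)

# reproduce the criteria reported in Table 5.1 for log(U37AVS)
n = 117                              # monthly observations, Jan 1992 - Sep 2001
aicc_bsm = aicc(196.937, k=4, n=n)   # BSM: 4 variance parameters
aicc_fs = aicc(201.722, k=5, n=n)    # FS-BSM(5,2,{4}/12): 5 parameters
best = min([('BSM', aicc_bsm), ('FS-BSM', aicc_fs)], key=lambda m: m[1])
```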

5.4.3 How close are these (FS-) models to each other?

In Section 5.3 we have discussed the distance metric D defined in (5.16). It indicates

how close the linear dynamic properties of the two models are to each other. Using

this distance metric, we compare the UC-type of models (BSM and FS-BSMs) to the

ARIMA-type of models (Airline model and FS-AMs). All models are estimated using

exact maximum likelihood and the same set of extra regressors for each series.

In Table 5.3 we compare the restricted reduced form MA(13) coefficients of the

Airline model with those of the restricted reduced form BSM for the U37AVS series (in

logs). For the FS models, we compare FS-AM 4-4-2(4) and the restricted reduced form

FS-BSM(5,2,{4}/12). The Airline model and the FS-AMs are also restricted MA(13)

models. The estimates of the MA(13) coefficients for the Airline model and the FS-AMs

are denoted by θ∗i while for BSM and FS-BSMs these coefficients are denoted by θi with

i = 1, . . . , 13.
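Because the Airline model's stationary form is the product of a nonseasonal and a seasonal MA(1) factor, its reduced-form MA(13) polynomial has nonzero coefficients only at lags 1, 12 and 13, with the lag-13 coefficient equal to the product of the other two (up to the sign convention used for the factors). A quick check by polynomial multiplication (function name is ours):

```python
import numpy as np

def airline_ma_coeffs(theta1, theta12):
    """Reduced-form MA(13) coefficients of the Airline model written as
    (1 + theta1*L)(1 + theta12*L^12), in ascending powers of L.
    Sign conventions for theta1, theta12 differ across software; both
    factors carry a plus sign here."""
    nonseasonal = np.array([1.0, theta1])   # 1 + theta1*L
    seasonal = np.zeros(13)                 # 1 + theta12*L^12
    seasonal[0], seasonal[12] = 1.0, theta12
    return np.convolve(nonseasonal, seasonal)  # length 14: lags 0..13

# magnitudes taken from the Airline column of Table 5.3
coeffs = airline_ma_coeffs(0.609, 0.667)
```

Only four of the fourteen coefficients (lags 0, 1, 12, 13) are nonzero, which is exactly the restriction the FS models relax at the intermediate lags.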

We notice that the MA(13) coefficients of the reduced form BSM at lags 2 to 11

(θ2, . . . , θ11 in Table 5.3) tend to zero while the values of θ1, θ12 and θ13 tend to θ∗1, θ∗12 and

θ∗13 of the Airline model. The distance metric D between the two models is computed

at 0.128. D = 0 means that both models produce the same time series decomposition.

Note that time series decomposition for the Airline model is based on a canonical

decomposition as in Hillmer and Tiao (1982). For the FS version of the models, we have

opted for FS-BSM(5,2,{4}/12) as it produced the lowest AIC(c). For the 5-parameter

FS-AMs, the FS-AM 4-4-2(4) produced the lowest AIC(c). These particular FS models

also gave the smallest distance metric among all other combinations and D is computed

at 0.065 for the log(U37AVS) series. Note that both FS models show MA-coefficients


with an intra-annual periodic pattern.

Table 5.3: Restricted reduced form MA(13) coefficients of the (FS-)AM and (FS-)BSM

applied to log furniture shipment series, log(U37AVS).

Airline model BSM FS-AM 4-5-1(4) FS-BSM(5,2,{4}/12)

i θ∗i θi θ∗i θi

1 0.609 0.651 0.607 0.634

2 0 -0.045 0.051 0.021

3 -0.010 -0.348 -0.340

4 -0.043 0.269 0.260

5 . -0.010 0.046 0.046

6 . -0.040 -0.319 -0.345

7 . -0.008 0.247 0.269

8 -0.037 0.043 0.018

9 -0.006 -0.293 -0.282

10 -0.032 0.227 0.230

11 0 -0.002 0.039 0.059

12 0.667 0.697 0.440 0.431

13 -0.406 -0.478 -0.222 -0.211

log(σ) -3.341 -3.355 -3.394 -3.398

Log L 196.162 196.937 201.569 201.722

D 0.128 0.065
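Equation (5.16) defining D is not reproduced in this section; purely for illustration, the sketch below treats the distance between two restricted reduced-form MA(13) models as the Euclidean norm of the difference of their coefficient vectors. The actual metric may weight or extend this (e.g. with the innovation variances), but the stand-in shares the key property used here: D = 0 exactly when the two models imply the same linear dynamics.

```python
import numpy as np

def distance(theta_a, theta_b):
    """Illustrative distance between two restricted reduced-form MA(13)
    models: Euclidean norm of the difference of their coefficient
    vectors (a stand-in for the metric D of equation (5.16))."""
    return float(np.linalg.norm(np.asarray(theta_a) - np.asarray(theta_b)))

# identical coefficient vectors are at distance zero: the two models
# then produce the same time series decomposition
theta = np.array([0.609, 0.0, 0.0, 0.667, -0.406])
d0 = distance(theta, theta)
```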

Next we compute the MA(13) coefficients of the reduced form BSM and FS-BSMs

and the corresponding distance metrics for all series in the database. The distance

metrics are presented in Figure 5.6. The smallest distance for each series does not

always correspond to the combination of lowest AIC(c) from FS-BSMs and FS-AMs.

However, the combination of lowest AIC(c) from both types of FS models does give

a distance metric that is smaller than the distance metric resulting from the non-FS

models. Therefore, the two FS classes of models are closer than the non-FS versions.

Two cases are of particular interest: series number 20 (U34KTI, Manufacturers’

Total Inventory of Electro-medical, Measuring and Control Instruments, dated from

January 1992 to September 2001) and series number 74 (X41140), exports of musical

instruments. These two time series have respectively the largest and smallest distance

metrics in the comparison between BSM and Airline model. The MA(13) coefficients

and the distance metrics for these series are presented in Table 5.4. For time series



Figure 5.6: (Top panel) Distance metric D in (5.16) between estimated BSM and Airline

models applied to 75 manufacturing and export monthly time series from the U.S. Census

Bureau database. (Bottom panel) The smallest distance between 31 types of grouped FS-

BSMs and 4-coefficients FS-AMs. For each series we find a smaller distance metric between

the FS models compared to the distance between the non-FS models.


Table 5.4: Restricted reduced form MA(13) coefficients of the (FS-)AM and the (FS-)BSM applied to log(U34KTI), manufacturing instrument inventory series (top panel), and log(X41140), export of musical instrument series (bottom panel).

log(U34KTI):

  i        Airline θ∗i   BSM θi    FS-AM 4-5-1(4) θ∗i   FS-BSM(5,2,{1,2,4}/12) θi
  1        -0.121        -0.078    -0.127               -0.124
  2         0            -0.251    -0.212               -0.220
  3                      -0.218    -0.298               -0.266
  4         .            -0.254    -0.199               -0.196
  5         .            -0.220    -0.196               -0.210
  6         .            -0.251    -0.275               -0.324
  7                      -0.219    -0.183               -0.203
  8                      -0.244    -0.180               -0.193
  9                      -0.208    -0.254               -0.241
  10        0            -0.213    -0.169               -0.184
  11                     -0.126    -0.166               -0.175
  12        0.668         0.464     0.490                0.494
  13        0.081        -0.046    -0.064               -0.067
  log(σ)   -4.306        -4.331    -4.348               -4.347
  Log L    296.705       299.491   300.674              300.709
  D        0.748 (Airline vs BSM)  0.068 (FS-AM vs FS-BSM)

log(X41140):

  i        Airline θ∗i   BSM θi       FS-AM 4-5-1(2) θ∗i   FS-BSM(5,2,{5,6}/12) θi
  1         0.663         0.663        0.663                0.663
  2         0             9.87e-005   -4.70e-016            5.77e-005
  3                      -7.63e-006    1.68e-015           -3.93e-006
  4         .             1.12e-005    1.39e-015            6.36e-006
  5         .             1.38e-006   -7.47e-017            9.55e-007
  6         .            -1.51e-006   -3.51e-015           -1.03e-006
  7                      -5.81e-006    1.87e-015           -3.41e-006
  8                      -1.23e-005   -2.85e-015           -7.62e-006
  9                      -3.42e-005   -2.74e-015           -2.01e-005
  10        0            -2.88e-005   -2.86e-015           -1.80e-005
  11                     -1.99e-004   -5.82e-015           -1.18e-004
  12        1.000         0.999        1.000                0.999
  13       -0.663        -0.663       -0.663               -0.663
  log(σ)   -2.5531       -2.5527     -2.5531              -2.5529
  Log L    145.44        145.44      145.44               145.44
  D        0.0008 (Airline vs BSM)   0.0005 (FS-AM vs FS-BSM)


U34KTI, the distance between the non-FS models is calculated at 0.748 while the

smallest distance between the FS models is only 0.068. The time series decomposition

based on the BSM and the Airline model will be different while for FS models the

decomposition will be similar.

Time series X41140 has the smallest distance in both FS and non-FS model compar-

isons with a distance as small as 0.0008 between the BSM and the Airline model, and a

minimum distance of 0.0005 in the comparison between FS-AMs and FS-BSMs. In this

case, modelling the time series using an FS model is not necessary since the seasonal

component is estimated as fixed, that is, θ∗12 = θ12 = 1 for all models in the last

four columns of Table 5.4. For the restricted reduced form BSM, the estimated MA

coefficient θ12 = 1 corresponds to σω ≈ 0 in the bottom panel of Table 5.1. The small

distance metric between the BSM and the Airline model is also due to the slope esti-

mate in the BSM which is almost constant. Maravall (1985) showed that if both slope

and seasonal components are estimated with zero variances (BSM with deterministic

seasonal and smooth trend component), the BSM will be equal to the Airline model

with a unit root in the seasonal coefficient. This finding applies to the case of the FS

models as well.

Figure 5.7 presents a 3-dimensional graph of the relative distance as defined in

(5.17). The model numbers as defined in Tables 6 and 7 are used as axes coordinates in

the FS-BSM and FS-AM direction. Models with a single specific frequency are close to

the origin as they are numbered one to six. We computed the average of the relative

distance when RD > 0 for all series in the database using 31 types of FS-BSMs and

the 4-coefficient version of FS-AMs. The relative distance is positive if the distance

between FS-BSMs and FS-AMs is smaller than the distance between non-FS models.

For all series in the database, we are always able to find a distance in the FS models

that is smaller than the distance in the non-FS models. The highest distance reduction

occurs on the diagonal of the 3-dimensional graph in Figure 5.7 where FS-BSMs and

FS-AMs have the same frequency.

All results in this subsection are for FS-BSMs and FS-AMs with only two groups of

seasonal parameters and a total of five coefficients. We have also considered the esti-

mation of models with six different variances for the six different seasonal frequencies

in the FS-BSM. When we compare the resulting distances from their FS-AM counter-

parts, the empirical models are even more similar. However, we do not advocate such

over-parametrized specifications as the estimation results can be spurious with many

seasonal variances frequently estimated as zero in the FS-BSM and many seasonal unit

roots in the FS-AM.



Figure 5.7: Average of relative distance of non-FS models versus FS models (see equation

(5.17)) when RDm,n > 0 over 75 time series from the U.S. Census Bureau. The relative

distance metrics are computed for all 31 combinations of FS-BSMs and the 4-coefficient FS-

AMs. The largest relative distances are found in the diagonal.


5.4.4 Model fit comparison

Figure 5.8 shows histograms of the differences in AICc between non-FS and FS models.

The difference in model performance is negative if the (FS-)BSM fits the time series

better than (FS-)AM. The top panel in Figure 5.8 suggests that the Airline model

performs better than the BSM in 80% of the cases. This due to the four parameters

of the BSM compared to the three parameters of the Airline model. Although the log-

likelihood values are similar, the AICc of the Airline model is lower than the AICc of

the BSM. The bottom panel of Figure 5.8 shows the difference in the minimum AICc

between the FS-BSMs and the 4-coefficients FS-AMs. Both FS models seem to perform

equally well as the median lies around zero. The FS-AMs outperform the FS-BSMs in

54% of the cases.


Figure 5.8: Top panel: histogram of differences in AICc between the BSM and the Airline

model computed for the 75 U.S. Census Bureau's time series. Bottom panel: histogram of

differences in minimum AICc between FS-BSMs and FS-AMs.

5.5 Summary and conclusions

We investigate two types of frequency specific time series models for seasonal time

series. We consider one unobserved components model, the so-called basic structural

model (BSM), and one SARIMA model, the so-called Airline model (AM). First, we

explore the generalisation of the basic structural time series model where time-varying


trigonometric terms associated with different seasonal frequencies are allowed to have

different variances. We label this extension as the frequency specific basic structural

model (FS-BSM). We then explore the dynamic properties of the FS-BSM and relate

them to those of the frequency specific Airline model (FS-AM). The FS-AM is a recent

extension of the well-known Airline model and is developed to improve the properties

of model-based seasonal adjustment procedures based on it. The relations between FS-

BSM and FS-AM coefficients are highly nonlinear. We rely on numerical techniques

to investigate the differences between the two types of FS models. In particular, we

develop a distance measure based on the restricted reduced form MA-coefficients of

the models for appropriately differenced time series. Two models (UC versus SARIMA

models, FS and non-FS models) have similar properties if our distance measure is small.

In general, UC models will not exactly be the same as SARIMA models except for some

special cases. We show for our simulated and empirical data that in general estimated

FS models applied to the same time series have a smaller distance compared to non-FS

models.

In the empirical study on 75 monthly U.S. Census Bureau time series, we find that

among the non-FS models, the Airline model performs better than the BSM in terms of

AIC(c). Although the estimated log-likelihoods are similar, the Airline model is more

parsimonious than the BSM. For the FS models we conclude that the grouped FS-BSMs

perform as well as the 4-coefficient FS-AMs in terms of minimum AIC(c). Both types of

grouped FS models can be used as alternatives and they often perform better than the

standard Airline model and the BSM respectively. Neither model type is consistently

better in all situations as they have different ways of restricting parameters to make

them estimable using reasonable amounts of data. Depending on the characteristics of

the analysed time series, one can make an initial choice whether to use the FS models

or to stay with the standard (or non-FS) models.


5.6 Appendices

5.A ARIMA representation

This appendix supports subsection 5.3.1: we give the algebraic derivations of the ARIMA

representation of models with individual trigonometric seasonal components. Note that

Bell (2004) reports these results without the algebraic derivations for the monthly sea-

sonal processes. Since it is straightforward but not trivial to get the reported results,

we provide the complete analytical computations in this appendix.

We assume s is even to simplify notation. In general we have the following equality,

γt = Σ_{j=1}^{s/2} γj,t = Σ_{j=1}^{s/2} [(1 − cos λj L) ωj,t + (sin λj L) ω∗j,t] / (1 − 2 cos λj L + L²) = Σ_{j=1}^{s/2} (1 − αj L) wj,t / (1 − 2 cos λj L + L²),   (5.18)

where λj = 2πj/s and wj,t is an independent and identically distributed random variable

with zero mean and variance σ²j for j = 1, . . . , s/2. In Section 5.3 we have introduced

the differencing operator δj(L) = 1 − 2 cos λj L + L². It follows that the differenced

individual trigonometric seasonal component, δj(L)γj,t, is a moving-average of order one

(MA(1)) process with two unknown parameters, namely αj and σ²j, for each seasonal

frequency j = 1, . . . , s/2, or in mathematical notation

δj(L)γj,t = (1 − αj L) wj,t.   (5.19)

Since the autocovariance function (ACVF) of an MA(1) process is zero after lag one,

we have two equations for each frequency to determine the two unknown parameters,

which are straightforward to solve. Thus, for each frequency j, we determine the ACVF

of both the right- and left-hand sides of the following equality,

(1 − cos λj L) ωj,t + (sin λj L) ω∗j,t = (1 − αj L) wj,t,   (5.20)

to determine αj and σ²j from a given σ²ω,j with j = 1, . . . , s/2.

The ACVF of the left-hand side of (5.20) has the same structure for all frequencies

j = 1, . . . , s/2, namely

γ(0) = σ²ω,j + (cos² λj) σ²ω,j + (sin² λj) σ²ω,j = 2σ²ω,j,   (5.21)

γ(1) = − cos(λj) σ²ω,j.   (5.22)

Note that at frequency λj = π/2, we have cos(λj) = cos(π/2) = 0 and sin(λj) = sin(π/2) = 1,

and the left-hand side of (5.20) will be a white noise process, which makes γ(1) = 0

in (5.22). Note further that at frequency λj = π, cos(λj) = cos(π) = −1, which reduces

equation (5.22) to σ²ω,s/2.


The right-hand side of (5.20) is a standard MA(1) process for all frequencies, which

gives the following ACVF for j = 1, . . . , s/2,

γ(0) = (1 + α²j) σ²j,   (5.23)

γ(1) = −αj σ²j.   (5.24)

By matching the ACVFs of the left-hand side and right-hand side, we can compute the

value of αj for every frequency j, and we also obtain the expression of σ²j in terms of

σ²ω,j, namely

σ²j = 2σ²ω,j / (1 + α²j), for j = 1, . . . , s/2 − 1,   (5.25)

and for j = s/2, we have

σ²s/2 = σ²ω,s/2.   (5.26)
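The matching can also be solved in closed form: γ(1)/γ(0) gives αj/(1 + α²j) = cos(λj)/2, whose invertible root is αj = (1 − sin λj)/cos λj for cos λj ≠ 0, after which (5.25) yields σ²j. A small numerical check (function name is ours):

```python
import math

def ma1_params(lam, var_omega):
    """Solve the ACVF matching equations (5.21)-(5.24) for the invertible
    MA(1) parameters (alpha_j, sigma_j^2) of the differenced trigonometric
    seasonal component at frequency lam in (0, pi)."""
    c, s = math.cos(lam), math.sin(lam)
    if abs(c) < 1e-12:        # lam = pi/2: white noise case, alpha = 0
        alpha = 0.0
    elif abs(s) < 1e-12:      # lam = pi: handled separately by (5.26)
        return None, var_omega
    else:
        # gamma(1)/gamma(0) = alpha/(1+alpha^2) = cos(lam)/2;
        # invertible root of cos(lam)*a^2 - 2a + cos(lam) = 0
        alpha = (1.0 - s) / c
    sigma2 = 2.0 * var_omega / (1.0 + alpha ** 2)   # equation (5.25)
    return alpha, sigma2

# reproduce the first row of Table 5.5: lambda_1 = 2*pi/12
alpha1, sigma2_1 = ma1_params(2 * math.pi / 12, var_omega=1.0)
```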

For s = 12, we derive the ACVF for the first seasonal frequency, λ1 = 2π/12, as an

illustration. Other frequencies follow analogously. Given that cos(2π/12) = 0.5√3, we get

δ1(L) = 1 − 2 cos λ1 L + L² = 1 − 2 cos(2π/12) L + L² = 1 − √3 L + L²

and the following equality

(1 − cos λ1 L) ω1,t + (sin λ1 L) ω∗1,t = (1 − α1 L) w1,t

⇔ (1 − 0.5√3 L) ω1,t + 0.5 ω∗1,t−1 = (1 − α1 L) w1,t.

The ACVF of the left-hand side is given by

γ(0) = σ²ω,1 + 0.75 σ²ω,1 + 0.25 σ²ω,1 = 2σ²ω,1,

γ(1) = −0.5√3 σ²ω,1,

while the ACVF of the right-hand side is given by

γ(0) = (1 + α²1) σ²1,

γ(1) = −α1 σ²1.

Since the right-hand side should be equal to the left-hand side, we solve the following

equations

2σ²ω,1 = (1 + α²1) σ²1,   (5.27)

−0.5√3 σ²ω,1 = −α1 σ²1,   (5.28)


which give two possible solutions for α1, namely (1/3)√3 and √3. Since the MA(1) process

needs to be invertible, there is only one solution that fits, namely α1 = (1/3)√3. Substituting

α1 back into (5.27) or (5.28) gives σ²1 = 1.5σ²ω,1. Thus, the MA(1) representation

of the differenced seasonal component at the first seasonal frequency is given by

δ1(L)γ1,t = (1 − √3 L + L²) γ1,t = (1 − (1/3)√3 L) w1,t,   (5.29)

where w1,t is an iid sequence with zero mean and variance σ²1.

Repeating the above exercise for the remaining seasonal frequencies results in solving

for αj as a real number between −1 and 1, and expressing σ²j in terms of σ²ω,j. Table 5.5

(adapted from Bell (2004)) summarises the ARIMA representation for the individual

trigonometric seasonal components of the monthly seasonal processes.

Table 5.5: ARIMA representation for the individual trigonometric seasonal component

for s = 12

j λj δj(L)γj,t = (1− αjL)wj,t σ2j

1 2π/12 (1−√

3L+ L2)γ1,t = (1− 13

√3L)w1,t 1.5σ2

ω,1

2 4π/12 (1− L+ L2)γ2,t = (1− (2−√

3)L)w2,t (1 + 12

√3)σ2

ω,2

3 6π/12 (1 + L2)γ3,t = w3,t 2σ2ω,3

4 8π/12 (1 + L+ L2)γ4,t = (1 + (2−√

3)L)w4,t (1 + 12

√3)σ2

ω,4

5 10π/12 (1 +√

3L+ L2)γ5,t = (1 + 13

√3L)w5,t 1.5σ2

ω,5

6 12π/12 (1 + L)γ6,t = w6,t σ2ω,6
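The matching of γ(0) and γ(1) that produced (5.27)–(5.28) can be repeated mechanically for every seasonal frequency. The following sketch is our own illustration, not part of the thesis: it solves αj/(1 + αj²) = cos(λj)/2 for the invertible root and reproduces the αj and variance ratios σj²/σ²ω,j of Table 5.5 for j = 1, ..., 5; the Nyquist frequency j = 6 is the trivial case (1 + L)γ6,t = w6,t.

```python
import math

def ma1_for_frequency(lam):
    """Invertible MA(1) (1 - alpha L) w_t whose ACVF matches that of
    (1 - cos(lam) L) omega_t + (sin(lam) L) omega*_t with unit disturbance
    variance: gamma(0) = 2 and gamma(1) = -cos(lam)."""
    c = math.cos(lam)
    if abs(c) < 1e-12:                      # lam = pi/2 (j = 3): white noise
        return 0.0, 2.0
    # gamma(1)/gamma(0) gives alpha/(1 + alpha^2) = cos(lam)/2; the
    # invertible root of cos(lam) alpha^2 - 2 alpha + cos(lam) = 0 is:
    alpha = (1.0 - math.sin(lam)) / c
    sigma2_ratio = 2.0 / (1.0 + alpha ** 2)  # sigma_j^2 / sigma_{omega,j}^2
    return alpha, sigma2_ratio

# Reproduce Table 5.5 for j = 1,...,5; a negative alpha corresponds to the
# (1 + |alpha| L) factors shown in the table at j = 4, 5.
for j in range(1, 6):
    alpha, ratio = ma1_for_frequency(2 * math.pi * j / 12)
    print(j, round(alpha, 6), round(ratio, 6))
```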

5.B Computing the autocovariance function of non-stationary processes in their reduced form

This appendix is used in Subsection 5.3.2. More details are given by McElroy (2008). Let yt be the sum of a number of unobserved components X(i)t, such as trend, seasonal and irregular components,

yt = Σ_{i=1}^{p} X(i)t,      (5.30)

where p is the number of unobserved components and where the X(i)t's are such that there exist 'differencing' operators

δ(i)(L) = 1 + Σ_{j=1}^{di} δ(i)j L^j,      (5.31)


with di ≥ 0, and the property that for each individual X(i)t,

δ(i)(L) X(i)t = W(i)t      (5.32)

is a stationary zero-mean process with known autocovariance function, since the X(i)t's are well-defined processes. Suppose further that

1. the polynomials δ(i)(L) have no common zeroes for i = 1, . . . , p;

2. the series W(i)t are uncorrelated with one another at all leads and lags for i = 1, . . . , p.

Then the minimal order differencing operator for the process yt is defined by

δ(L) = Π_{i=1}^{p} δ(i)(L).      (5.33)

Thus, by defining the complementary operators

δ(−i)(L) = Π_{j=1, j≠i}^{p} δ(j)(L),      (5.34)

we produce the reduced form process Wt as follows,

Wt = δ(L) yt = Σ_{i=1}^{p} δ(−i)(L) W(i)t.      (5.35)

Hence, given the non-stationary process yt, the autocovariance matrix of the reduced form process Wt is given by

ΣW = Σ_{i=1}^{p} Δ(−i) Σ(i) Δ(−i)′,      (5.36)

where Δ(−i) implements δ(−i)(L) and Σ(i) is the autocovariance matrix of W(i)t. The dimension of the matrix ΣW is (n − d) × (n − d), where n is the number of observations and d is the degree of δ(L). The matrices Δ(−i) and Σ(i) have dimensions (n − d) × (n − di) and (n − di) × (n − di) respectively, where di is the degree of δ(i)(L).
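As a concrete numerical check of (5.36), the sketch below is our own illustration (not from the thesis). It assumes, for simplicity, that each W(i)t is white noise, and builds ΣW for two components: a trend with δ(1)(L) = (1 − L)² and a half-yearly seasonal with δ(2)(L) = 1 + L, which share no zeroes.

```python
import numpy as np

def lag_poly_matrix(coeffs, n_in):
    """Matrix implementing the lag polynomial sum_j coeffs[j] L^j applied to
    a series of length n_in; the output has length n_in - degree."""
    deg = len(coeffs) - 1
    D = np.zeros((n_in - deg, n_in))
    for t in range(n_in - deg):
        for j, c in enumerate(coeffs):
            D[t, t + deg - j] = c
    return D

n = 10
# delta^(1)(L) = (1 - L)^2 and delta^(2)(L) = 1 + L (no common zeroes)
deltas = [np.array([1.0, -2.0, 1.0]), np.array([1.0, 1.0])]
sig2 = [1.0, 0.5]                       # Var(W^(i)); white noise so Sigma^(i) = sig2[i] * I
d = sum(len(p) - 1 for p in deltas)     # degree of delta(L); here d = 3

SigmaW = np.zeros((n - d, n - d))
for i in range(len(deltas)):
    # complementary operator delta^(-i)(L): product of the other deltas
    comp = np.array([1.0])
    for j, pj in enumerate(deltas):
        if j != i:
            comp = np.convolve(comp, pj)
    di = len(deltas[i]) - 1
    D = lag_poly_matrix(comp, n - di)   # shape (n - d) x (n - d_i)
    SigmaW += sig2[i] * D @ D.T         # Delta^(-i) Sigma^(i) Delta^(-i)'
# SigmaW is now the (n - d) x (n - d) autocovariance matrix of W_t
```

The diagonal of ΣW equals 2·σ1² + 6·σ2² here, since (1 + L)W(1)t contributes (1 + 1)σ1² and (1 − L)²W(2)t contributes (1 + 4 + 1)σ2².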

5.C Summary of the Frequency-Specific Airline Model (FS-AM)

The FS-AMs are mentioned throughout this chapter. We give a summary following the paper of Aston et al. (2007). For a seasonal time series yt, the Airline model has the form

(1 − L)(1 − L^s) yt = (1 − θL)(1 − ΘL^s) εt,      (5.37)


where s ≥ 2 and the disturbance εt is a zero-mean independent and identically distributed random variable with finite variance σ². The lag operator L is defined by L^n xt = xt−n for any integer n. When Θ > 0, the Airline model (5.37) can be written as

(1 − L)(1 − L^s) yt = (1 − θL)(1 − Θ^{1/s} L)(Σ_{j=0}^{s−1} Θ^{j/s} L^j) εt      (5.38)
                    = (1 − (θ + Θ^{1/s}) L + θΘ^{1/s} L²)(Σ_{j=0}^{s−1} Θ^{j/s} L^j) εt.      (5.39)

Note that both the non-seasonal polynomial 1 − (θ + Θ^{1/s}) L + θΘ^{1/s} L² and the seasonal polynomial Σ_{j=0}^{s−1} Θ^{j/s} L^j contain Θ^{1/s}. This may partially explain why values of the seasonal parameter Θ can considerably influence the trend component in empirical work. Aston et al. (2004) substituted a general second-degree moving average polynomial for the non-seasonal part in (5.39), resulting in the generalised Airline model,

(1 − L)(1 − L^s) yt = (1 − aL + bL²)(Σ_{j=0}^{s−1} c^j L^j) εt,      (5.40)

where the seasonal sum polynomial has a third coefficient c, distinct from the non-seasonal coefficients a and b, to break the dependence between the seasonal and non-seasonal parts. Aston et al. (2007) decomposed the seasonal factor Σ_{j=0}^{s−1} c^j L^j into several factors with different coefficients for different frequencies. By expanding the seasonal factor as

Σ_{j=0}^{s−1} c^j L^j = (1 + cL) Π_{i=1}^{s/2−1} (1 − 2c cos(2πi/s) L + c² L²),      (5.41)

we can obtain a generalisation of the right-hand side with different coefficients c at different seasonal frequencies. For monthly series, the generalisation becomes

(1 + c6 L) Π_{i=1}^{5} (1 − 2ci cos(2πi/12) L + ci² L²).

Thus, the so-called general frequency-specific model for s = 12 is written as

(1 − L)(1 − L¹²) yt = (1 − aL − bL²) × [(1 + c6 L) Π_{i=1}^{5} (1 − 2ci cos(2πi/12) L + ci² L²)] εt.      (5.42)
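The factorisation (5.41) of the seasonal sum polynomial can be verified numerically. The sketch below is our own illustration: it multiplies out the right-hand side of (5.41) and compares coefficients with Σ_{j=0}^{s−1} c^j L^j.

```python
import numpy as np

def seasonal_sum(c, s):
    """Coefficients of sum_{j=0}^{s-1} c^j L^j, in ascending powers of L."""
    return np.array([c ** j for j in range(s)])

def seasonal_factors(c, s):
    """Coefficients of (1 + c L) prod_{i=1}^{s/2-1} (1 - 2 c cos(2 pi i/s) L + c^2 L^2),
    the right-hand side of (5.41), for even s."""
    poly = np.array([1.0, c])
    for i in range(1, s // 2):
        quad = np.array([1.0, -2.0 * c * np.cos(2.0 * np.pi * i / s), c ** 2])
        poly = np.convolve(poly, quad)   # convolution = polynomial product
    return poly

c, s = 0.9, 12
assert np.allclose(seasonal_sum(c, s), seasonal_factors(c, s))
```

The identity holds because Σ c^j L^j = (1 − c^s L^s)/(1 − cL), and the factors on the right collect the complex-conjugate root pairs of 1 − c^s L^s that remain after removing the root at L = 1/c.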

If the frequency factors ci are unconstrained, model (5.42) contains 9 parameters, namely a, b, c1, . . . , c6 and σ². For monthly macroeconomic time series of typical length (say 8 to 13 years), empirical work has taught us that not all MA coefficients can be estimated reliably, as typical unit root problems occur where one or several roots of the MA polynomial end up on the unit circle. Aston et al. (2007) proposed a set of restrictions with either 3 MA coefficients (4 parameters) or 4 MA coefficients (5 parameters) that can be estimated freely.


The 4-coefficient model types are denoted as 4−5−1(i), 4−4−2(i, j) and 4−3−3(i, j, k). The 4−5−1(6) type is written as

(1 − L)(1 − L¹²) yt = (1 − aL − bL²) × [(1 + c2 L) Π_{i=1}^{5} (1 − 2c1 cos(2πi/12) L + c1² L²)] εt,      (5.43)

the 4−4−2(1, 6) type as

(1 − L)(1 − L¹²) yt = (1 − aL − bL²) × [(1 + c2 L)(1 − 2c2 cos(2π/12) L + c2² L²) Π_{i=2}^{5} (1 − 2c1 cos(2πi/12) L + c1² L²)] εt,      (5.44)

and the 4−3−3(1, 2, 6) type as

(1 − L)(1 − L¹²) yt = (1 − aL − bL²) × [(1 + c2 L) Π_{i=1}^{2} (1 − 2c2 cos(2πi/12) L + c2² L²) Π_{i=3}^{5} (1 − 2c1 cos(2πi/12) L + c1² L²)] εt.      (5.45)

There are 6 models of type 4− 5− 1(i), 15 models of type 4− 4− 2(i, j) and only 10

models of type 4− 3− 3(i, j, k).
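To illustrate how the restricted types nest the ordinary Airline model, the sketch below (our own illustration, not from Aston et al. (2007)) assembles the full MA polynomial of the 4−5−1(6) type in (5.43). With a = θ + Θ^{1/12}, b = −θΘ^{1/12} and c1 = c2 = Θ^{1/12}, the non-seasonal part factors as (1 − θL)(1 − Θ^{1/12}L) and the construction collapses, via (5.41), to the Airline MA polynomial (1 − θL)(1 − ΘL¹²) of (5.37).

```python
import numpy as np

def ma_451_6(a, b, c1, c2, s=12):
    """MA coefficients (ascending powers of L) of the 4-5-1(6) type (5.43):
    (1 - a L - b L^2)(1 + c2 L) prod_{i=1}^{s/2-1}(1 - 2 c1 cos(2 pi i/s) L + c1^2 L^2)."""
    poly = np.convolve([1.0, -a, -b], [1.0, c2])
    for i in range(1, s // 2):
        quad = [1.0, -2.0 * c1 * np.cos(2.0 * np.pi * i / s), c1 ** 2]
        poly = np.convolve(poly, quad)
    return poly

# Restriction that recovers the Airline model (5.37):
theta, Theta = 0.5, 0.6
q = Theta ** (1 / 12)
restricted = ma_451_6(theta + q, -theta * q, q, q)
airline = np.convolve([1.0, -theta], [1.0] + [0.0] * 11 + [-Theta])
```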

Aston et al. (2007) also consider a 3-coefficient model version with a common parameter in the zero-frequency and seasonal-frequency polynomials. In total there are 72 types of FS-AM: 41 types with 3 coefficients and 31 types with 4 coefficients. In relation to the FS-BSM, the 31 types of FS-BSM follow the same classification as the 4-coefficient FS-AMs, which is why we only consider the 4-coefficient version of the FS-AMs.

5.D Preliminary procedure for model selection

A general procedure to select a model for a seasonal time series:

1. Estimate the data with the BSM and check whether it produces a reasonable decomposition and satisfactory diagnostics.

2. Take the reduced form of the time series and draw the sample ACF to check whether there are significant lags between lag 1 and lag 13. If not, do not proceed to the FS-BSMs. If so, estimate the series using the 31 possible FS-BSMs and choose the one with the highest log-likelihood.

3. Check whether the decomposition and diagnostics resulting from the chosen FS-BSM show improvements compared to the results from the BSM. If not, go back to the BSM. If so, choose this FS-BSM as the final model.
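The ACF screening in step 2 can be sketched as follows. This is our own minimal illustration (not code from the thesis), using the usual ±2/√n significance band for sample autocorrelations:

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelations at lags 1..max_lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = float(np.dot(x, x))
    return np.array([np.dot(x[:-k], x[k:]) / denom for k in range(1, max_lag + 1)])

def needs_fs_bsm(reduced):
    """True if the reduced-form series has a significant sample ACF at any
    lag strictly between 1 and 13 (i.e. lags 2..12)."""
    acf = sample_acf(reduced, 12)
    band = 2.0 / np.sqrt(len(reduced))
    return bool(np.any(np.abs(acf[1:]) > band))   # acf[1:] covers lags 2..12
```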


5.E Model list for grouped FS-BSMs for s = 12

Table 5.6: Model list for the grouped FS-BSMs(5, 2, {·})

{i}/12           {i, j}/12          {i, j, k}/12
 1. {1}/12        7. {1, 2}/12      22. {1, 2, 3}/12
 2. {2}/12        8. {1, 3}/12      23. {1, 2, 4}/12
 3. {3}/12        9. {1, 4}/12      24. {1, 2, 5}/12
 4. {4}/12       10. {1, 5}/12      25. {1, 2, 6}/12
 5. {5}/12       11. {1, 6}/12      26. {1, 3, 4}/12
 6. {6}/12       12. {2, 3}/12      27. {1, 3, 5}/12
                 13. {2, 4}/12      28. {1, 3, 6}/12
                 14. {2, 5}/12      29. {1, 4, 5}/12
                 15. {2, 6}/12      30. {1, 4, 6}/12
                 16. {3, 4}/12      31. {1, 5, 6}/12
                 17. {3, 5}/12
                 18. {3, 6}/12
                 19. {4, 5}/12
                 20. {4, 6}/12
                 21. {5, 6}/12

In the monthly data case, there are 6 different FS-BSM(5, 2, {i}/12), 15 different FS-BSM(5, 2, {i, j}/12) and 10 different FS-BSM(5, 2, {i, j, k}/12) models, a total of 31 models, each with five free parameters and two groups of seasonal frequencies.


Chapter 6

Conclusions

This thesis considers several ways of extending the linear unobserved components (UC)

time series models with periodic coefficients. Special attention is given to identification

of the parameters prior to estimation. It is shown that exact maximum likelihood estimation using state space methods is feasible despite the large number of parameters that need to be estimated for this class of periodic models. This finding is particularly

helpful for the estimation of parameters in multivariate versions of periodic UC models,

since the number of parameters tends to grow large rapidly. Several empirical illus-

trations demonstrate the improvements obtained by periodic models in the analysis of

seasonal macroeconomic time series.

The primary aim of Chapter 2 is to present a comprehensive periodic time se-

ries analysis based on the univariate unobserved components model with four periodic

stochastic components, namely the stochastically time-varying effects for trend, sea-

sonal, cycle and irregular. For the application of monthly postwar U.S. unemployment

series, it is shown that significant periodicity is present in all parameters, especially in

the cyclical component.

Chapter 3 is a multivariate extension of Chapter 2 where we use one periodic com-

mon cycle component for all time series involved. The empirical study concerns a

dataset of seven U.S. sectoral employment quarterly time series. It is shown that the

seven time series can be effectively modelled by a multivariate periodic unobserved

components approach with idiosyncratic components for trend, season and irregular

together with a common cyclical component.

In Chapter 4, a feasible estimation method for periodic seasonal autoregressive inte-

grated moving average models (periodic SARIMA) is developed in a way that does not

require a priori differencing of the data. Convenient state space representations of the

periodic SARIMA models are proposed to facilitate model identification, specification

and exact maximum likelihood estimation of the periodic parameters. The same time

series as in Chapter 2 is used for an illustration. A comparison is made between the


results of periodic SARIMA and periodic UC models.

Chapter 5 focuses solely on the time-varying trigonometric seasonal component

of the basic structural model (BSM), and extends it by allowing different seasonal fre-

quencies to have different variances. This new model is referred to as the frequency

specific basic structural model (FS-BSM). The dynamic properties of the FS-BSM are

explored and are compared to those of the frequency specific Airline model (FS-AM).

Using 75 seasonal time series provided by the U.S. Census Bureau, it is shown empir-

ically that both frequency specific models have properties that are very close to each

other. Therefore FS-BSMs lead to similar time series decompositions for a specific range

of models in the FS-AMs.

Further research in periodic time series models can be taken along several paths.

In shorter samples and for higher frequencies, placing smoothness restrictions on the

intra-year pattern of the periodic coefficients may be necessary to obtain sufficiently

parsimonious models. An automatic model selection procedure as in the programs X-

12-ARIMA and TRAMO/SEATS could also be designed. Extensions of the statistical

analysis are also possible within our framework and might be realised in the future.

Following Kurozumi (2002) and Busetti and Harvey (2003), stationarity tests for the

periodic components in the model can be developed.

Another extension is allowing for more than one independent trend component to

bridge the gap with independent periodic models which specify independent trends for

different months of the year. In this case, non-trivial identification problems for seasonal

adjustment can arise as explained in the introduction. More flexible trend and cycle

specifications can be introduced into the model while automatic seasonal adjustment

would require some extensions to our procedures as well. Different non-linear model

extensions can be considered, but they require the use of non-linear or non-Gaussian

state space formulations which increase the computational complexity.

In short, there are many possibilities to explore the periodic time series model

framework further. Whether such extensions will result in better decompositions or

will produce more accurate forecasts remains to be seen.


Bibliography

Anderson, B. D. O. and J. B. Moore (1979). Optimal Filtering. Englewood Cliffs: Prentice-

Hall.

Anderson, P. and M. M. Meerschaert (2005). Parameter estimation for periodically stationary

time series. Journal of Time Series Analysis 26, 489–518.

Anderson, P. L. and A. V. Vecchia (1993). Asymptotic results for periodic autoregressive

moving-average processes. Journal of Time Series Analysis 14, 1–18.

Ansley, C. F. (1979). An algorithm for the exact likelihood of a mixed autoregressive-moving

average process. Biometrika 66, 59–65.

Ansley, C. F. and R. Kohn (1985). Estimation, filtering and smoothing in state space models

with incompletely specified initial conditions. Annals of Statistics 13, 1286–1316.

Aston, J. A. D., D. F. Findley, T. S. McElroy, K. C. Wills, and D. E. K. Martin (2007). New

ARIMA models for seasonal time series and their application to seasonal adjustment and

forecasting. Research Report Series RRS2007/14, U.S. Census Bureau.

Aston, J. A. D., D. F. Findley, K. C. Wills, and D. E. K. Martin (2004). Generalizations of the

Box-Jenkins’ Airline Model with frequency-specific seasonal coefficients and a generalization

of Akaike’s MAIC. In Proceedings of the 2004 NBER/NSF Time Series Conference, Wash-

ington D.C. and Census Bureau, http://www.census.gov/ts/papers/findleynber2004.pdf.

NBER.

Aston, J. A. D. and S. J. Koopman (2006). A non-Gaussian generalisation of the Airline

model for robust seasonal adjustment. Journal of Forecasting 25, 325–349.

Azevedo, J. V. E., S. J. Koopman, and A. Rua (2006). Tracking the business cycle in the

Euro area: A multivariate model-based bandpass filter. Journal of Business & Economic

Statistics 24, 278–290.

Barsky, R. B. and J. A. Miron (1989). The seasonal cycle and the business cycle. Journal of

Political Economy 97, 503–534.

Beaulieu, J. J. and J. A. Miron (1993). Seasonal unit roots in aggregate U.S. data. Journal

of Econometrics 55, 305–328.


Bell, W. R. (1993). Empirical comparisons of seasonal ARIMA and ARIMA component

(structural) time series models. In Proceedings of the Business and Economic Statistics

Section, pp. 226–231. Alexandria, VA: American Statistical Association.

Bell, W. R. (2004). On RegComponent time series models and their applications. In A. Harvey,

S. J. Koopman, and N. Shephard (Eds.), State Space and Unobserved Component Models,

pp. 248–283. Cambridge, U.K.: Cambridge University Press.

Bell, W. R. and S. C. Hillmer (1984). Issues involved with the seasonal adjustment of economic

time series. Journal of Business & Economic Statistics 2, 291–320.

Bloomfield, P., H. L. Hurd, and R. B. Lund (1994). Periodic correlation in stratospheric ozone

data. Journal of Time Series Analysis 15, 127–150.

Boswijk, H. P. and P. H. Franses (1996). Unit roots in periodic autoregressions. Journal of

Time Series Analysis 17, 221–245.

Bowman, K. O. and L. R. Shenton (1975). Omnibus test contours for departures from nor-

mality based on √b1 and b2. Biometrika 62, 243–250.

Box, G. E. P. and G. M. Jenkins (1970). Time Series Analysis, Forecasting and Control.

Holden-Day, San Francisco, CA, USA.

Brockwell, P. J. and R. A. Davis (1993). Time Series: Theory and Methods (2nd ed.). New York: Springer-Verlag.

Burman, J. P. (1980). Seasonal adjustment by signal extraction. Journal of the Royal Statistical Society 143, 321–337.

Burridge, P. and R. Taylor (2004). Bootstrapping the HEGY seasonal unit root tests. Journal

of Econometrics 123, 67–87.

Busetti, F. and A. Harvey (2003). Seasonality tests. Journal of Business & Economic Statis-

tics 21, 420–436.

Canova, F. and E. Ghysels (1994). Changes in seasonal patterns: Are they cyclical? Journal

of Economic Dynamics & Control 18, 1143–1172.

Canova, F. and B. E. Hansen (1995). Are seasonal patterns constant over time? A test for

seasonal stability. Journal of Business & Economic Statistics 13, 237–252.

Cecchetti, S. G., A. K. Kashyap, and D. W. Wilcox (1997). Interactions between the seasonal

and business cycles in production and inventories. American Economic Review 87, 884–892.

Census (2007). X-12-ARIMA reference manual, version 0.3. Technical report, U.S. Census

Bureau, Washington, DC, U.S.A. http://www.census.gov/srd/www/x12a.


Commandeur, J. J. F. and S. J. Koopman (2007). An Introduction to State Space Time Series

Analysis. Oxford, UK: Oxford University Press.

de Jong, P. (1991). The diffuse Kalman filter. Annals of Statistics 19, 1073–83.

Diebold, F. X. and R. S. Mariano (1995). Comparing predictive accuracy. Journal of Business

& Economic Statistics 13, 253–263.

Doan, T. A. (2004). User’s Manual RATS, Version 5. Evanston, IL, USA, www.estima.com:

Estima.

Doornik, J. A. (2006). An Object-oriented Matrix Programming Language - Ox. London,

U.K.: Timberlake Consultants Press.

Doornik, J. A. (2009). Ox: An Object-oriented Matrix Programming Language, Version 6.

London, U.K.: Timberlake Consultants Press.

Durbin, J. and S. J. Koopman (2001). Time Series Analysis by State Space Methods. Oxford,

UK: Oxford University Press.

Findley, D. F. (2005). Some recent developments and directions in seasonal adjustment.

Journal of Official Statistics 21, 343–365.

Fletcher, R. (1987). Practical Methods of Optimisation, (2nd Ed.). New York: John Wiley.

Franses, P. H. and R. Paap (2004). Periodic Time Series Models. Oxford, U.K.: Oxford

University Press.

Ghysels, E. (1988). A study toward a dynamic theory of seasonality for economic time series.

Journal of the American Statistical Association 83 (401), 168–172.

Ghysels, E. (1991). Are business cycle turning points uniformly distributed throughout the

year? Cahiers de recherche 9135, Centre interuniversitaire de recherche en économie quantitative, CIREQ.

Ghysels, E. (1994). On the periodic structure of the business cycle. Journal of Business &

Economic Statistics 12, 289–298.

Ghysels, E. and D. R. Osborn (2001). The Econometric Analysis of Seasonal Time Series.

Cambridge, UK: Cambridge University Press.

Gjermoe, E. (1931). Det konjunkturcycliske element i beskeftigelsesgradens sesongbevegelse.

Statsøkonomisk tidsskrift 49, 45–82.

Gladysev, E. G. (1961). Periodically correlated random sequences. Soviet Mathematics 2,

385–388.


Goldfeld, S. M. and R. E. Quandt (1965). Some tests for homoscedasticity. Journal of the

American Statistical Association 60, 539–547.

Gomez, V. (2001). The use of Butterworth filters for trend and cycle estimation of economic

time series. Journal of Business & Economic Statistics 19, 365–373.

Gomez, V. and A. Maravall (1996). Programs TRAMO and SEATS: Instructions for the user.

Technical Report Working paper 9628, Servicio de Estudios, Banco de Espana, Madrid.

Hannan, E. J. (1955). A test for singularities in Sydney rainfall. Australian Journal of

Physics 8, 289–297.

Harvey, A. C. (1985). Trends and cycles in macroeconomic time series. Journal of Business

& Economic Statistics 3 (3), 216–227.

Harvey, A. C. (1989). Forecasting, structural time series models and the Kalman Filter.

Cambridge, UK: Cambridge University Press.

Harvey, A. C. (2001). Testing in unobserved components models. Journal of Forecasting 20,

1–19.

Harvey, A. C. and S. J. Koopman (1997). Multivariate structural time series models. In

C. Heij, H. Schumacher, B. Hanzon, and C. Praagman (Eds.), Systematic Dynamics in

Economic and Financial Models, pp. 269–298. Chichester, UK.: Wiley.

Harvey, A. C. and S. J. Koopman (2000). Signal extraction and the formulation of unobserved

components. The Econometrics Journal 3, 84–107.

Harvey, A. C. and T. M. Trimbur (2003). General model-based filters for extracting cycles

and trends in economic time series. Review of Economics and Statistics 85, 244–255.

Harvey, D. I., S. J. Leybourne, and P. Newbold (1998). Tests for forecast encompassing.

Journal of Business & Economic Statistics 16, 254–259.

Hillmer, S. C. and G. C. Tiao (1982). An ARIMA-model-based approach to seasonal adjust-

ment. Journal of the American Statistical Association 77, 63–70.

Hindrayanto, I., J. Aston, S. J. Koopman, and M. Ooms (2010). Modeling trigonometric

seasonal components for monthly economic time series. Tinbergen Institute Discussion

Papers 10-018/4, Tinbergen Institute.

Hindrayanto, I., S. J. Koopman, and M. Ooms (2010). Exact maximum likelihood estimation

for non-stationary periodic time series models. Computational Statistics & Data Analy-

sis 54 (11), 2641 – 2654. The Fifth Special Issue on Computational Econometrics.

Hylleberg, S., R. F. Engle, C. W. J. Granger, and B. S. Yoo (1990). Seasonal integration and

cointegration. Journal of Econometrics 44, 215–238.


Jimenez, C., A. I. McLeod, and K. W. Hipel (1989). Kalman filter estimation for periodic

autoregressive-moving average models. Stochastic Hydrology and Hydraulics 3, 227–240.

Kim, C. J. and C. Nelson (1999). Has the U.S. economy become more stable? A Bayesian

approach based on a Markov-Switching model of the business cycle. Review of Economics

and Statistics 81, 608–616.

Kim, C. J., C. Nelson, and J. Piger (2004). The less volatile U.S. economy: A Bayesian inves-

tigation of timing, breadth, and potential explanations. Journal of Business & Economic

Statistics 22, 80–93.

Kitagawa, G. and W. Gersch (1996). Smoothness Priors Analysis of Time Series. New York:

Springer Verlag.

Koopman, S. J. (1997). Exact initial Kalman filtering and smoothing for non-stationary time

series models. Journal of the American Statistical Association 92, 1630–1638.

Koopman, S. J. and A. C. Harvey (2003). Computing observation weights for signal extraction

and filtering. Journal of Economic Dynamics & Control 27, 1317–1333.

Koopman, S. J. and M. Ooms (2002). Periodic structural time series models: Estimation and

forecasting with application. In Y. Kawasaki (Ed.), Proceedings of the 3rd International

Symposium on Frontiers of Time Series Modeling: Modeling Seasonality and Periodicity,

Institute of Statistical Mathematics, Tokyo, Japan, January 2002., pp. 151–172. Institute

of Statistical Mathematics, Tokyo, Japan.

Koopman, S. J. and M. Ooms (2006). Forecasting daily time series using periodic unobserved

components time series models. Computational Statistics & Data Analysis 51, 885–903.

Koopman, S. J., M. Ooms, and I. Hindrayanto (2009). Periodic unobserved cycles in seasonal

time series with an application to U.S. unemployment. Oxford Bulletin of Economics and

Statistics 71, 683–713, Issue 5.

Koopman, S. J., N. Shephard, and J. A. Doornik (1999). Statistical algorithms for models in

state space using SsfPack 2.2. The Econometrics Journal 2, 107–160, www.ssfpack.com.

Koopman, S. J., N. Shephard, and J. A. Doornik (2008). Statistical Algorithms for Models in

State Space Form - SsfPack 3.0. Timberlake Consultants Ltd, London, UK.

Krane, S. and W. Wascher (1999). The cyclical sensitivity of seasonality in U.S. unemployment.

Journal of Monetary Economics 44, 523–553.

Kurozumi, E. (2002). Testing for periodic stationarity. Econometric Reviews 21, 243–270.

Kuznets, S. (1933). Seasonal Variation in Industry and Trade. New York: National Bureau

of Economic Research.


Ladiray, D. and B. Quenneville (2001). Seasonal Adjustment with the X-11 Method. New

York: Springer-Verlag.

Li, W. K. and Y. V. Hui (1988). An algorithm for the exact likelihood of periodic autore-

gressive moving average models. Communications in Statistics, Simulations and Computa-

tion 17, 1483–1494.

Lund, R. and I. Basawa (2000). Recursive prediction and likelihood evaluation for periodic

ARMA models. Journal of Time Series Analysis 21, 75–93.

Maravall, A. (1985). On structural time series models and the characterization of components.

Journal of Business & Economic Statistics 3, 350–355.

Matas-Mir, A. and D. R. Osborn (2004). Does seasonality change over the business cycle? An investigation using monthly industrial production series. European Economic Review 48,

1309–1332.

McConnell, M. and G. Perez-Quiros (2000). Output fluctuation in the United States: What

has changed since the early 1980’s? American Economic Review 90, 1464–1476.

McElroy, T. S. (2008). Matrix formulas for nonstationary ARIMA signal extraction. Econo-

metric Theory 24, 988–1009.

McElroy, T. S. and D. F. Findley (2010). Selection between models through multi-step-ahead

forecasting. Journal of Statistical Planning and Inference 140 (12), 3655 – 3675.

McLeod, A. I. (1994). Diagnostic checking of periodic autoregression models with application.

Journal of Time Series Analysis 15, 221–233.

Mendershausen, H. (1937). Annual survey of statistical technique: Method of computing and

eliminating changing seasonal fluctuations. Econometrica 5, 234–262.

Ooms, M. and P. H. Franses (1997). On periodic correlations between estimated seasonal

and nonseasonal components in German and U.S. unemployment. Journal of Business &

Economic Statistics 15, 470–481.

Osborn, D. R. (1988). Seasonality and habit persistence in a life cycle model of consumption.

Journal of Applied Econometrics 3, 255–266.

Osborn, D. R. and J. P. Smith (1989). The performance of periodic autoregressive models

in forecasting seasonal UK consumption. Journal of Business & Economic Statistics 7,

117–127.

Penzer, J. and Y. Tripodis (2007). Single-season heteroscedasticity in time series. Journal of

Forecasting 26, 189–202.


Plosser, C. I. (1979). The analysis of seasonal economic models. Journal of Economet-

rics 10 (2), 147–163.

Primiceri, G. (2005). Time varying structural vector autoregressions and monetary policy.

Review of Economic Studies 72, 821–852.

Proietti, T. (2004). Seasonal specific structural time series. Studies in Nonlinear Dynamics

& Econometrics 8 (2), Article 16, http://www.bepress.com/snde/vol8/iss2/art16.

Sensier, M. and D. van Dijk (2004). Testing for volatility changes in U.S. macroeconomic time

series. The Review of Economics and Statistics 86, 833–839.

Shiskin, J. and H. Eisenpress (1957). Seasonal adjustment by electronic computer methods.

Journal of the American Statistical Association 52, 415–449.

Shumway, R. H. and D. S. Stoffer (2000). Time Series Analysis and Its Applications. New

York: Springer-Verlag.

Silverman, B. W. (1986). Density Estimation for Statistical and Data Analysis. London:

Chapman & Hall.

Sims, C. and T. Zha (2006). Were there regime switches in U.S. monetary policy? American

Economic Review 96, 54–81.

Stock, J. H. and M. W. Watson (2002). Has the business cycle changed and why? In

M. Gertler and K. S. Rogoff (Eds.), NBER Macroeconomics Annual 2002, Volume 17, pp.

159–218. Cambridge, MA, U.S.A.: MIT Press.

Tiao, G. C. and M. R. Grupe (1980). Hidden periodic autoregressive-moving average models

in time series data. Biometrika 67, 365–373.

Tunnicliffe-Wilson, G. (1969). Factorization of the covariance generating function of a pure

moving average process. SIAM Journal of Numerical Analysis 6, 1–7.

Van Dijk, D., B. Strikholm, and T. Terasvirta (2003). The effects of institutional and tech-

nological change and business cycle fluctuations on seasonal patterns in quarterly industrial

production series. Econometrics Journal 6, 79–98.

Varga, A. (2008). On solving periodic Riccati equations. Numerical Linear Algebra with

Application 15, 809–835.

Vecchia, A. V. (1985). Maximum likelihood estimation for periodic autoregressive moving

average models. Technometrics 27, 375–384.

Wallis, K. F. (1978). Seasonal adjustment and multiple time series analysis. In Seasonal

Analysis of Economic Time Series, NBER Chapters, pp. 347–364. National Bureau of

Economic Research, Inc.


Warne, A. and A. Vredin (2006). Unemployment and inflation regimes. Studies in Nonlinear

Dynamics & Econometrics 10 (2), Article 2, www.bepress.com/snde/vol10/iss2/art2.

Wisniewski, J. (1934). Interdependence of cyclical and seasonal variation. Econometrica 2,

176–184.

Yao, Q. and P. J. Brockwell (2006). Gaussian Maximum Likelihood estimation for ARMA

models. Journal of Time Series Analysis 27, 857–875.


Samenvatting (Summary in Dutch)

Veel tijdreeksen aangetroffen in economische, sociale en natuurwetenschappen vertonen regel-

matige seizoensgebonden fluctuaties. In modelmatige analyses van dergelijke tijdreeksen kan

de seizoensgebonden variantie direct worden opgenomen in het model of vooraf worden ver-

wijderd uit de reeks door seizoenaanpassingsmethoden die (impliciet of expliciet) gebaseerd

kunnen zijn op een seizoensmodel. In dit proefschrift worden verschillende soorten periodieke

seizoensmodellen geanalyseerd en enkele nieuwe ontwikkeld.

In a periodic analysis, a set of annual time series is analysed simultaneously, where each individual series relates exclusively to one particular season. This approach contrasts with modelling a single time series as a stochastic process with seasonal fluctuations. Concentrating on a time series that is linked to one particular season only, so that the series has season-free dynamics, circumvents some complicated aspects of modelling seasonal variation. Periodic approaches have been investigated in the framework of autoregressive moving average (ARMA) models and dynamic econometric models; see, for example, the books by Ghysels & Osborn (2001) and Franses & Paap (2004). In the framework of periodic unobserved components models in state-space form, this thesis builds on important contributions by Krane & Wascher (1999), Koopman & Ooms (2002, 2006), Proietti (2004), Bell (2004) and Penzer & Tripodis (2007).
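The core periodic idea, rearranging one seasonal series into season-specific annual subseries, can be sketched as follows (a minimal illustration, not code from the thesis; `split_by_season` is a hypothetical helper):

```python
import numpy as np

def split_by_season(y, s):
    """Rearrange a seasonal series y with period s into s annual
    subseries: subseries k collects y[k], y[k+s], y[k+2s], ..."""
    y = np.asarray(y, dtype=float)
    return [y[k::s] for k in range(s)]

# Four years of a quarterly series (s = 4):
y = np.arange(16)
subseries = split_by_season(y, 4)
# subseries[0] contains all first-quarter observations: 0, 4, 8, 12
```

Each subseries is free of seasonal dynamics and can be analysed with standard non-seasonal tools, which is exactly what a periodic analysis exploits.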

In this thesis, the periodic analysis of both ARMA and unobserved components time series models is examined in detail. The ARMA model is widely known and is treated in many econometrics textbooks. The unobserved components model is less well known; it decomposes a time series into components of interest, such as trend, seasonal, cycle and irregular components; see Harvey (1989), Durbin & Koopman (2001) and Commandeur & Koopman (2007) for general treatments and in-depth descriptions of this type of model. We focus primarily on the analysis of macroeconomic time series, for which we extend unobserved components models by assigning periodic coefficients to the different components. Our contribution for periodic ARMA models lies in the development of a state-space formulation in which the assumption of stationarity is not required. Furthermore, we pay special attention to the identification of the parameters before starting the estimation procedure, and we show that exact maximum likelihood is a suitable estimation method despite the large number of parameters typically encountered in this class of models. We also link periodic and non-periodic model formulations by means of likelihood ratio tests, which is possible because of the nested structure of the models. Empirical illustrations are given using U.S. unemployment series for univariate periodic models (Chapters 2 and 4) and employment data for multivariate periodic models (Chapter 3).

A periodic analysis can be carried out by repeating a univariate analysis for each season, or by a multivariate approach in which several time series (as many as the seasonal length) are modelled jointly. In this thesis we advocate the use of a univariate representation with time-varying parameters for periodic models. We show in Chapters 2 and 3 that this approach is identical to the multivariate formulation of periodic models with fixed parameters. The advantage of the univariate formulation lies in the possibility of extending the model to several time series jointly, so that we obtain multivariate periodic models, as described in Chapter 4.
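The univariate time-varying representation can be illustrated with a periodic AR(1), in which the autoregressive coefficient depends on the season of period t; stacking the observations by season instead would give the equivalent multivariate fixed-parameter form (an illustrative sketch, not code from the thesis):

```python
import numpy as np

def simulate_par1(phi, sigma, n_years, seed=0):
    """Simulate a periodic AR(1): y_t = phi[s(t)] * y_{t-1} + e_t,
    where s(t) = t mod s cycles through the seasons and
    e_t ~ N(0, sigma[s(t)]^2). Univariate time-varying recursion."""
    rng = np.random.default_rng(seed)
    s = len(phi)
    n = s * n_years
    y = np.zeros(n)
    for t in range(1, n):
        k = t % s                      # season to which period t belongs
        y[t] = phi[k] * y[t - 1] + sigma[k] * rng.standard_normal()
    return y

# Quarterly PAR(1) with a different AR coefficient for each season:
y = simulate_par1(phi=[0.9, 0.5, 0.7, 0.2], sigma=[1.0, 1.0, 1.0, 1.0], n_years=50)
```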

Chapter 5 of this thesis describes a special case of a periodic UC model in which the so-called basic structural model (BSM) is extended by introducing time-varying trigonometric seasonal terms for the different seasonal frequencies. We call this model a frequency-specific basic structural model (FS-BSM). The extended set of parameters in the FS models can be estimated by standard maximum likelihood procedures based on the Kalman filter. We investigate the dynamic properties of the FS-BSM and relate them to the properties of the frequency-specific Airline model (FS-AM) as described in Aston, Findley, McElroy, Wills & Martin (2007).

The periodicity in the FS-BSM is restricted to the seasonal component only, so that each seasonal frequency has its own variance. This particular model is described in detail by Harvey (1989) for quarterly series. We extend the analysis of this model to monthly series and compare the estimation performance of the FS-BSM with its FS-AM counterpart. The relation between the coefficients of the FS-AM and the FS-BSM is highly nonlinear, and we use numerical techniques to investigate the connection between the two models. Through empirical studies and theoretical simulations we find that the properties of the two models can be very close to each other (under certain conditions). For this research we use a database from the U.S. Census Bureau that had previously been analysed with the FS-AM.
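The frequency-specific seasonal component can be sketched as a set of stochastic trigonometric cycles, one per seasonal frequency, each with its own disturbance variance (an illustrative simulation, not the thesis code; for brevity the Nyquist frequency j = s/2 is treated like the other frequencies here, whereas the BSM uses a single state for it):

```python
import numpy as np

def simulate_fs_seasonal(s, n, sigma, seed=0):
    """Simulate a trigonometric stochastic seasonal with a separate
    disturbance standard deviation sigma[j-1] per frequency:
      gamma_{j,t+1} =  cos(l_j) gamma_{j,t} + sin(l_j) gstar_{j,t} + w_{j,t}
      gstar_{j,t+1} = -sin(l_j) gamma_{j,t} + cos(l_j) gstar_{j,t} + wstar_{j,t}
    with l_j = 2*pi*j/s for j = 1, ..., s//2.  The seasonal at time t
    is the sum of gamma_{j,t} over the frequencies."""
    rng = np.random.default_rng(seed)
    freqs = np.arange(1, s // 2 + 1)
    lam = 2 * np.pi * freqs / s
    gamma = np.zeros(len(freqs))
    gstar = np.zeros(len(freqs))
    out = np.empty(n)
    for t in range(n):
        out[t] = gamma.sum()
        w = sigma * rng.standard_normal(len(freqs))
        ws = sigma * rng.standard_normal(len(freqs))
        gamma, gstar = (np.cos(lam) * gamma + np.sin(lam) * gstar + w,
                        -np.sin(lam) * gamma + np.cos(lam) * gstar + ws)
    return out

# Monthly seasonal (s = 12) with six frequency-specific standard deviations:
seas = simulate_fs_seasonal(s=12, n=120,
                            sigma=np.array([0.5, 0.4, 0.3, 0.2, 0.1, 0.05]))
```

Setting all entries of `sigma` equal collapses this back to the standard BSM seasonal with a single variance.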

Besides the introduction, this thesis consists of four chapters. A summary of each chapter follows.

Chapter 2: Univariate periodic unobserved components models

In this chapter a general class of periodic unobserved components time series models with stochastic trend, seasonal, cycle and irregular components is examined. The general state-space formulation of the periodic unobserved components model enables exact maximum likelihood estimation, signal extraction and forecasting. The consequences for model-based seasonal adjustment are also discussed. The new periodic model is applied to postwar monthly U.S. unemployment data, in which we identify a significant periodic stochastic cycle. A detailed empirical periodic analysis is presented, including a comparison between the periodic and non-periodic unobserved components models.
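As a stylised example of how periodic coefficients enter an unobserved components model, consider a local level model in which the irregular variance depends on the season (a hypothetical sketch for illustration only; the chapter's model also contains seasonal and cycle components):

```python
import numpy as np

def simulate_periodic_llm(n, s, q_level, sigma_irr, seed=0):
    """Simulate a local level model with a season-dependent irregular
    standard deviation:
      y_t = mu_t + eps_t,   mu_{t+1} = mu_t + eta_t,
    with eps_t ~ N(0, sigma_irr[t mod s]^2) and eta_t ~ N(0, q_level)."""
    rng = np.random.default_rng(seed)
    mu = np.cumsum(np.sqrt(q_level) * rng.standard_normal(n))
    eps = np.array([sigma_irr[t % s] for t in range(n)]) * rng.standard_normal(n)
    return mu + eps

# Ten years of monthly data with the noise level varying over the twelve months:
y = simulate_periodic_llm(n=120, s=12, q_level=0.1,
                          sigma_irr=np.linspace(0.5, 1.5, 12))
```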

Chapter 3: Multivariate periodic unobserved components models

This chapter analyses postwar U.S. quarterly employment data for seven industrial sectors on the basis of a new multivariate periodic unobserved components time series model. The model comprises sectoral trend, seasonal and irregular components with periodic functions and parameter adjustments for the period of the so-called 'Great Moderation'. The key feature of our multivariate model is the common cycle component, which enables the detection of periodic cyclical behaviour in the employment series. Investigating the periodicity in the cycle component of a time series is relevant, because seasonal adjustment procedures need to take such a periodic cycle into account when it turns out to be present. We conclude that periodicity is present in the sectoral trend and seasonal components, but we find no periodicity in the common cycle component. Furthermore, we compare our findings with a previously published empirical study on this subject.

Chapter 4: Periodic SARIMA models

In this chapter, state-space formulations for periodic SARIMA models are discussed. A practical state-space formulation of periodic SARIMA models is introduced to facilitate model identification, specification and exact maximum likelihood estimation of the periodic parameters. These formulations do not require a-priori (seasonal) differencing of the time series. The time-varying state-space representation is an attractive alternative to the time-invariant vector representation of periodic models, which usually leads to a high-dimensional state vector in monthly periodic time series models. A key development is our method for computing the variance-covariance matrix of the initial set of observations, which is required for exact maximum likelihood estimation. We illustrate the use of periodic SARIMA models by forecasting a postwar monthly U.S. unemployment time series.
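Exact maximum likelihood estimation in state-space form rests on the prediction error decomposition delivered by the Kalman filter; a minimal univariate sketch (generic, and not the implementation used in this chapter; periodicity enters simply by letting the system matrices Z_t, T_t, H_t, Q_t vary with the season of t) could look like:

```python
import numpy as np

def kalman_loglik(y, Z, T, H, Q, a1, P1):
    """Gaussian log-likelihood of a univariate, possibly time-varying
    state-space model via the Kalman filter:
      y_t = Z_t a_t + eps_t,   a_{t+1} = T_t a_t + eta_t,
    with Var(eps_t) = H_t and Var(eta_t) = Q_t.  Z, T, H, Q are
    sequences indexed by t, which is how periodic (season-dependent)
    coefficients enter."""
    a = np.array(a1, dtype=float)
    P = np.array(P1, dtype=float)
    loglik = 0.0
    for t, yt in enumerate(y):
        Zt = Z[t]
        v = yt - Zt @ a                      # one-step-ahead prediction error
        F = Zt @ P @ Zt + H[t]               # prediction error variance (scalar)
        K = P @ Zt / F                       # Kalman gain
        loglik += -0.5 * (np.log(2 * np.pi) + np.log(F) + v ** 2 / F)
        a = T[t] @ (a + K * v)               # update, then predict the state
        P = T[t] @ (P - np.outer(K, Zt @ P)) @ T[t].T + Q[t]
    return float(loglik)
```

Maximising this function over the model parameters gives exact maximum likelihood estimates once the initial state moments a1 and P1 are chosen appropriately; computing them for the first cycle of observations without prior differencing is the step the chapter develops.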


Chapter 5: Frequency-specific trigonometric seasonal models

The BSM time series model is designed for modelling and forecasting time series with a seasonal component. In this chapter we introduce a generalisation of this model in which the time-varying trigonometric terms have different variances in their disturbances. The extended set of parameters is estimated by maximum likelihood procedures based on the Kalman filter. The contribution of this chapter is twofold. The first is an extensive description of the dynamic properties of this frequency-specific basic structural model. The second is the relation between this model and a comparable generalised version of the Airline model developed by the U.S. Census Bureau. From the computation of a quadratic distance metric based on the reduced-form moving average representation of the models, we conclude that the properties of the generalised models are very close to each other, compared with their original counterparts. In some cases the distance between the models is virtually zero, so that the models can be considered observationally equivalent. An extensive empirical study of a disaggregated monthly world trade series illustrates both the relations between the two classes of models and the improvements obtained by adopting the frequency-specific extensions.


Acknowledgements

Finishing a PhD thesis was, for me, a long journey with many ups and downs. Somewhere

in spring 2003, while I was writing my master thesis, Professor Koopman asked whether I

already knew what I was going to do once I graduated. I considered two possible answers to

this simple question. The first was to return to my beloved family in Indonesia and try to find

something to do in Jakarta. The second was to apply for a job in the Netherlands in order to

gain some working experience and then return to Jakarta, hopefully with a better position in

the job market. At the time I had no solid idea what to choose, and Siem Jan suggested that I take a research assistant position under his and dr. Ooms’ supervision. After two months of

thinking, I decided to take his offer, which turned out to be a major life changer since I did

not end up in Jakarta as I expected, but stayed and got employed in Amsterdam and married

a fellow graduate student. An unanticipated turn of events!

Therefore, I would first of all like to thank Siem Jan Koopman with all my heart for giving

me this opportunity. He always had faith in me even when I doubted myself, he patiently

corrected my English time and again without being judgemental, he kept on supporting me

with ideas for new papers and his knowledge is beyond my imagination. Well, there must be

some distinct differences between a professor and his students, I suppose. Without this very

important and special person, I never would have made it this far. I realise that I was truly

lucky to have him as my supervisor and I am thankful that Siem Jan is enthusiastic about

continuing our collaboration on future research work.

Marius Ooms is my co-promotor, but he has become more than just a supervisor to me.

Marius is very precise in his work; he could adjust my texts forever if he had time. Sometimes

he frustrated me to no end. However, I have to admit that after his well-directed corrections

(handwritten with a red pen that often took more space than my original text), the working-

paper version would look so much better than my own writing. In terms of friendship, Marius

was always interested in what was going on with my life and we have shared many stories in

the past. He is an avid listener, great supervisor and a good friend. I thank him for everything

he has done for me in the past years.

Besides my supervisors, I would like to thank my thesis committee, dr. David Findley,

prof.dr. Niels Haldrup, prof.dr. Philip Hans Franses, prof.dr. Frank den Butter and dr. Ben

Vogelvang, for their time, encouragement, corrections and insightful comments on the final draft of this thesis. Some of you came from far away, and I am very grateful for your

presence in Amsterdam.


Next I would like to thank my former room-mates who shared the VU-office with me

for many years: Brian Wong, Borus Jungbacker, Taoying Yuan, Bahar Kaynar, Virginie

Dordonnat and Carolina Garcia-Martos. They were the first to know when there was news

and gossip flying around the campus, and we had a good laugh about some of them. More

importantly, they made it bearable for me to face another research day and to keep my

head up high after a paper rejection, a disastrous tutorial class, or a failed simulation study.

Indirectly, they motivated me to finish this thesis in the end. I would also like to thank some (former)

finance and economics PhD-students down the corridors at the VU for their friendship and

fun conversations during lunchtime: Alex Halsema, Yang Nan, Michel van der Wel, Bernd

Schwaab, Ting Wang, Liv Zhang, Sunny Li and Maria Dementieva. I wish them all the best

wherever they are and hope we will always remain good friends.

At the start of my PhD-project, I spent most of my time at the Tinbergen Institute

Amsterdam, following a number of core and field courses. Here I met many nice fellow

students who later on became my study friends: Sander Konijn, Berly Martawardaya, Linda

Midgley, Flora Felso, Eddy Bekkers, Razvan Vlahu and Vali Trufas. I thank all of them for

their company and friendship during some tough educational years.

A lifetime ago, when I first came to the Netherlands (summer 1997), I never thought that

I would study econometrics, since I had never even heard of it. As I contemplate everything

that had happened in my educational life, there are several people that I specifically would

like to thank: (former) VASVU teachers (Aleid Knoote-Alders, Kees Smit, Wilfried Meffert,

Chris van Veen and Ruud Stumpel) who taught me the Dutch language (quite well, I’ve been

told) and introduced me to the Dutch culture so that I could stand my ground during my

undergraduate years and beyond; Rein Nobel who convinced me to choose econometrics over

other possible studies even though he did not know me well back then; Ben Vogelvang who

made me fall in love with econometrics for the first time after attending his introductory

courses and becoming his student-assistant later on; and also Gerard van de Laan, Kees van

den Hoeven and Koos Sneek who extended my contract at the VU until I finished my PhD-

research. I thank them deeply for everything they have done so that I can be the person I am

now. I also thank (former) econometric staff members: Charles Bos, Jacques Commandeur,

Joaquim Gromicho, Harold Houba, Michael Massman, Piet Blokland, Wim Lablans, Suncica

Vujic, Angeles Carnero, Ans van Ginkel and Hedda Werkman for their attention, support

and kindness during my stay at the VU.

From autumn 2007 until spring 2008 I spent some research time at the US Census Bureau

in Washington DC. I sincerely thank David Findley, Bill Bell, Tucker McElroy, Brian

Monsell and Joanne Pascale who warmly welcomed me there, provided me with an office,

working guidance and a place to stay. I had a great and unforgettable time in the USA and I

gained a lot of experience about time series analysis in practice. Chapter 5 of this thesis

is dedicated to them. In addition, I also thank John Aston from University of Warwick for

setting up this particular chapter and providing me with data and Ox-code.

What is life without friends? It must be uneventful and pretty lonely. I am grateful

that I have several true friends from the very beginning of my life in the Netherlands who


stood by me in good and bad times, happy and sad occasions: Susanti Tang, Lili Huang,

Carolin Kalebos, Ying Zhang, Inge Stet, Sander van Meeuwen, Annelotte Kunst, Raymond

Hamersma and Sanne Zwart. During my PhD-research years I also made friends with the

regional Catholic youth group formerly led by Mirjam Borst. She might never know what she

meant to me, and that is why I would like to thank her for her prayers and support, especially

when I needed them the most.

I thank a number of my new colleagues at De Nederlandsche Bank: Job Swank, Jakob

de Haan, Robert-Paul Berben, John Lewis, Wilko Bolt, Clovis Hopman, Lola Hernandez and

Chen Yeh. Even though I did not write my thesis under the DNB flag, they have welcomed and

supported me from the day I started to work there. They knew that I needed some time to

finish my thesis and understood when I took several days off from work to meet my supervisors

and finish some remaining work. Every now and then they asked how things were going and

I thank them for their kind attention.

The last few paragraphs are dedicated to my family, who loved me unconditionally. My

mother and father have always been supportive at every single turn of my life and they only

wanted the best for me. When I was twelve, my father told me once that a child should

have a better standard of living than her parents and should be better educated. Since he

was a medical doctor with a master’s degree, while his father was a high-school teacher with

no university degree, he set a good example for me to follow. He hoped that his daughters would achieve at least the same level of education as himself, or preferably higher. This has

given me a strong motivation to finish my PhD since I knew how proud he would be once

I achieved my degree. Unfortunately, my father passed away a few weeks before my draft

thesis was approved by the reading committee. By writing down a piece of his story in this

acknowledgement, I dedicate my thesis to him and hope his words will not be forgotten. I

also specially thank my mother for her love and guidance since the day I was born; she never

forgets to mention me and my sister in her daily prayers.

Some special family members that I would greatly thank are my aunt Eveline and uncle

Alfred. They gave me the opportunity that radically changed my life. Without their generosity,

I would never have come to the Netherlands to study, let alone become a Dutch citizen. I will

never forget how good both of them were to me. My aunt was a living example of the phrase

‘where there is a will, there is a way’. She came to this country with nothing forty years ago

and managed to become a successful dentist until her recent retirement. She did not give up

when all doors seemed to be closed to her and tried her very best to find another opening. I

learned from her not to stop trying once I set my goals in life. Thank you for being there for

me in the past fourteen years.

The last but definitely not the least person that I would like to mention is my husband,

Kai-Ming, as he deserves the biggest thanks of all. I could not mention everything he had

done for me since there were simply too many. This thesis would not be finished as it is now

without his help. Kai-Ming, you are my rock and my hope, in the past, present and future. I

love you and together we are ready to face the world.


The Tinbergen Institute is the Institute for Economic Research, which was founded in 1987 by the Faculties of Economics and Econometrics of the Erasmus University Rotterdam, University of Amsterdam and VU University Amsterdam. The Institute is named after the late Professor Jan Tinbergen, Dutch Nobel Prize laureate in economics in 1969. The Tinbergen Institute is located in Amsterdam and Rotterdam. The following books recently appeared in the Tinbergen Institute Research Series:

453. S. MEENTS, The Influence of Sellers and the Intermediary on Buyers’ Trust in C2C Electronic Marketplaces.

454. S. VUJIC, Econometric Studies to the Economic and Social Factors of Crime.

455. F. HEUKELOM, Kahneman and Tversky and the Making of Behavioral Economics.

456. G. BUDAI-BALKE, Operations Research Models for Scheduling Railway Infrastructure

Maintenance.

457. T.R. DANIELS, Rationalised Panics: The Consequences of Strategic Uncertainty during Financial Crises.

458. A. VAN DIJK, Essays on Finite Mixture Models.

459. C.P.B.J. VAN KLAVEREN, The Intra-household Allocation of Time.

460. O.E. JONKEREN, Adaptation to Climate Change in Inland Waterway Transport.

461. S.C. GO, Marine Insurance in the Netherlands 1600-1870, A Comparative Institutional

Approach.

462. J. NIEMCZYK, Consequences and Detection of Invalid Exogeneity Conditions.

463. I. BOS, Incomplete Cartels and Antitrust Policy: Incidence and Detection.

464. M. KRAWCZYK, Affect and risk in social interactions and individual decision-making.

465. T.C. LIN, Three Essays on Empirical Asset Pricing.

466. J.A. BOLHAAR, Health Insurance: Selection, Incentives and Search.

467. T. FARENHORST-YUAN, Efficient Simulation Algorithms for Optimization of Discrete Event Based on Measure Valued Differentiation.

468. M.I. OCHEA, Essays on Nonlinear Evolutionary Game Dynamics.

469. J.L.W. VAN KIPPERSLUIS, Understanding Socioeconomic Differences in Health: An

Economic Approach.

470. A. AL-IBRAHIM, Dynamic Delay Management at Railways: A Semi-Markovian Decision Approach.


471. R.P. FABER, Prices and Price Setting.

472. J. HUANG, Education and Social Capital: Empirical Evidences from Microeconomic

Analyses.

473. J.W. VAN DER STRAATEN, Essays on Urban Amenities and Location Choice.

474. K.M. LEE, Filtering Non Linear State Space Models: Methods and Economic Applications.

475. M.J. REINDERS, Managing Consumer Resistance to Innovations.

476. A. PARAKHONYAK, Essays on Consumer Search, Dynamic Competition and Regulation.

477. S. GUPTA, The Study of Impact of Early Life Conditions on Later Life Events: A Look Across the Individual’s Life Course.

478. J. LIU, Breaking the Ice between Government and Business: From IT Enabled Control

Procedure Redesign to Trusted Relationship Building.

479. D. RUSINOVA, Economic Development and Growth in Transition Countries.

480. H. WU, Essays on Top Management and corporate behavior.

481. X. LIU, Three Essays on Real Estate Finance.

482. E.L.W. JONGEN, Modelling the Impact of Labour Market Policies in the Netherlands.

483. M.J. SMIT, Agglomeration and Innovations: Evidence from Dutch Microdata.

484. S. VAN BEKKUM, What is Wrong With Pricing Errors? Essays on Value Price Divergence.

485. X. HU, Essays on Auctions.

486. A.A. DUBOVIK, Economic Dances for Two (and Three).

487. A.M. LIZYAYEV, Stochastic Dominance in Portfolio Analysis and Asset Pricing.

488. B. SCHWAAB, Credit Risk and State Space Methods.

489. N. BAŞTÜRK, Essays on parameter heterogeneity and model uncertainty.

490. E. GUTIÉRREZ PUIGARNAU, Labour markets, commuting and company cars.

491. M.W. VORAGE, The Politics of Entry.

492. A.N. HALSEMA, Essays on Resource Management: Ownership, Market Structures and

Exhaustibility.


493. R.E. VLAHU, Three Essays on Banking.

494. N.E. VIKANDER, Essays on Teams and the Social Side of Consumption.

495. E. DEMIREL, Economic Models for Inland Navigation in the Context of Climate Change.

496. V.A.C. VAN DEN BERG, Congestion pricing with Heterogeneous travellers.

497. E.R. DE WIT, Liquidity and Price Discovery in Real Estate Assets.

498. C. LEE, Psychological Aspects of the Disposition Effect: An Experimental Investigation.

499. M.H.A. RIDHWAN, Regional Dimensions of Monetary Policy in Indonesia.

500. J. GARCÍA, The moral herd: Groups and the Evolution of Altruism and Cooperation.

501. F.H. LAMP, Essays in Corporate Finance and Accounting.

502. J. SOL, Incentives and Social Relations in the Workplace.