Presentation of SMC^2 at BISP7
DESCRIPTION
This is a short presentation for a 15-minute talk at Bayesian Inference for Stochastic Processes 7, on the SMC^2 algorithm: http://arxiv.org/abs/1101.1528

TRANSCRIPT
Introduction and State Space Models
Quick reminder on Sequential Monte Carlo
Particle Markov Chain Monte Carlo
SMC2
SMC2: A sequential Monte Carlo algorithm with particle Markov chain Monte Carlo updates
N. CHOPIN1, P.E. JACOB2, & O. PAPASPILIOPOULOS3
BISP7 – September, 2011
1 ENSAE-CREST, 2 CREST & Université Paris Dauphine, funded by AXA Research, 3 Universitat Pompeu Fabra
N. CHOPIN, P.E. JACOB, & O. PAPASPILIOPOULOS SMC2 1/ 16
State Space Models
A system of equations
Hidden states: p(x1|θ) = µθ(x1) and for t = 1, . . . ,T :
p(xt+1|x1:t , θ) = fθ(xt+1|xt)
Observations:
p(yt |y1:t−1, x1:t , θ) = gθ(yt |xt)
Parameter: θ ∈ Θ, prior p(θ).
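As a concrete illustration (not in the slides), the system of equations can be instantiated on a toy linear-Gaussian model; the model, the parameter θ, and all function names below are hypothetical stand-ins for µθ, fθ and gθ:

```python
import numpy as np

def simulate_ssm(theta, T, rng):
    """Simulate a toy linear-Gaussian instance of the system above
    (hypothetical example): mu_theta = N(0, 1),
    f_theta(x_{t+1}|x_t) = N(theta * x_t, 1), g_theta(y_t|x_t) = N(x_t, 1)."""
    x = np.empty(T)
    y = np.empty(T)
    x[0] = rng.normal(0.0, 1.0)                      # x_1 ~ mu_theta
    for t in range(T):
        if t > 0:
            x[t] = rng.normal(theta * x[t - 1], 1.0)  # transition f_theta
        y[t] = rng.normal(x[t], 1.0)                  # observation g_theta
    return x, y

rng = np.random.default_rng(0)
x, y = simulate_ssm(theta=0.9, T=100, rng=rng)
```

Any model with tractable µθ, fθ (to simulate from) and gθ (to evaluate) fits the same template.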
Sequential Monte Carlo for filtering
Suppose we are interested in pθ(xT |y1:T ), for a given θ.
General idea
Sample recursively from pθ(xt |y1:t) to pθ(xt+1|y1:t+1).
After the SMC run, we can approximate the likelihood:
Z_T(θ) = p(y1:T |θ) = p(y1|θ) ∏_{t=2}^{T} p(yt |y1:t−1, θ)

with an unbiased estimate Z_T^{Nx}(θ).
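A minimal sketch of such a filter, on the same hypothetical linear-Gaussian toy model as above (the model and the placeholder data are assumptions, not from the slides); it returns the log of the likelihood estimate, and it is the estimate of the likelihood itself, not its log, that is unbiased:

```python
import numpy as np

def bootstrap_filter_loglik(theta, y, Nx, rng):
    """Bootstrap particle filter for the toy model
    x_1 ~ N(0,1), x_{t+1}|x_t ~ N(theta*x_t, 1), y_t|x_t ~ N(x_t, 1).
    Returns an estimate of log p(y_{1:T} | theta)."""
    x = rng.normal(0.0, 1.0, size=Nx)                # draw from mu_theta
    loglik = 0.0
    for t, yt in enumerate(y):
        if t > 0:
            x = rng.normal(theta * x, 1.0)           # propagate through f_theta
        logw = -0.5 * (yt - x) ** 2 - 0.5 * np.log(2 * np.pi)  # g_theta(yt|xt)
        m = logw.max()
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean())   # estimate of p(yt | y_{1:t-1}, theta)
        x = x[rng.choice(Nx, size=Nx, p=w / w.sum())]  # multinomial resampling
    return loglik

rng = np.random.default_rng(1)
y = rng.normal(size=50)                              # placeholder data
ll = bootstrap_filter_loglik(0.8, y, Nx=200, rng=rng)
```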
Sequential Monte Carlo Samplers
Same kind of method, but used to perform Bayesian inference on the parameter:
p(θ|y1:T )
General idea
Sample recursively from p(θ|y1:t) to p(θ|y1:t+1).
MCMC moves to diversify the particles.
Requires the ability to compute point-wise p(yt |y1:t−1, θ).
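One reweighting step of such a θ-sampler can be sketched as follows; the trivial model used for the incremental likelihood and all names are illustrative assumptions, and the MCMC diversification move that would follow resampling is omitted:

```python
import numpy as np

def ibis_reweight(thetas, logw, incr_loglik, yt, ess_ratio, rng):
    """One reweighting step of an SMC sampler on theta: multiply each
    theta-particle's weight by p(y_t | y_{1:t-1}, theta), and resample
    when the effective sample size falls below ess_ratio * N."""
    N = len(thetas)
    logw = logw + np.array([incr_loglik(th, yt) for th in thetas])
    w = np.exp(logw - logw.max())
    ess = w.sum() ** 2 / (w ** 2).sum()              # effective sample size
    if ess < ess_ratio * N:
        idx = rng.choice(N, size=N, p=w / w.sum())   # multinomial resampling
        thetas, logw = thetas[idx], np.zeros(N)
    return thetas, logw

rng = np.random.default_rng(2)
thetas = rng.normal(size=100)                        # theta-particles from the prior
logw = np.zeros(100)
# trivial model y_t ~ N(theta, 1), so the incremental log-likelihood is Gaussian:
incr = lambda th, yt: -0.5 * (yt - th) ** 2
thetas, logw = ibis_reweight(thetas, logw, incr, yt=0.3, ess_ratio=0.5, rng=rng)
```

The catch for state space models is precisely the `incr_loglik` argument: p(yt |y1:t−1, θ) is not available point-wise there.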
Idealized Metropolis–Hastings for SSM
Motivation
Bayesian parameter inference in state space models:
p(θ|y1:T )
If only. . .
. . . we could compute p(θ|y1:T ) ∝ p(θ)p(y1:T |θ), we could run an MH algorithm.
Valid Metropolis–Hastings for SSM
Plug in estimates
We have Z_T^{Nx}(θ) ≈ p(y1:T |θ) by running an SMC filter, and we can try to run an MH algorithm using this estimate instead of the exact likelihood.
Particle MCMC
This is called Particle Marginal Metropolis–Hastings (PMMH), introduced by Andrieu, Doucet and Holenstein.
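Put together, a PMMH chain looks like the sketch below, again on the hypothetical linear-Gaussian toy model with placeholder data and a standard normal prior (all assumptions of this example, not details from the slides):

```python
import numpy as np

def pf_loglik(theta, y, Nx, rng):
    """Bootstrap-filter log-likelihood estimate for the toy model
    x_{t+1}|x_t ~ N(theta*x_t, 1), y_t|x_t ~ N(x_t, 1) (hypothetical)."""
    x = rng.normal(0.0, 1.0, size=Nx)
    ll = 0.0
    for t, yt in enumerate(y):
        if t > 0:
            x = rng.normal(theta * x, 1.0)
        logw = -0.5 * (yt - x) ** 2 - 0.5 * np.log(2 * np.pi)
        m = logw.max()
        w = np.exp(logw - m)
        ll += m + np.log(w.mean())
        x = x[rng.choice(Nx, size=Nx, p=w / w.sum())]
    return ll

def pmmh(y, n_iter, Nx, prior_logpdf, step, theta0, rng):
    """Particle marginal Metropolis-Hastings: a random-walk MH chain on
    theta in which the intractable likelihood is replaced by the particle
    filter estimate; the chain still targets p(theta | y_{1:T})."""
    theta, ll = theta0, pf_loglik(theta0, y, Nx, rng)
    chain = np.empty(n_iter)
    for i in range(n_iter):
        prop = theta + step * rng.normal()           # random-walk proposal
        ll_prop = pf_loglik(prop, y, Nx, rng)
        log_alpha = ll_prop + prior_logpdf(prop) - ll - prior_logpdf(theta)
        if np.log(rng.uniform()) < log_alpha:        # accept/reject
            theta, ll = prop, ll_prop
        chain[i] = theta
    return chain

rng = np.random.default_rng(3)
y = rng.normal(size=30)                              # placeholder data
chain = pmmh(y, n_iter=50, Nx=100,
             prior_logpdf=lambda th: -0.5 * th ** 2,  # N(0,1) prior, up to a constant
             step=0.2, theta0=0.0, rng=rng)
```

The key point is that the accepted log-likelihood estimate `ll` is carried along with `theta`, never recomputed for the current state.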
Our contribution. . .
. . . was to use the same method to obtain a valid SMC sampler for state space models.
Foreseen benefits
to sample more efficiently from the posterior distribution p(θ|y1:T ),
to sample sequentially from p(θ|y1), p(θ|y1, y2), . . . , p(θ|y1:T ),
and, as it turns out, it allows even a bit more.
Valid SMC sampler for SSM
Plug in estimates
Similarly to PMCMC methods, we want to replace
p(yt |y1:t−1, θ)
with an unbiased estimate, and see what happens.
SMC everywhere
We associate Nx x-particles with each of the Nθ θ-particles;
these are used to compute estimates of the incremental likelihoods for each θ-particle.
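One assimilation step of this "SMC everywhere" construction can be sketched as follows, on the same hypothetical toy model; the θ-resampling and PMCMC rejuvenation steps of the full algorithm are omitted, and all sizes and data are placeholder assumptions:

```python
import numpy as np

def smc2_step(thetas, logw, xs, yt, t, rng):
    """One assimilation step of an SMC^2-style sampler on the toy model
    x_{t+1}|x_t ~ N(theta*x_t, 1), y_t|x_t ~ N(x_t, 1) (hypothetical).
    Each theta-particle i carries its own Nx x-particles xs[i]; its
    incremental weight is that filter's estimate of p(y_t|y_{1:t-1},theta_i)."""
    Ntheta, Nx = xs.shape
    for i in range(Ntheta):
        x = xs[i]
        if t > 0:
            x = rng.normal(thetas[i] * x, 1.0)       # propagate x-particles
        logwx = -0.5 * (yt - x) ** 2 - 0.5 * np.log(2 * np.pi)
        m = logwx.max()
        wx = np.exp(logwx - m)
        logw[i] += m + np.log(wx.mean())             # incremental likelihood estimate
        xs[i] = x[rng.choice(Nx, size=Nx, p=wx / wx.sum())]  # resample x-particles
    return thetas, logw, xs

rng = np.random.default_rng(4)
Ntheta, Nx = 50, 100
thetas = rng.normal(size=Ntheta)                     # theta-particles from the prior
logw = np.zeros(Ntheta)
xs = rng.normal(size=(Ntheta, Nx))                   # x-particles from mu_theta
for t, yt in enumerate(rng.normal(size=10)):         # placeholder observations
    thetas, logw, xs = smc2_step(thetas, logw, xs, yt, t, rng)
```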
Side benefits
Evidence
SMC2 provides an estimate of the “evidence”:
p(y1:t) = ∏_{s=1}^{t} p(ys |y1:s−1)
Automatic tuning
θ-particles are moved with adaptive particle MCMC steps,
the number Nx of x-particles can be dynamically increased if need be.
Numerical illustrations: Stochastic Volatility
Figure: The S&P 500 data from 03/01/2005 to 21/12/2007.
Numerical illustrations: Stochastic Volatility
Stochastic Volatility model
Observations (“log returns”):
yt = µ + βvt + vt^{1/2} εt , εt ∼ N(0, 1)

Hidden states: the “actual volatility” (vt), a process that depends on another process, the “spot volatility” (zt).
All these processes are parameterized by θ = (µ, β, ξ, ω², λ).
Numerical illustrations: Stochastic Volatility
[Panels: posterior density of µ at T = 250, T = 500, T = 750 and T = 1000, each plotted on the range −1.0 to 1.0.]
Figure: Concentration of the posterior distribution for parameter µ.
Numerical illustrations: Stochastic Volatility
Model comparison
For the same problem there could be various models that we want to compare. Here:
the “basic” previous model,
a similar model with more factors (= more hidden states),
a similar model with more factors and “leverage” (= a different likelihood function with more parameters).
Numerical illustrations: Stochastic Volatility
[Two panels. (a): squared observations against time. (b): evidence compared to the one-factor model, over iterations, for the multi-factor model without leverage and the multi-factor model with leverage.]
Figure: Left: observations; right: log-evidence relative to the basic model.
Conclusion
A powerful framework
The SMC2 framework makes it possible to obtain various quantities of interest, especially for sequential analysis.
It extends the PMCMC framework introduced by Andrieu, Doucet and Holenstein.
A Python package is available:
http://code.google.com/p/py-smc2/.
Bibliography
SMC2: A sequential Monte Carlo algorithm with particle Markov chain Monte Carlo updates, N. Chopin, P.E. Jacob, O. Papaspiliopoulos, submitted, available on arXiv.

Main references:
Particle Markov Chain Monte Carlo methods, C. Andrieu, A.Doucet, R. Holenstein, JRSS B., 2010, 72(3):269–342
The pseudo-marginal approach for efficient Monte Carlo computations, C. Andrieu, G.O. Roberts, Ann. Statist., 2009, 37:697–725
Random weight particle filtering of continuous time processes,P. Fearnhead, O. Papaspiliopoulos, G.O. Roberts, A. Stuart,JRSS B., 2010, 72:497–513
Feynman-Kac Formulae, P. Del Moral, Springer