TEMPORAL PROBABILISTIC MODELS


Page 1: Temporal Probabilistic Models

TEMPORAL PROBABILISTIC MODELS

Page 2: Temporal Probabilistic Models

MOTIVATION

Observing a stream of data:
- Monitoring (of people, computer systems, etc.)
- Surveillance, tracking
- Finance & economics
- Science

Questions:
- Modeling & forecasting
- Unobserved variables

Page 3: Temporal Probabilistic Models

TIME SERIES MODELING

- Time occurs in steps t = 0, 1, 2, …
  - A time step can be seconds, days, years, etc.
- State variable X_t, t = 0, 1, 2, …
- For partially observed problems, we see observations O_t, t = 1, 2, … and do not see the X's
  - The X's are hidden variables (aka latent variables)

Page 4: Temporal Probabilistic Models

MODELING TIME

- Arrow of time
- Causality? Bayesian networks to the rescue

[Diagram: arrows from causes to effects]

Page 5: Temporal Probabilistic Models

PROBABILISTIC MODELING

- For now, assume the fully observable case
- What parents?

[Diagram: two candidate networks over X0, X1, X2, X3]

Page 6: Temporal Probabilistic Models

MARKOV ASSUMPTION

Assume X_{t+k} is independent of all X_i for i < t, given the intervening states X_t, …, X_{t+k-1}:

P(X_{t+k} | X_0, …, X_{t+k-1}) = P(X_{t+k} | X_t, …, X_{t+k-1})

This defines a k-th order Markov chain.

[Diagram: chains over X0, X1, X2, X3 of order 0, 1, 2, and 3]

Page 7: Temporal Probabilistic Models

1ST ORDER MARKOV CHAIN

- MCs of order k > 1 can be converted into a 1st order MC by augmenting the state [left as exercise; see the sketch below]
- So w.l.o.g., "MC" refers to a 1st order MC

[Diagram: chain X0 → X1 → X2 → X3]
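
As a sketch of that exercise for the 2nd-order case (the function name and dict layout are illustrative assumptions): augment the state to the pair of the last two values, so the augmented chain is 1st order.

# Sketch: convert a 2nd-order Markov chain into a 1st-order one by
# augmenting the state to (previous value, current value).
# Assumed layout: p2[(a, b)][c] = P(X_t = c | X_{t-2} = a, X_{t-1} = b).
def to_first_order(p2):
    p1 = {}
    for (a, b), dist in p2.items():
        # The augmented next state (b, c) depends only on the pair (a, b).
        p1[(a, b)] = {(b, c): pr for c, pr in dist.items()}
    return p1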

Page 8: Temporal Probabilistic Models

INFERENCE IN MC

What independence relationships can we read from the BN?

[Diagram: chain X0 → X1 → X2 → X3]

- Observe X1: X0 becomes independent of X2, X3, …
- P(X_t | X_{t-1}) is known as the transition model

Page 9: Temporal Probabilistic Models

INFERENCE IN MC

Prediction: what is the probability of a future state?

P(X_t) = Σ_{x_0, …, x_{t-1}} P(X_0, …, X_t)
       = Σ_{x_0, …, x_{t-1}} P(x_0) Π_{i=1..t} P(x_i | x_{i-1})
       = Σ_{x_{t-1}} P(X_t | x_{t-1}) P(x_{t-1})

- The last line gives an incremental approach: update P(X_{t-1}) to P(X_t) in one step (sketched below)
- The distribution "blurs" over time and approaches a stationary distribution as t grows, so prediction power is limited
- The rate of blurring is known as the mixing time
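
A minimal numpy sketch of that incremental update; the 2-state transition matrix is an illustrative assumption:

import numpy as np

T = np.array([[0.9, 0.1],   # T[i, j] = P(X_{t+1} = j | X_t = i)
              [0.2, 0.8]])
p = np.array([1.0, 0.0])    # P(X_0): start in state 0 with certainty

for t in range(50):         # p becomes P(X_t) = sum_{x_{t-1}} P(X_t | x_{t-1}) P(x_{t-1})
    p = T.T @ p

print(p)                    # approx. [2/3, 1/3], the stationary distribution of T

Printing p inside the loop would show the initial certainty blurring away step by step.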

Page 11: Temporal Probabilistic Models

HOW DOES THE MARKOV ASSUMPTION AFFECT THE CHOICE OF STATE?

Suppose we're tracking a point (x, y) in 2D. What if the point is…
- A momentumless particle subject to thermal vibration?
- A particle with velocity?
- A particle with intent, like a person?

Page 12: Temporal Probabilistic Models

HOW DOES THE MARKOV ASSUMPTION AFFECT THE CHOICE OF STATE?

Suppose the point is the position of our robot, and we observe velocity and intent. What if:
- Terrain conditions affect speed?
- Battery level affects speed?
- Position is noisy, e.g. GPS?

Page 13: Temporal Probabilistic Models

IS THE MARKOV ASSUMPTION APPROPRIATE FOR:
- A car on a slippery road?
- Sales of toothpaste?
- The stock market?

Page 14: Temporal Probabilistic Models

HISTORY DEPENDENCE

- In Markov models, the state must be chosen so that the future is independent of history given the current state
- Often this requires adding variables that cannot be directly observed

Page 15: Temporal Probabilistic Models

PARTIAL OBSERVABILITY

Hidden Markov Model (HMM)

[Diagram: hidden state variables X0 → X1 → X2 → X3, with observed variables O1, O2, O3 hanging off X1, X2, X3]

P(O_t | X_t) is called the observation model (or sensor model)
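
Concretely, a discrete HMM is specified by a prior, a transition model, and an observation model. A small 2-state example (all numbers are illustrative assumptions, reused by the later sketches):

import numpy as np

prior = np.array([0.5, 0.5])    # P(X_0)
T = np.array([[0.7, 0.3],       # T[i, j] = P(X_{t+1} = j | X_t = i)
              [0.3, 0.7]])
O = np.array([[0.9, 0.1],       # O[i, k] = P(O_t = k | X_t = i)
              [0.2, 0.8]])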

Page 16: Temporal Probabilistic Models

INFERENCE IN HMMS

- Filtering
- Prediction
- Smoothing, aka hindsight
- Most likely explanation

[Diagram: X0 → X1 → X2 → X3 with observations O1, O2, O3]

Page 17: Temporal Probabilistic Models

INFERENCE IN HMMS

- Filtering
- Prediction
- Smoothing, aka hindsight
- Most likely explanation

[Diagram: X0 → X1 → X2 with O1, O2; query variable: X2 (filtering)]

Page 18: Temporal Probabilistic Models

FILTERING

Name comes from signal processing.

P(X_t | o_{1:t}) = Σ_{x_{t-1}} P(x_{t-1} | o_{1:t-1}) P(X_t | x_{t-1}, o_t)

P(X_t | x_{t-1}, o_t) = P(o_t | x_{t-1}, X_t) P(X_t | x_{t-1}) / P(o_t | x_{t-1})
                      = α P(o_t | X_t) P(X_t | x_{t-1})

[Diagram: X0 → X1 → X2 with O1, O2; query variable: X2]

Page 19: Temporal Probabilistic Models

FILTERING

P(X_t | o_{1:t}) = α Σ_{x_{t-1}} P(x_{t-1} | o_{1:t-1}) P(o_t | X_t) P(X_t | x_{t-1})

- Forward recursion
- If we keep track of P(X_t | o_{1:t}), this gives O(1) work per time step! (See the sketch below.)

[Diagram: X0 → X1 → X2 with O1, O2; query variable: X2]
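
A sketch of the forward recursion, reusing the illustrative 2-state HMM from above; each observation triggers one constant-size update:

import numpy as np

prior = np.array([0.5, 0.5])              # illustrative HMM from the earlier sketch
T = np.array([[0.7, 0.3], [0.3, 0.7]])
O = np.array([[0.9, 0.1], [0.2, 0.8]])

def filter_step(belief, obs):
    predicted = T.T @ belief              # sum_{x_{t-1}} P(X_t | x_{t-1}) P(x_{t-1} | o_{1:t-1})
    unnorm = O[:, obs] * predicted        # multiply in P(o_t | X_t)
    return unnorm / unnorm.sum()          # normalizing plays the role of alpha

belief = prior
for obs in [0, 0, 1]:                     # a hypothetical observation sequence
    belief = filter_step(belief, obs)
print(belief)                             # P(X_3 | o_{1:3})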

Page 20: Temporal Probabilistic Models

INFERENCE IN HMMS

- Filtering
- Prediction
- Smoothing, aka hindsight
- Most likely explanation

[Diagram: X0 → X1 → X2 → X3 with O1, O2, O3; query: a future state (prediction)]

Page 21: Temporal Probabilistic Models

PREDICTION

P(X_{t+k} | o_{1:t}), in 2 steps: compute P(X_t | o_{1:t}), then P(X_{t+k} | X_t)

- Filter, then predict as with a standard MC (a sketch follows)

[Diagram: X0 → X1 → X2 → X3 with O1, O2, O3; query: X3]
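
A short sketch of the second step, continuing the filtering example (the filtered belief shown is a made-up placeholder):

import numpy as np

T = np.array([[0.7, 0.3], [0.3, 0.7]])   # illustrative transition model
belief = np.array([0.9, 0.1])            # P(X_t | o_{1:t}) from filtering (placeholder)

k = 5
for _ in range(k):                        # propagate k steps with no observations
    belief = T.T @ belief
print(belief)                             # P(X_{t+k} | o_{1:t}); blurs as k grows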

Page 22: Temporal Probabilistic Models

INFERENCE IN HMMS

- Filtering
- Prediction
- Smoothing, aka hindsight
- Most likely explanation

[Diagram: X0 → X1 → X2 → X3 with O1, O2, O3; query: an earlier state (smoothing)]

Page 23: Temporal Probabilistic Models

SMOOTHING

P(X_k | o_{1:t}) for k < t:

P(X_k | o_{1:k}, o_{k+1:t}) = P(o_{k+1:t} | X_k, o_{1:k}) P(X_k | o_{1:k}) / P(o_{k+1:t} | o_{1:k})
                            = α P(o_{k+1:t} | X_k) P(X_k | o_{1:k})

The second factor comes from standard filtering to time k.

[Diagram: X0 → X1 → X2 → X3 with O1, O2, O3; query: X_k]

Page 24: Temporal Probabilistic Models

SMOOTHING

Computing P(o_{k+1:t} | X_k):

P(o_{k+1:t} | X_k) = Σ_{x_{k+1}} P(o_{k+1:t} | X_k, x_{k+1}) P(x_{k+1} | X_k)
                   = Σ_{x_{k+1}} P(o_{k+1:t} | x_{k+1}) P(x_{k+1} | X_k)
                   = Σ_{x_{k+1}} P(o_{k+2:t} | x_{k+1}) P(o_{k+1} | x_{k+1}) P(x_{k+1} | X_k)

This is a backward recursion: given the next state, what's the probability of the remaining observation sequence? (A sketch follows.)

[Diagram: X0 → X1 → X2 → X3 with O1, O2, O3]
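
A sketch of the backward recursion combined with a filtered forward message (numbers again from the illustrative HMM; the forward message is a placeholder):

import numpy as np

T = np.array([[0.7, 0.3], [0.3, 0.7]])    # illustrative transition model
O = np.array([[0.9, 0.1], [0.2, 0.8]])    # illustrative observation model

def backward(future_obs):
    b = np.ones(T.shape[0])               # b(x) = P(o_{k+1:t} | X_k = x)
    for obs in reversed(future_obs):      # fold in o_t, then o_{t-1}, ...
        b = T @ (O[:, obs] * b)
    return b

forward = np.array([0.6, 0.4])            # P(X_k | o_{1:k}) from filtering (placeholder)
smoothed = forward * backward([1, 1])     # alpha P(o_{k+1:t} | X_k) P(X_k | o_{1:k})
print(smoothed / smoothed.sum())          # P(X_k | o_{1:t})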

Page 25: Temporal Probabilistic Models

INFERENCE IN HMMS

- Filtering
- Prediction
- Smoothing, aka hindsight
- Most likely explanation

[Diagram: X0 → X1 → X2 → X3 with O1, O2, O3]

The query returns a path through state space x0, …, x3.

Page 26: Temporal Probabilistic Models

MLE: VITERBI ALGORITHM

Recursive computation of the maximum likelihood of the path to each x_t in Val(X_t):

m_t(X_t) = max_{x_{1:t-1}} P(x_1, …, x_{t-1}, X_t | o_{1:t})
         = α P(o_t | X_t) max_{x_{t-1}} P(X_t | x_{t-1}) m_{t-1}(x_{t-1})

Previous ML state: argmax_{x_{t-1}} P(X_t | x_{t-1}) m_{t-1}(x_{t-1})
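
A compact sketch of the recursion, storing the argmax pointers so the most likely path can be read back (illustrative HMM numbers again):

import numpy as np

prior = np.array([0.5, 0.5])               # illustrative HMM
T = np.array([[0.7, 0.3], [0.3, 0.7]])
O = np.array([[0.9, 0.1], [0.2, 0.8]])

def viterbi(observations):
    m = prior * O[:, observations[0]]      # m_1(X_1), unnormalized
    back = []
    for obs in observations[1:]:
        scores = T * m[:, None]            # scores[i, j] = P(X_t = j | x_{t-1} = i) m(i)
        back.append(scores.argmax(axis=0)) # previous ML state for each value of X_t
        m = O[:, obs] * scores.max(axis=0) # m_t(X_t)
    path = [int(m.argmax())]               # follow the pointers backward
    for ptrs in reversed(back):
        path.append(int(ptrs[path[-1]]))
    return path[::-1]

print(viterbi([0, 0, 1, 1]))               # most likely state sequence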

Page 27: Temporal Probabilistic Models

APPLICATIONS OF HMMS IN NLP

- Speech recognition
- Hidden phones (e.g., ah eh ee th r)
- Observed, noisy acoustic features (produced by signal processing)

Page 28: Temporal Probabilistic Models

PHONE OBSERVATION MODELS

[Diagram: Phone_t → signal processing → feature vector, e.g. (24, 13, 3, 59) → Features_t]

The model is defined to be robust over variations in accent, speed, pitch, and noise.

Page 29: Temporal Probabilistic Models

PHONE TRANSITION MODELS

[Diagram: Phone_t → Phone_{t+1}, each emitting Features_t]

Good models will capture (among other things):
- Pronunciation of words
- Subphone structure
- Coarticulation effects

Triphone models = order 3 Markov chain

Page 30: Temporal Probabilistic Models

WORD SEGMENTATION

- Words run together when pronounced
- Unigrams: P(w_i)
- Bigrams: P(w_i | w_{i-1})
- Trigrams: P(w_i | w_{i-1}, w_{i-2})

Random 20-word samples from R&N generated using N-gram models:

- "Logical are as confusion a may right tries agent goal the was diesel more object then information-gathering search is"
- "Planning purely diagnostic expert systems are very similar computational approach would be represented compactly using tic tac toe a predicate"
- "Planning and scheduling are integrated the success of naïve bayes model is just a possible prior source by that time"
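
A toy sketch of how such samples are generated from a bigram model; the corpus below is a made-up stand-in for the R&N text:

import random
from collections import defaultdict

corpus = "planning and scheduling are integrated and planning is search".split()

followers = defaultdict(list)            # empirical P(w_i | w_{i-1}) as follower lists
for prev, word in zip(corpus, corpus[1:]):
    followers[prev].append(word)

word = random.choice(corpus)
sample = [word]
for _ in range(9):                       # draw a 10-word sample
    word = random.choice(followers[word] or corpus)   # restart at dead ends
    sample.append(word)
print(" ".join(sample))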

Page 31: Temporal Probabilistic Models

TRICKS TO IMPROVE RECOGNITION

- Narrow the # of variables: digits, yes/no, phone tree
- Training with real user data
  - Real story: "Yes ma'am"

Page 32: Temporal Probabilistic Models

KALMAN FILTERING

In a nutshell:
- Efficient filtering in continuous state spaces
- Gaussian transition and observation models
- Ubiquitous for tracking with noisy sensors, e.g. radar, GPS, cameras

Page 33: Temporal Probabilistic Models

HIDDEN MARKOV MODEL FOR ROBOT LOCALIZATION

Use observations to get a better idea of where the robot is at time t.

[Diagram: hidden state variables X0 → X1 → X2 → X3, with observed variables z1, z2, z3]

Predict – observe – predict – observe…

Page 34: Temporal Probabilistic Models

LINEAR GAUSSIAN TRANSITION MODEL

- Consider position and velocity x_t, v_t, with time step h
- Without noise:
  x_{t+1} = x_t + h v_t
  v_{t+1} = v_t
- With Gaussian noise of std σ1:
  P(x_{t+1} | x_t) ∝ exp(-(x_{t+1} - (x_t + h v_t))² / (2σ1²))
  i.e. x_{t+1} ~ N(x_t + h v_t, σ1²)

Page 35: Temporal Probabilistic Models

LINEAR GAUSSIAN TRANSITION MODEL

If the prior on position is Gaussian, then the posterior is also Gaussian: the mean shifts by v h and the noise variance adds on.

N(μ, σ²) → N(μ + v h, σ² + σ1²)

Page 36: Temporal Probabilistic Models

LINEAR GAUSSIAN OBSERVATION MODEL

Position observation z_t with Gaussian noise of std σ2:

z_t ~ N(x_t, σ2²)

Page 37: Temporal Probabilistic Models

LINEAR GAUSSIAN OBSERVATION MODEL

If the prior on position is Gaussian, then the posterior is also Gaussian:

μ ← (σ² z + σ2² μ) / (σ² + σ2²)
σ² ← σ² σ2² / (σ² + σ2²)

[Plot: position prior, observation probability, and the resulting posterior probability]
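
A 1D sketch combining the transition update above with this observation update; all parameters are illustrative assumptions, and everything is expressed in variances:

def kalman_1d(mu, var, z, v, h, var1, var2):
    # Predict: x_{t+1} = x_t + h*v, with transition noise variance var1.
    mu_pred, var_pred = mu + h * v, var + var1
    # Update with z ~ N(x, var2): the precision-weighted average above.
    mu_new = (var2 * mu_pred + var_pred * z) / (var_pred + var2)
    var_new = var_pred * var2 / (var_pred + var2)
    return mu_new, var_new

mu, var = 0.0, 1.0                 # hypothetical Gaussian prior on position
for z in [1.2, 1.9, 3.1]:          # hypothetical position observations
    mu, var = kalman_1d(mu, var, z, v=1.0, h=1.0, var1=0.1, var2=0.5)
    print(mu, var)                 # posterior mean and variance after each step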

Page 38: Temporal Probabilistic Models

MULTIVARIATE CASE

- Transition matrix F, covariance Σx
- Observation matrix H, covariance Σz

μ_{t+1} = F μ_t + K_{t+1} (z_{t+1} - H F μ_t)
Σ_{t+1} = (I - K_{t+1} H)(F Σ_t Fᵀ + Σx)

where K_{t+1} = (F Σ_t Fᵀ + Σx) Hᵀ (H (F Σ_t Fᵀ + Σx) Hᵀ + Σz)⁻¹

Got that memorized? (A sketch follows.)
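
A direct transcription of those equations into numpy; the 2D constant-velocity model below is an illustrative assumption:

import numpy as np

def kalman_update(mu, Sigma, z, F, H, Sx, Sz):
    P = F @ Sigma @ F.T + Sx                          # predicted covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + Sz)     # Kalman gain K_{t+1}
    mu_new = F @ mu + K @ (z - H @ F @ mu)
    Sigma_new = (np.eye(len(mu)) - K @ H) @ P
    return mu_new, Sigma_new

# Hypothetical 2D state (position, velocity) observed through position only:
F = np.array([[1.0, 1.0],      # x' = x + v (time step h = 1)
              [0.0, 1.0]])     # v' = v
H = np.array([[1.0, 0.0]])
Sx, Sz = 0.01 * np.eye(2), np.array([[0.25]])
mu, Sigma = np.zeros(2), np.eye(2)
mu, Sigma = kalman_update(mu, Sigma, np.array([1.1]), F, H, Sx, Sz)
print(mu, Sigma)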

Page 39: Temporal Probabilistic Models

PROPERTIES OF KALMAN FILTER

- Optimal Bayesian estimate for linear Gaussian transition/observation models
- Need estimates of covariance… model identification necessary
- Extensions to nonlinear transition/observation models work as long as they aren't too nonlinear
  - Extended Kalman Filter
  - Unscented Kalman Filter

Page 40: Temporal Probabilistic Models

PROPERTIES OF KALMAN FILTER

- Optimal Bayesian estimate for linear Gaussian transition/observation models
- Need estimates of covariance… model identification necessary
- Extensions to nonlinear systems:
  - Extended Kalman Filter: linearize the models
  - Unscented Kalman Filter: pass points through the nonlinear model to reconstruct a Gaussian
  - Work as long as the systems aren't too nonlinear

Page 41: Temporal Probabilistic Models

NON-GAUSSIAN DISTRIBUTIONS

Gaussian distributions are a single "lump".

[Plot: a distribution and its Kalman filter estimate]

Page 42: Temporal Probabilistic Models

NON-GAUSSIAN DISTRIBUTIONS

- Integrating continuous and discrete states
- Splitting with a binary choice

[Plot: distribution splitting into "up" and "down" branches]

Page 43: Temporal Probabilistic Models

EXAMPLE: FAILURE DETECTION

Consider a battery meter sensor:
- Battery = true level of battery
- BMeter = sensor reading

Failure modes:
- Transient failures: send garbage at time t
- Persistent failures: send garbage forever

Page 44: Temporal Probabilistic Models

EXAMPLE: FAILURE DETECTION

Consider a battery meter sensor:
- Battery = true level of battery
- BMeter = sensor reading

- Transient failure: sends garbage at time t, e.g. 5555500555…
- Persistent failure: sensor is broken, e.g. 5555500000…

Page 45: Temporal Probabilistic Models

DYNAMIC BAYESIAN NETWORK

[Diagram: Battery_{t-1} → Battery_t → BMeter_t]

BMeter_t ~ N(Battery_t, σ²)

(Think of this structure "unrolled" forever…)

Page 46: Temporal Probabilistic Models

DYNAMIC BAYESIAN NETWORK

[Diagram: Battery_{t-1} → Battery_t → BMeter_t]

BMeter_t ~ N(Battery_t, σ²)

Transient failure model: P(BMeter_t = 0 | Battery_t = 5) = 0.03
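
One way to combine the two pieces into a single observation likelihood is a mixture; a sketch, where the mixture form and the value of sigma are my assumptions (the slide only specifies the two components):

import math

def bmeter_likelihood(reading, battery, sigma=0.1, p_fail=0.03):
    # With prob. p_fail the meter transiently reads 0 regardless of the level;
    # otherwise the reading is Gaussian around the true battery level.
    gauss = math.exp(-(reading - battery) ** 2 / (2 * sigma ** 2)) \
            / (sigma * math.sqrt(2 * math.pi))
    if reading == 0:
        return p_fail + (1 - p_fail) * gauss
    return (1 - p_fail) * gauss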

Page 47: Temporal Probabilistic Models

RESULTS ON TRANSIENT FAILURE

[Plot: E(Battery_t) over time, with and without the transient failure model; a transient failure occurs while the meter reads 55555005555…]

Page 48: Temporal Probabilistic Models

RESULTS ON PERSISTENT FAILURE

[Plot: E(Battery_t) over time under the transient failure model; a persistent failure occurs while the meter reads 5555500000…]

Page 49: Temporal Probabilistic Models

PERSISTENT FAILURE MODEL

[Diagram: Battery_{t-1} → Battery_t → BMeter_t, plus Broken_{t-1} → Broken_t → BMeter_t]

BMeter_t ~ N(Battery_t, σ²)
P(BMeter_t = 0 | Battery_t = 5) = 0.03
P(BMeter_t = 0 | Broken_t) = 1

Example of a Dynamic Bayesian Network (DBN)

Page 50: Temporal Probabilistic Models

RESULTS ON PERSISTENT FAILURE

[Plot: E(Battery_t) over time with the transient model vs. the persistent failure model; a persistent failure occurs while the meter reads 5555500000…]

Page 51: Temporal Probabilistic Models

HOW TO PERFORM INFERENCE ON A DBN?

- Exact inference on the "unrolled" BN
  - Variable elimination: eliminate old time steps
  - But after a few time steps, all variables in the state space become dependent! The sparsity structure is lost
- Approximate inference
  - Particle filtering

Page 52: Temporal Probabilistic Models

PARTICLE FILTERING (AKA SEQUENTIAL MONTE CARLO)

- Represent distributions as a set of particles
- Applicable to non-Gaussian, high-dimensional distributions
- Convenient implementations
- Widely used in vision, robotics

Page 53: Temporal Probabilistic Models

PARTICLE REPRESENTATION

Bel(x_t) = {(w^k, x^k)}
- w^k are weights, x^k are state hypotheses
- Weights sum to 1
- Approximates the underlying distribution

Page 54: Temporal Probabilistic Models

PARTICLE FILTERING

Represent the distribution at time t as a set of N "particles" S_t^1, …, S_t^N.

Repeat for t = 0, 1, 2, …:
1. Sampling step: sample S[i] from P(X_{t+1} | X_t = S_t^i) for all i
2. Compute the weight w[i] = P(e | X_{t+1} = S[i]) for all i, where e is the observation at t+1
3. Weighted resampling step: sample each S_{t+1}^i from S[·] according to the weights w[·]

A sketch follows.
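
A generic sketch of one step of this loop, applied to the illustrative 2-state HMM from earlier (all numbers are assumptions of the sketch):

import numpy as np

T = np.array([[0.7, 0.3], [0.3, 0.7]])    # illustrative transition model
O = np.array([[0.9, 0.1], [0.2, 0.8]])    # illustrative observation model
rng = np.random.default_rng(0)

def pf_step(particles, obs):
    # 1. Sampling step: push each particle through the transition model.
    proposed = [int(rng.choice(2, p=T[x])) for x in particles]
    # 2. Weight each particle by the observation likelihood P(e | x).
    w = np.array([O[x, obs] for x in proposed], dtype=float)
    w /= w.sum()
    # 3. Weighted resampling: draw N new particles according to the weights.
    idx = rng.choice(len(proposed), size=len(proposed), p=w)
    return [proposed[i] for i in idx]

particles = [int(x) for x in rng.integers(0, 2, size=100)]
for obs in [0, 0, 1]:
    particles = pf_step(particles, obs)
print(np.bincount(particles, minlength=2) / len(particles))  # approx. P(X_t | o_{1:t})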

Page 55: Temporal Probabilistic Models

BATTERY EXAMPLE

Sampling step.

[Diagram: battery DBN (Battery, Broken, BMeter) with particles propagated through the transition model]

Page 56: Temporal Probabilistic Models

BATTERY EXAMPLE

Suppose we now observe BMeter = 0.

P(BMeter = 0 | sample) = ? It is 0.03 for particles with Broken = false and 1 for particles with Broken = true.

[Diagram: battery DBN with particles]

Page 57: Temporal Probabilistic Models

BATTERY EXAMPLE

Compute weights (drawn as particle size): 0.03 for working-sensor particles, 1 for broken-sensor particles.

[Diagram: battery DBN with weighted particles]

Page 58: Temporal Probabilistic Models

BATTERY EXAMPLE

Weighted resampling according to P(BMeter = 0 | sample).

[Diagram: battery DBN with resampled particles]

Page 59: Temporal Probabilistic Models

BATTERY EXAMPLE

Sampling step.

[Diagram: battery DBN with particles propagated through the transition model]

Page 60: Temporal Probabilistic Models

BATTERY EXAMPLE

Now observe BMeter_t = 5.

[Diagram: battery DBN with particles]

Page 61: Temporal Probabilistic Models

BATTERY EXAMPLE

Compute weights: ~1 for working-sensor particles, 0 for broken-sensor particles (a broken sensor always reads 0, so it cannot produce a reading of 5).

[Diagram: battery DBN with weighted particles]

Page 62: Temporal Probabilistic Models

BATTERY EXAMPLE

Weighted resample; zero-weight particles are never selected.

[Diagram: battery DBN with resampled particles]

Page 63: Temporal Probabilistic Models

APPLICATIONS OF PARTICLE FILTERING IN ROBOTICS

Simultaneous Localization and Mapping (SLAM)
- Observations: laser rangefinder
- State variables: position, walls

Page 64: Temporal Probabilistic Models

SIMULTANEOUS LOCALIZATION AND MAPPING (SLAM)

Mobile robots:
- Odometry: locally accurate, but drifts significantly over time
- Vision/ladar/sonar: inaccurate locally, but gives a global reference frame
- Combine the two
  - State: (robot pose, map)
  - Observations: (sensor input)

Page 65: Temporal Probabilistic Models

COUPLE OF PLUGS

- CSCI B553
- CSCI B659: Principles of Intelligent Robot Motion (http://cs.indiana.edu/classes/b659-hauserk)
- CSCI B657: Computer Vision (David Crandall/Chen Yu)

Page 66: Temporal Probabilistic Models

NEXT TIME

- Learning distributions from data
- Read R&N 20.1-3

Page 67: Temporal Probabilistic Models

MLE: VITERBI ALGORITHM

Recursive computation of the maximum likelihood of the path to each x_t in Val(X_t):

m_t(X_t) = max_{x_{1:t-1}} P(x_1, …, x_{t-1}, X_t | o_{1:t})
         = α P(o_t | X_t) max_{x_{t-1}} P(X_t | x_{t-1}) m_{t-1}(x_{t-1})

Previous ML state: argmax_{x_{t-1}} P(X_t | x_{t-1}) m_{t-1}(x_{t-1})

Does this sound familiar?

Page 68: Temporal Probabilistic Models

MLE: VITERBI ALGORITHM

Do the "logarithm trick":

log m_t(X_t) = log α P(o_t | X_t) + max_{x_{t-1}} [log P(X_t | x_{t-1}) + log m_{t-1}(x_{t-1})]

View:
- log α P(o_t | X_t) as a reward
- log P(X_t | x_{t-1}) as a cost
- log m_t(X_t) as a value function

This is a Bellman equation.
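
In code, the trick turns the max-product update of the earlier Viterbi sketch into a max-plus (Bellman-style) update; a minimal sketch, assuming log-space tables log_T and log_O laid out like T and O above:

import numpy as np

def viterbi_step_log(log_m, obs, log_T, log_O):
    # scores[i, j] = log P(X_t = j | x_{t-1} = i) + log m_{t-1}(i): cost + value
    scores = log_T + log_m[:, None]
    return log_O[:, obs] + scores.max(axis=0)   # reward + best predecessor value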