TEMPORAL PROBABILISTIC MODELS PT 2


TRANSCRIPT

Page 1: Temporal Probabilistic Models  Pt  2

TEMPORAL PROBABILISTIC MODELS PT 2

Page 2: Temporal Probabilistic Models  Pt  2

AGENDA Kalman filtering Dynamic Bayesian Networks Particle filtering

Page 3: Temporal Probabilistic Models  Pt  2

KALMAN FILTERING In a nutshell

Efficient filtering in continuous state spaces

Gaussian transition and observation models

Ubiquitous for tracking with noisy sensors, e.g. radar, GPS, cameras

Page 4: Temporal Probabilistic Models  Pt  2

HIDDEN MARKOV MODEL FOR ROBOT LOCALIZATION Use observations + transition dynamics to get a better idea of where the robot is at time t

[Diagram: X0 → X1 → X2 → X3 (hidden state variables), with observations z1, z2, z3 (observed variables)]

Predict – observe – predict – observe…

Page 5: Temporal Probabilistic Models  Pt  2

HIDDEN MARKOV MODEL FOR ROBOT LOCALIZATION Use observations + transition dynamics to get a better idea of where the robot is at time t

Maintain a belief state b_t over time: b_t(x) = P(X_t = x | z_{1:t})

[Diagram: X0 → X1 → X2 → X3 (hidden state variables), with observations z1, z2, z3 (observed variables)]

Predict – observe – predict – observe…

Page 6: Temporal Probabilistic Models  Pt  2

BAYESIAN FILTERING WITH BELIEF STATES Compute b_t, given z_t and the prior belief b_{t-1}. Recursive filtering equation

Page 7: Temporal Probabilistic Models  Pt  2

BAYESIAN FILTERING WITH BELIEF STATES Compute b_t, given z_t and the prior belief b_{t-1}. Recursive filtering equation: predict P(X_t | z_{1:t-1}) using the dynamics alone, then update via the observation z_t.
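The equation itself appears only as an image in the original slides; the standard recursive filtering equation being described here is:

```latex
b_t(x) \;=\; \frac{1}{Z}\, P(z_t \mid X_t = x) \sum_{x'} P(X_t = x \mid X_{t-1} = x')\, b_{t-1}(x')
```

where Z is a normalization constant making the belief sum to 1.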

Page 8: Temporal Probabilistic Models  Pt  2

IN CONTINUOUS STATE SPACES… Compute b_t, given z_t and the prior belief b_{t-1}. Continuous filtering equation

Page 9: Temporal Probabilistic Models  Pt  2

GENERAL BAYESIAN FILTERING IN CONTINUOUS STATE SPACES Compute b_t, given z_t and the prior belief b_{t-1}. Continuous filtering equation

How to evaluate this integral? How to calculate Z? How do we even represent a belief state?
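Again the equation is an image in the source; the continuous analogue simply replaces the sum with an integral:

```latex
b_t(x) \;=\; \frac{1}{Z}\, P(z_t \mid X_t = x) \int P(X_t = x \mid X_{t-1} = x')\, b_{t-1}(x')\, dx'
```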

Page 10: Temporal Probabilistic Models  Pt  2

KEY REPRESENTATIONAL DECISIONS Pick a method for representing distributions: discrete state spaces use tables; continuous state spaces use either fixed parameterized classes or particle-based techniques. Devise methods to perform the key calculations (marginalization, conditioning) on that representation. Exact or approximate?

Page 11: Temporal Probabilistic Models  Pt  2

GAUSSIAN DISTRIBUTION Mean μ, standard deviation σ. The distribution is denoted N(μ, σ). If X ~ N(μ, σ), then X has the density shown below, with a normalization factor.
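The density formula is an image in the source slides; the standard univariate Gaussian pdf is:

```latex
p(x) \;=\; \frac{1}{\sigma\sqrt{2\pi}}\, \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)
```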

Page 12: Temporal Probabilistic Models  Pt  2

LINEAR GAUSSIAN TRANSITION MODEL FOR MOVING 1D POINT Consider position and velocity x_t, v_t and a time step h. Without noise:

x_{t+1} = x_t + h v_t
v_{t+1} = v_t

With Gaussian noise of std σ_1:

P(x_{t+1} | x_t) ∝ exp(-(x_{t+1} - (x_t + h v_t))² / (2σ_1²))

i.e. X_{t+1} ~ N(x_t + h v_t, σ_1)
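As a minimal sketch of this transition model in code (the step size h and noise level s1 below are arbitrary illustrative choices, not values from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

def transition(x, v, h=0.1, s1=0.05):
    """Noisy linear dynamics: x' ~ N(x + h*v, s1), velocity unchanged."""
    x_next = x + h * v + rng.normal(0.0, s1)
    return x_next, v
```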

Page 13: Temporal Probabilistic Models  Pt  2

LINEAR GAUSSIAN TRANSITION MODEL If the prior on position is Gaussian, then the predicted distribution is also Gaussian: a prior N(μ, σ) is shifted by vh and widened by the dynamics noise σ_1, giving N(μ + vh, √(σ² + σ_1²)) (the variances add, not the standard deviations).

Page 14: Temporal Probabilistic Models  Pt  2

LINEAR GAUSSIAN OBSERVATION MODEL Position observation z_t with Gaussian noise of std σ_2:

z_t ~ N(x_t, σ_2)

Page 15: Temporal Probabilistic Models  Pt  2

LINEAR GAUSSIAN OBSERVATION MODEL If the prior on position is Gaussian, then the posterior is also Gaussian:

μ ← (σ² z + σ_2² μ) / (σ² + σ_2²)
σ² ← σ² σ_2² / (σ² + σ_2²)

[Figure: the position prior, the observation probability, and the resulting posterior probability]
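These update formulas follow from Bayes' rule by multiplying the two Gaussians and completing the square; a compact version of that step:

```latex
P(x \mid z) \;\propto\; \exp\!\left(-\frac{(z-x)^2}{2\sigma_2^2}\right)\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)
\;\propto\; \exp\!\left(-\frac{(x-\mu')^2}{2\sigma'^2}\right),
\qquad \mu' = \frac{\sigma^2 z + \sigma_2^2 \mu}{\sigma^2 + \sigma_2^2},
\quad \sigma'^2 = \frac{\sigma^2 \sigma_2^2}{\sigma^2 + \sigma_2^2}
```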

Page 16: Temporal Probabilistic Models  Pt  2

MULTIVARIATE GAUSSIANS The multivariate analog in N-D space has a mean vector μ and a covariance matrix Σ, written X ~ N(μ, Σ), with the normalization factor shown below.
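The density formula is an image in the source; the standard multivariate Gaussian pdf is:

```latex
p(x) \;=\; \frac{1}{(2\pi)^{N/2}\,|\Sigma|^{1/2}}\, \exp\!\left(-\tfrac{1}{2}(x-\mu)^{\top}\Sigma^{-1}(x-\mu)\right)
```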

Page 17: Temporal Probabilistic Models  Pt  2

MULTIVARIATE LINEAR GAUSSIAN PROCESS A linear transformation plus multivariate Gaussian noise:

y = A x + ε, ε ~ N(μ, Σ)

If the prior state distribution is Gaussian, then the posterior state distribution is Gaussian. If we observe one component of a Gaussian, then the posterior over the remaining components is also Gaussian.

Page 18: Temporal Probabilistic Models  Pt  2

MULTIVARIATE COMPUTATIONS Linear transformations of Gaussians: if x ~ N(μ, Σ) and y = A x + b, then y ~ N(Aμ + b, AΣAᵀ).

Consequence: if x ~ N(μ_x, Σ_x) and y ~ N(μ_y, Σ_y) are independent, and z = x + y, then z ~ N(μ_x + μ_y, Σ_x + Σ_y).

Conditional of a Gaussian: if [x1; x2] ~ N([μ1; μ2], [Σ11 Σ12; Σ21 Σ22]), then on observing x2 = z we have

x1 | x2 = z ~ N(μ1 + Σ12 Σ22⁻¹ (z - μ2), Σ11 - Σ12 Σ22⁻¹ Σ21)
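A minimal numeric sketch of the conditioning formula (the numbers are arbitrary, chosen only for illustration):

```python
import numpy as np

# Joint Gaussian over [x1, x2] (illustrative values)
mu = np.array([1.0, 2.0])
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])

z = 2.5  # observed value of x2

# Condition x1 on x2 = z using the formulas above
mu1, mu2 = mu
S11, S12, S22 = Sigma[0, 0], Sigma[0, 1], Sigma[1, 1]
cond_mean = mu1 + S12 / S22 * (z - mu2)  # mu1 + S12 S22^-1 (z - mu2)
cond_var = S11 - S12 / S22 * S12         # S11 - S12 S22^-1 S21

print(cond_mean, cond_var)  # 1.4, 1.36
```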

Page 19: Temporal Probabilistic Models  Pt  2

KALMAN FILTER ASSUMPTIONS

x_t ~ N(μ_x, Σ_x)
x_{t+1} = F x_t + g + v, with dynamics noise v ~ N(0, Σ_v)
z_{t+1} = H x_{t+1} + w, with observation noise w ~ N(0, Σ_w)

Page 20: Temporal Probabilistic Models  Pt  2

TWO STEPS Maintain μ_t, Σ_t, the parameters of the Gaussian distribution over state x_t.

Predict: compute the distribution of x_{t+1} using the dynamics model alone.

Update (observe z_{t+1}): compute P(x_{t+1} | z_{t+1}) with Bayes' rule.

Page 21: Temporal Probabilistic Models  Pt  2

TWO STEPS Maintain μ_t, Σ_t, the parameters of the Gaussian distribution over state x_t.

Predict: compute the distribution of x_{t+1} using the dynamics model alone:

x_{t+1} ~ N(F μ_t + g, F Σ_t Fᵀ + Σ_v)

Let this be N(μ', Σ').

Update (observe z_{t+1}): compute P(x_{t+1} | z_{t+1}) with Bayes' rule.

Page 22: Temporal Probabilistic Models  Pt  2

TWO STEPS Maintain μ_t, Σ_t, the parameters of the Gaussian distribution over state x_t.

Predict: compute the distribution of x_{t+1} using the dynamics model alone:

x_{t+1} ~ N(F μ_t + g, F Σ_t Fᵀ + Σ_v)

Let this be N(μ', Σ').

Update (observe z_{t+1}): compute P(x_{t+1} | z_{t+1}) with Bayes' rule. The parameters μ_{t+1} and Σ_{t+1} of the final distribution are derived using the conditional distribution formulas.

Page 23: Temporal Probabilistic Models  Pt  2

DERIVING THE UPDATE RULE

(1) Write the joint over (x_t, z_t), with unknowns a, B, C:
  (x_t, z_t) ~ N( (μ', a), [Σ' B; Bᵀ C] )

(2) Assumption: z_t | x_t ~ N(H x_t, Σ_w)

(3) Assumption: x_t ~ N(μ', Σ')

(4) Conditioning (1): z_t | x_t ~ N(a + Bᵀ Σ'⁻¹ (x_t - μ'), C - Bᵀ Σ'⁻¹ B)

(5) Set mean of (4) equal to (2): a + Bᵀ Σ'⁻¹ (x_t - μ') = H x_t => a = H μ', Bᵀ = H Σ'

(6) Set covariance of (4) equal to (2): C - Bᵀ Σ'⁻¹ B = Σ_w => C = H Σ' Hᵀ + Σ_w

(7) Conditioning (1) on z_t: x_t | z_t ~ N(μ' + B C⁻¹ (z_t - a), Σ' - B C⁻¹ Bᵀ)

(8, 9) Kalman filter:
  μ_t = μ' + Σ' Hᵀ C⁻¹ (z_t - H μ')
  Σ_t = Σ' - Σ' Hᵀ C⁻¹ H Σ'

Page 24: Temporal Probabilistic Models  Pt  2

PUTTING IT TOGETHER Transition matrix F, covariance Σ_x; observation matrix H, covariance Σ_z.

μ_{t+1} = F μ_t + K_{t+1} (z_{t+1} - H F μ_t)
Σ_{t+1} = (I - K_{t+1} H)(F Σ_t Fᵀ + Σ_x)

where K_{t+1} = (F Σ_t Fᵀ + Σ_x) Hᵀ (H (F Σ_t Fᵀ + Σ_x) Hᵀ + Σ_z)⁻¹

Got that memorized?
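A compact sketch of these equations in code, under the slide's linear-Gaussian assumptions; the usage example reuses the 1D moving-point model from earlier, and h and the noise scales are illustrative choices:

```python
import numpy as np

def kalman_step(mu, Sigma, z, F, g, H, Sv, Sw):
    """One predict + update cycle for x' = F x + g + v, z = H x' + w."""
    # Predict: push the belief through the dynamics.
    mu_p = F @ mu + g
    Sigma_p = F @ Sigma @ F.T + Sv
    # Update: condition on the observation z (Kalman gain form).
    C = H @ Sigma_p @ H.T + Sw             # innovation covariance
    K = Sigma_p @ H.T @ np.linalg.inv(C)   # Kalman gain
    mu_new = mu_p + K @ (z - H @ mu_p)
    Sigma_new = (np.eye(len(mu)) - K @ H) @ Sigma_p
    return mu_new, Sigma_new

# Usage: 1D point with state (position, velocity), position observed.
h = 0.1
F = np.array([[1.0, h], [0.0, 1.0]])
g = np.zeros(2)
H = np.array([[1.0, 0.0]])
Sv = 0.01 * np.eye(2)   # dynamics noise (illustrative)
Sw = 0.1 * np.eye(1)    # observation noise (illustrative)

mu, Sigma = np.zeros(2), np.eye(2)
mu, Sigma = kalman_step(mu, Sigma, np.array([0.2]), F, g, H, Sv, Sw)
```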

Page 25: Temporal Probabilistic Models  Pt  2

PROPERTIES OF KALMAN FILTER The optimal Bayesian estimate for linear Gaussian transition/observation models. Needs estimates of the covariances, so model identification is necessary. Extensions to nonlinear transition/observation models work as long as they aren't too nonlinear: the Extended Kalman Filter and the Unscented Kalman Filter.

Page 26: Temporal Probabilistic Models  Pt  2

[Figure: two plots, “Tracking the velocity of a braking obstacle” and “Learning that the road is slick”. Annotations: velocity initially uninformed; braking begins; obstacle slows as more distance measurements arrive; the estimated max deceleration converges toward the actual max deceleration; stopping distance shown with a 95% confidence interval, from braking initiated to a gradual stop.]

Page 27: Temporal Probabilistic Models  Pt  2

NON-GAUSSIAN DISTRIBUTIONS Gaussian distributions are a “lump”

[Figure: a multimodal distribution versus the single-lump Kalman filter estimate]

Page 28: Temporal Probabilistic Models  Pt  2

NON-GAUSSIAN DISTRIBUTIONS Integrating continuous and discrete states

[Figure: a distribution splitting with a binary choice, “up” vs. “down”]

Page 29: Temporal Probabilistic Models  Pt  2

EXAMPLE: FAILURE DETECTION Consider a battery meter sensor: Battery = true level of battery, BMeter = sensor reading.

Transient failures: send garbage at time t. Persistent failures: send garbage forever.

Page 30: Temporal Probabilistic Models  Pt  2

EXAMPLE: FAILURE DETECTION Consider a battery meter sensor: Battery = true level of battery, BMeter = sensor reading.

Transient failures: send garbage at time t, e.g. 5555500555…
Persistent failures: sensor is broken, e.g. 5555500000…

Page 31: Temporal Probabilistic Models  Pt  2

DYNAMIC BAYESIAN NETWORK

[DBN: Battery_{t-1} → Battery_t → BMeter_t]

BMeter_t ~ N(Battery_t, σ)

(Think of this structure “unrolled” forever…)

Page 32: Temporal Probabilistic Models  Pt  2

DYNAMIC BAYESIAN NETWORK

[DBN: Battery_{t-1} → Battery_t → BMeter_t]

BMeter_t ~ N(Battery_t, σ)

Transient failure model: P(BMeter_t = 0 | Battery_t = 5) = 0.03
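A minimal sketch of this transient-failure observation model as a sampler; the noise scale s and the exact mixture form are assumptions made for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_bmeter(battery, s=0.1, p_fail=0.03):
    """Transient failure: with probability p_fail the meter reads 0
    (garbage); otherwise it reads N(battery, s)."""
    if rng.random() < p_fail:
        return 0.0
    return rng.normal(battery, s)
```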

Page 33: Temporal Probabilistic Models  Pt  2

RESULTS ON TRANSIENT FAILURE

[Figure: E(Battery_t) over time, with and without the transient failure model; the transient failure occurs while the meter reads 55555005555…]

Page 34: Temporal Probabilistic Models  Pt  2

RESULTS ON PERSISTENT FAILURE

[Figure: E(Battery_t) over time with the transient failure model only; the persistent failure occurs while the meter reads 5555500000…]

Page 35: Temporal Probabilistic Models  Pt  2

PERSISTENT FAILURE MODEL

[DBN: Battery_{t-1} → Battery_t → BMeter_t, plus Broken_{t-1} → Broken_t → BMeter_t]

BMeter_t ~ N(Battery_t, σ)
P(BMeter_t = 0 | Battery_t = 5) = 0.03
P(BMeter_t = 0 | Broken_t) = 1

Example of a Dynamic Bayesian Network (DBN)

Page 36: Temporal Probabilistic Models  Pt  2

RESULTS ON PERSISTENT FAILURE

[Figure: E(Battery_t) over time while the meter reads 5555500000…; the persistent failure occurs, and the estimates with the transient model and with the persistent failure model diverge.]

Page 37: Temporal Probabilistic Models  Pt  2

HOW TO PERFORM INFERENCE ON A DBN? Exact inference on the “unrolled” BN: Variable Elimination, eliminating old time steps. But after a few time steps, all variables in the state space become dependent! The sparsity structure is lost.

Approximate inference: Particle Filtering

Page 38: Temporal Probabilistic Models  Pt  2

PARTICLE FILTERING (AKA SEQUENTIAL MONTE CARLO)

Represent distributions as a set of particles

Applicable to non-Gaussian, high-dimensional distributions

Convenient implementations

Widely used in vision and robotics

Page 39: Temporal Probabilistic Models  Pt  2

PARTICLE REPRESENTATION

Bel(x_t) = {(w_k, x_k)}, where the w_k are weights and the x_k are state hypotheses. The weights sum to 1. The particle set approximates the underlying distribution.

Page 40: Temporal Probabilistic Models  Pt  2

PARTICLE FILTERING Represent the distribution at time t as a set of N “particles” S_t^1, …, S_t^N.

Repeat for t = 0, 1, 2, …:

Sample S[i] from P(X_{t+1} | X_t = S_t^i) for all i

Compute the weight w[i] = P(e | X_{t+1} = S[i]) for all i, where e is the current evidence

Sample S_{t+1}^i from S[·] according to the weights w[·] (the weighted resampling step)
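A minimal sketch of this loop in code; the transition_sample and obs_likelihood callbacks stand in for the model's P(X_{t+1} | X_t) and P(e | X_{t+1}) and are placeholders, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, evidence, transition_sample, obs_likelihood):
    """One cycle: propagate every particle, weight by the evidence,
    then resample N particles in proportion to the weights."""
    N = len(particles)
    proposed = [transition_sample(x) for x in particles]           # sample step
    w = np.array([obs_likelihood(evidence, x) for x in proposed])  # weights
    w = w / w.sum()
    idx = rng.choice(N, size=N, p=w)                               # resampling
    return [proposed[i] for i in idx]
```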

Page 41: Temporal Probabilistic Models  Pt  2

BATTERY EXAMPLE

[DBN: Battery_{t-1} → Battery_t → BMeter_t; Broken_{t-1} → Broken_t → BMeter_t]

Sampling step

Page 42: Temporal Probabilistic Models  Pt  2

BATTERY EXAMPLE

[DBN: Battery_{t-1} → Battery_t → BMeter_t; Broken_{t-1} → Broken_t → BMeter_t]

Suppose we now observe BMeter = 0.

P(BMeter = 0 | sample) = ? It is 0.03 for particles with Broken = false and 1 for particles with Broken = true.

Page 43: Temporal Probabilistic Models  Pt  2

BATTERY EXAMPLE

[DBN: Battery_{t-1} → Battery_t → BMeter_t; Broken_{t-1} → Broken_t → BMeter_t]

Compute weights (drawn as particle size): P(BMeter = 0 | sample) = 0.03 or 1.

Page 44: Temporal Probabilistic Models  Pt  2

BATTERY EXAMPLE

[DBN: Battery_{t-1} → Battery_t → BMeter_t; Broken_{t-1} → Broken_t → BMeter_t]

Weighted resampling, with weights P(BMeter = 0 | sample).

Page 45: Temporal Probabilistic Models  Pt  2

BATTERY EXAMPLE

[DBN: Battery_{t-1} → Battery_t → BMeter_t; Broken_{t-1} → Broken_t → BMeter_t]

Sampling step

Page 46: Temporal Probabilistic Models  Pt  2

BATTERY EXAMPLE

[DBN: Battery_{t-1} → Battery_t → BMeter_t; Broken_{t-1} → Broken_t → BMeter_t]

Now observe BMeter_t = 5

Page 47: Temporal Probabilistic Models  Pt  2

BATTERY EXAMPLE

[DBN: Battery_{t-1} → Battery_t → BMeter_t; Broken_{t-1} → Broken_t → BMeter_t]

Compute weights: P(BMeter = 5 | sample) is near 1 for working-sensor particles and 0 for broken ones.

Page 48: Temporal Probabilistic Models  Pt  2

BATTERY EXAMPLE

[DBN: Battery_{t-1} → Battery_t → BMeter_t; Broken_{t-1} → Broken_t → BMeter_t]

Weighted resample

Page 49: Temporal Probabilistic Models  Pt  2

APPLICATIONS OF PARTICLE FILTERING IN ROBOTICS Simultaneous Localization and Mapping (SLAM). Observations: laser rangefinder. State variables: position, walls.

Page 50: Temporal Probabilistic Models  Pt  2

SIMULTANEOUS LOCALIZATION AND MAPPING (SLAM)

Mobile robots:
Odometry: locally accurate, but drifts significantly over time
Vision/ladar/sonar: inaccurate locally, but tied to a global reference frame

Combine the two. State: (robot pose, map). Observations: (sensor input).

Page 51: Temporal Probabilistic Models  Pt  2

GENERAL PROBLEM

x_t ~ Bel(x_t) (arbitrary p.d.f.)
x_{t+1} = f(x_t, u, ε_p), with process noise ε_p ~ an arbitrary p.d.f.
z_{t+1} = g(x_{t+1}, ε_o), with observation noise ε_o ~ an arbitrary p.d.f.

Page 52: Temporal Probabilistic Models  Pt  2

SAMPLING IMPORTANCE RESAMPLING (SIR) VARIANT

Predict

Update

Resample (see the resampling sketch below)
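The step diagrams from this slide aren't in the transcript. As a sketch of the resample step, here is the low-variance (systematic) resampling scheme commonly used in SIR implementations; it is a standard technique, not something specified by the slides:

```python
import numpy as np

def systematic_resample(particles, weights, rng=np.random.default_rng(0)):
    """Draw N particles using one random offset and N evenly spaced
    positions on the cumulative weight distribution (weights assumed
    normalized); this has lower variance than independent draws."""
    N = len(particles)
    positions = (rng.random() + np.arange(N)) / N
    idx = np.searchsorted(np.cumsum(weights), positions)
    return [particles[i] for i in idx]
```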

Page 53: Temporal Probabilistic Models  Pt  2

ADVANCED FILTERING TOPICS Mixing exact and approximate representations (e.g., mixture models). Multiple hypothesis tracking (the assignment problem). Model calibration. Scaling up (e.g., 3D SLAM, huge maps).

Page 54: Temporal Probabilistic Models  Pt  2

NEXT TIME Putting it together: intelligent agents. Read R&N Ch. 2.