CSE 473: Artificial Intelligence · Hidden Markov Models
Daniel Weld, University of Washington
[Many of these slides were created by Dan Klein and Pieter Abbeel for CS188 Intro to AI at UC Berkeley. All CS188 materials are available at http://ai.berkeley.edu.]


Page 1

CSE 473: Artificial Intelligence
Hidden Markov Models

Daniel Weld
University of Washington

Hidden Markov Models

Page 2

Hidden Markov Models

§ Defines a joint probability distribution:

  P(X1, E1, ..., XN, EN) = P(X1) P(E1|X1) ∏_{t=2..N} P(Xt|Xt-1) P(Et|Xt)

[Figure: Markov chain X1 → X2 → ... → XN, with an emission Et below each Xt]

Hidden Markov Model: Example

§ An HMM is defined by:
  § Initial distribution: P(R1 = true) = 0.6
  § Transitions: P(Rt | Rt-1)

      Rt-1    P(Rt = true | Rt-1)
      true    0.7
      false   0.1

  § Emissions: P(Ut | Rt)

      Rt      P(Ut = true | Rt)
      true    0.9
      false   0.2
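The tables above can be written down directly as data. A minimal sketch in Python; the dict layout and variable names are my own, not the lecture's:

```python
# Rain/umbrella HMM from the example slide, as plain dicts.
# States: True = rain, False = no rain. Observations: True = umbrella.
prior = {True: 0.6, False: 0.4}                   # initial distribution P(R1)
transition = {True:  {True: 0.7, False: 0.3},     # P(Rt | Rt-1 = true)
              False: {True: 0.1, False: 0.9}}     # P(Rt | Rt-1 = false)
emission = {True:  {True: 0.9, False: 0.1},       # P(Ut | Rt = true)
            False: {True: 0.2, False: 0.8}}       # P(Ut | Rt = false)

# Sanity check: every conditional distribution sums to 1.
for dist in (prior, *transition.values(), *emission.values()):
    assert abs(sum(dist.values()) - 1.0) < 1e-9
```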

Page 3

Conditional Independence

HMMs have two important independence properties:
§ Future independent of past given the present

[Figure: chain X1 → X2 → X3 → X4 with emissions E1 ... E4; question marks highlight the claimed independences]

Conditional Independence

HMMs have two important independence properties:
§ Future independent of past given the present
§ Current observation independent of all else given current state

[Figure: chain X1 → X2 → X3 → X4 with emissions E1 ... E4]

Page 4

Conditional Independence

§ HMMs have two important independence properties:
  § Markov hidden process: the future depends on the past only via the present
  § Current observation independent of all else given current state

§ Quiz: does this mean that observations are independent given no evidence?
  § [No, they are correlated by the hidden state]

[Figure: chain X1 → X2 → X3 → X4 with emissions E1 ... E4]
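The quiz answer can be checked numerically. A small sketch using the numbers from the earlier rain/umbrella example slide (P(R1) = 0.6, transitions 0.7/0.1, emissions 0.9/0.2); the variable names are illustrative:

```python
# Show that U1 and U2 are NOT marginally independent: the hidden rain
# state correlates them, even though each Ut depends only on its Rt.
p_r1 = {True: 0.6, False: 0.4}                                # P(R1)
p_r2_given_r1 = {True:  {True: 0.7, False: 0.3},              # P(R2 | R1)
                 False: {True: 0.1, False: 0.9}}
p_u_given_r = {True:  {True: 0.9, False: 0.1},                # P(U | R)
               False: {True: 0.2, False: 0.8}}

# Marginals P(U1 = +u), P(U2 = +u), and the joint P(U1 = +u, U2 = +u).
p_u1 = sum(p_r1[r1] * p_u_given_r[r1][True] for r1 in (True, False))
p_u2 = sum(p_r1[r1] * p_r2_given_r1[r1][r2] * p_u_given_r[r2][True]
           for r1 in (True, False) for r2 in (True, False))
p_u1_u2 = sum(p_r1[r1] * p_u_given_r[r1][True]
              * p_r2_given_r1[r1][r2] * p_u_given_r[r2][True]
              for r1 in (True, False) for r2 in (True, False))

print(p_u1 * p_u2, p_u1_u2)   # the joint differs from the product of marginals
assert abs(p_u1_u2 - p_u1 * p_u2) > 1e-3
```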

Ghostbusters HMM

§ P(X1) = uniform
§ P(X'|X) = ghosts usually move clockwise, but sometimes move in a random direction or stay put
§ P(E|X) = same sensor model as before: red means probably close, green means likely far away

P(X1):
  1/9  1/9  1/9
  1/9  1/9  1/9
  1/9  1/9  1/9

P(X' | X = <1,2>):
  1/6  1/6  0
  1/6  1/2  0
  0    0    0

P(E|X), e.g. at distance 3:

  P(red | 3)   P(orange | 3)   P(yellow | 3)   P(green | 3)
  0.05         0.15            0.5             0.3

Etc. (must specify for other distances)

[Figure: chain X1 → X2 → X3 → X4 with emissions E1 ... E4]

Page 5

HMM Computations

§ Given:
  § parameters
  § evidence E1:n = e1:n
§ Inference problems include:
  § Filtering: find P(Xt | e1:t) for some t
  § Most probable explanation: for some t, find x*1:t = argmax_{x1:t} P(x1:t | e1:t)
  § Smoothing: find P(Xt | e1:n) for some t < n

Filtering (aka Monitoring)

§ The task of tracking the agent's belief state, B(X), over time
§ B(X) is a distribution over world states; it represents the agent's knowledge
§ We start with B(X) in an initial setting, usually uniform
§ As time passes, or we get observations, we update B(X)

§ Many algorithms for this:
  § Exact probabilistic inference
  § Particle filter approximation
  § Kalman filter (a method for handling continuous, real-valued random variables)
    § invented in the '60s for the Apollo Program: real-valued state, Gaussian noise

Page 6

HMM Examples

§ Robot tracking:
  § States (X) are positions on a map (continuous)
  § Observations (E) are range readings (continuous)

[Figure: chain X1 → X2 → X3 → X4 with emissions E1 ... E4]

Example: Robot Localization

T = 1
Sensor model: never more than 1 mistake
Motion model: may not execute action with small prob.

[Figure: belief over corridor cells; gray scale encodes probability from 0 to 1]

Example from Michael Pfeiffer

Page 7

Example: Robot Localization

t = 1

[Figure: belief over corridor cells; gray scale encodes probability from 0 to 1]

Example: Robot Localization

t = 2

[Figure: belief over corridor cells; gray scale encodes probability from 0 to 1]

Page 8

Example: Robot Localization

t = 3

[Figure: belief over corridor cells; gray scale encodes probability from 0 to 1]

Example: Robot Localization

t = 4

[Figure: belief over corridor cells; gray scale encodes probability from 0 to 1]

Page 9

Example: Robot Localization

t = 5

[Figure: belief over corridor cells; gray scale encodes probability from 0 to 1]

Other Real HMM Examples

§ Speech recognition HMMs:
  § States are specific positions in specific words (so, tens of thousands)
  § Observations are acoustic signals (continuous valued)

[Figure: chain X1 → X2 → X3 → X4 with emissions E1 ... E4]

Page 10

Other Real HMM Examples

§ Machine translation HMMs:
  § States are translation options
  § Observations are words (tens of thousands)

[Figure: chain X1 → X2 → X3 → X4 with emissions E1 ... E4]

Filtering (aka Monitoring)

§ Filtering, or monitoring, is the task of tracking the distribution B(X) (called "the belief state") over time

§ We start with B0(X) in an initial setting, usually uniform

§ We update Bt(X) to compute Bt+1(X):
  1. As time passes, using a prob. model of how ghosts move
  2. As we get observations, using a prob. model of how noisy sensors work

Page 11

Filtering: Base Cases

[Figure: two base cases: X1 with emission E1 ("Observation"), and X1 → X2 ("Passage of Time")]

Forward Algorithm

§ t = 0
§ B(Xt) = initial distribution
§ Repeat forever:
  § B'(Xt+1) = simulate passage of time from B(Xt)
  § Observe et+1
  § B(Xt+1) = update B'(Xt+1) based on the probability of et+1
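A minimal sketch of this loop in Python; the helper names (`elapse_time`, `observe`) and the dict-based model are assumptions for illustration, not the lecture's code:

```python
def elapse_time(belief, transition):
    """Passage of time: B'(x') = sum over x of P(x'|x) B(x)."""
    return {x1: sum(transition[x0][x1] * belief[x0] for x0 in belief)
            for x1 in belief}

def observe(belief_prime, emission, evidence):
    """Observation: B(x) proportional to P(e|x) B'(x), then renormalize."""
    unnorm = {x: emission[x][evidence] * belief_prime[x] for x in belief_prime}
    z = sum(unnorm.values())
    return {x: p / z for x, p in unnorm.items()}

def forward(prior, transition, emission, evidence_seq):
    """Run the repeat-forever loop over a finite evidence sequence."""
    belief = dict(prior)
    for e in evidence_seq:
        belief = observe(elapse_time(belief, transition), emission, e)
    return belief
```

With the umbrella model from the Weather HMM example later in the deck (transitions 0.7/0.3, emissions 0.9/0.2) and a uniform prior, two umbrella observations give B(+r) ≈ 0.883.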

Page 12

Passage of Time

§ Assume we have current belief P(X | evidence to date):

  B(Xt) = P(Xt | e1:t)

§ Then, after one time step passes:

  P(Xt+1 | e1:t) = Σ_{xt} P(Xt+1, xt | e1:t)
                 = Σ_{xt} P(Xt+1 | xt, e1:t) P(xt | e1:t)
                 = Σ_{xt} P(Xt+1 | xt) P(xt | e1:t)

§ Or compactly:

  B'(Xt+1) = Σ_{xt} P(X' | xt) B(xt)

§ Basic idea: beliefs get "pushed" through the transitions
§ With the "B" notation, we have to be careful about what time step t the belief is about, and what evidence it includes

[Figure: X1 → X2]
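Numerically, one elapse-time step can be written out directly. A sketch using the transition numbers from the Weather HMM example later in the deck (0.7 stay, 0.3 switch; an assumption for illustration):

```python
# One passage-of-time update: B'(x') = sum over x of P(x'|x) B(x).
belief = {True: 0.818, False: 0.182}              # current B(Xt)
transition = {True:  {True: 0.7, False: 0.3},     # P(Xt+1 | xt)
              False: {True: 0.3, False: 0.7}}

b_prime = {x1: sum(transition[x0][x1] * belief[x0] for x0 in belief)
           for x1 in belief}
print(round(b_prime[True], 3))   # → 0.627: time passing pushes the belief toward uniform
```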

Example: Passage of Time

§ As time passes, uncertainty "accumulates"

[Figure: belief grids at T = 1, T = 2, T = 5, increasingly diffuse]

(Transition model: ghosts usually go clockwise)

Page 13

Observation

§ Assume we have current belief P(X | previous evidence):

  B'(Xt+1) = P(Xt+1 | e1:t)

§ Then, after evidence comes in:

  P(Xt+1 | e1:t+1) = P(Xt+1, et+1 | e1:t) / P(et+1 | e1:t)
                   ∝_{Xt+1} P(Xt+1, et+1 | e1:t)
                   = P(et+1 | e1:t, Xt+1) P(Xt+1 | e1:t)
                   = P(et+1 | Xt+1) P(Xt+1 | e1:t)

§ Or, compactly:

  B(Xt+1) ∝_{Xt+1} P(et+1 | Xt+1) B'(Xt+1)

§ Basic idea: beliefs are "reweighted" by the likelihood of the evidence
§ Unlike passage of time, we have to renormalize

[Figure: X1 with emission E1]
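The reweight-and-renormalize step, written out with the Weather HMM numbers from the next slide (emissions 0.9/0.2; an assumption for illustration):

```python
# One observation update: B(x) proportional to P(e|x) B'(x), then renormalize.
belief_prime = {True: 0.627, False: 0.373}   # B'(Xt+1) = P(Xt+1 | e1:t)
likelihood = {True: 0.9, False: 0.2}         # P(et+1 | Xt+1), umbrella seen

unnorm = {x: likelihood[x] * belief_prime[x] for x in belief_prime}
z = sum(unnorm.values())                     # = P(et+1 | e1:t)
belief = {x: p / z for x, p in unnorm.items()}
print(round(belief[True], 3))                # → 0.883: evidence sharpens the belief
```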

Example: Observation

§ As we get observations, beliefs get reweighted, and uncertainty "decreases"

[Figure: belief grids before observation and after observation]

Page 14

Example: Weather HMM

  Rt    Rt+1   P(Rt+1 | Rt)
  +r    +r     0.7
  +r    -r     0.3
  -r    +r     0.3
  -r    -r     0.7

  Rt    Ut     P(Ut | Rt)
  +r    +u     0.9
  +r    -u     0.1
  -r    +u     0.2
  -r    -u     0.8

Rain0 → Rain1 → Rain2, with observations Umbrella1 = T, Umbrella2 = T

  B(Rain0):                 B(+r) = 0.5,    B(-r) = 0.5
  After time (B'(Rain1)):   B'(+r) = 0.5,   B'(-r) = 0.5
  After Umbrella1 = T:      B(+r) = 0.818,  B(-r) = 0.182
  After time (B'(Rain2)):   B'(+r) = 0.627, B'(-r) = 0.373
  After Umbrella2 = T:      B(+r) = 0.883,  B(-r) = 0.117
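The whole belief sequence on this slide can be reproduced in a few lines; `step` is an illustrative helper name, not lecture code:

```python
T = {True:  {True: 0.7, False: 0.3},
     False: {True: 0.3, False: 0.7}}   # P(Rt+1 | Rt)
E = {True: 0.9, False: 0.2}            # P(Ut = +u | Rt)

def step(belief):
    """One filtering step: elapse time, then observe umbrella = True."""
    bp = {r1: sum(T[r0][r1] * belief[r0] for r0 in belief) for r1 in belief}
    w = {r: E[r] * bp[r] for r in bp}          # reweight by the evidence
    z = sum(w.values())
    return bp, {r: v / z for r, v in w.items()}

b0 = {True: 0.5, False: 0.5}           # B(Rain0)
bp1, b1 = step(b0)                     # B'(Rain1), then B after Umbrella1 = T
bp2, b2 = step(b1)                     # B'(Rain2), then B after Umbrella2 = T
print(round(bp1[True], 3), round(b1[True], 3),
      round(bp2[True], 3), round(b2[True], 3))   # → 0.5 0.818 0.627 0.883
```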

Video of Demo: Pacman – Sonar (with beliefs)

Page 15

Summary: Online Belief Updates

Every time step, we start with the current P(X | evidence):

1. We update for time:

   P(xt | e1:t-1) = Σ_{xt-1} P(xt-1 | e1:t-1) · P(xt | xt-1)

2. We update for evidence:

   P(xt | e1:t) ∝ P(xt | e1:t-1) · P(et | xt)

The forward algorithm does both at once (and doesn't normalize).

Computational complexity? O(|X|² + |X|·|E|) time & O(|X| + |E|) space per update

[Figure: X1 → X2 with emission E2]

Particle Filtering

Page 16

Particle Filtering Overview

§ Approximation technique to solve the filtering problem
§ Represents the P distribution with samples
§ Filtering still operates in two steps:
  § Elapse time
  § Incorporate observations
    § (But this part has two sub-steps: weight & resample)

Particle Filtering

§ Sometimes |X| is too big to use exact inference
  § |X| may be too big to even store B(X)
  § E.g. X is continuous
§ Solution: approximate inference
  § Track samples of X, not the exact distribution of values
  § Samples are called particles
  § Time per step is linear in the number of samples
  § But: the number needed may be large
  § In memory: a list of particles, not states

§ A particle is just a new name for a sample

§ This is how robot localization works in practice
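The elapse / weight / resample steps above can be sketched on the binary rain model from the weather example; the helper names and the tiny state space are illustrative assumptions, and real localization would use grid positions instead:

```python
import random

random.seed(0)
T = {True: {True: 0.7, False: 0.3}, False: {True: 0.3, False: 0.7}}
E = {True: 0.9, False: 0.2}               # P(evidence = umbrella | state)

def sample(dist):
    """Draw one value from a {value: prob} distribution."""
    r, acc = random.random(), 0.0
    for x, p in dist.items():
        acc += p
        if r <= acc:
            return x
    return x                              # guard against float round-off

def particle_filter_step(particles, evidence):
    # 1. Elapse time: move each particle by sampling the transition model.
    moved = [sample(T[x]) for x in particles]
    # 2. Weight: score each particle by the likelihood of the evidence.
    weights = [E[x] if evidence else 1 - E[x] for x in moved]
    # 3. Resample: draw N new particles in proportion to the weights.
    return random.choices(moved, weights=weights, k=len(moved))

particles = [random.choice([True, False]) for _ in range(1000)]
particles = particle_filter_step(particles, evidence=True)
print(sum(particles) / len(particles))    # close to the exact filtered 0.818
```

With 1000 particles the estimated P(rain) lands near the exact forward-algorithm answer of 0.818; fewer particles give a noisier estimate.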

Page 17

Remember…

An HMM is defined by:
§ Initial distribution: P(X1)
§ Transitions: P(Xt | Xt-1)
§ Emissions: P(Et | Xt)

Here's a Single Particle

§ It represents a hypothetical state where the robot is in (1,2)

Page 18

Particles Approximate the Distribution

§ Our representation of P(X) is now a list of N particles (samples)
§ Generally, N << |X|

Particles:
  (3,3) (2,3) (3,3) (3,2) (3,3)
  (3,2) (1,2) (3,3) (3,3) (2,3)

P(x) Distribution:
  P(x = <3,3>) = 5/10 = 50%

Particle Filtering

A more compact view overlays the samples. The particle list above encodes →

  0.0  0.2  0.5
  0.1  0.0  0.2
  0.0  0.0  0.0

Page 19

Representation: Particles

§ Our representation of P(X) is now a list of N particles (samples)
  § Generally, N << |X|
  § Storing a map from X to counts would defeat the purpose
§ P(x) approximated by (number of particles with value x) / N
  § More particles, more accuracy
§ What is P((3,3))?

Particles:
  (3,3) (2,3) (3,3) (3,2) (3,3)
  (3,2) (1,2) (3,3) (3,3) (2,3)

  P((3,3)) = 5/10 = 50%

Representation: Particles

§ Our representation of P(X) is now a list of N particles (samples)
  § Generally, N << |X|
  § Storing a map from X to counts would defeat the purpose
§ P(x) approximated by (number of particles with value x) / N
  § More particles, more accuracy
§ What is P((2,2))?
  § In fact, many x may have P(x) = 0!

Particles:
  (3,3) (2,3) (3,3) (3,2) (3,3)
  (3,2) (1,2) (3,3) (3,3) (2,3)

  P((2,2)) = 0/10 = 0%
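Counting the slide's particle list directly gives both answers; a minimal sketch (`p_hat` is an illustrative name):

```python
from collections import Counter

# The ten particles from the slide; P(x) is estimated by counting.
particles = [(3,3), (2,3), (3,3), (3,2), (3,3),
             (3,2), (1,2), (3,3), (3,3), (2,3)]
counts = Counter(particles)               # map from value to particle count

def p_hat(x):
    """P(x) approximated by (number of particles with value x) / N."""
    return counts[x] / len(particles)

print(p_hat((3, 3)), p_hat((2, 2)))       # → 0.5 0.0
```

Note that any state with no particles, like (2,2), gets estimate exactly 0, which is the weakness the slide points out.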