Particle Filters
Outline
1. Introduction to particle filters
   1. Recursive Bayesian estimation
2. Bayesian importance sampling
   1. Sequential importance sampling (SIS)
   2. Sampling importance resampling (SIR)
3. Improvements to SIR
   1. On-line Markov chain Monte Carlo
4. Basic particle filter algorithm
5. Example for robot localization
6. Conclusions
• But what if the distribution in our problem is not Gaussian?
Motivation for particle filters
Key Idea of Particle Filters
• Idea: put more samples where we expect the solution to be
Motion Model Reminder
• The density of samples represents the expected probability of the robot's location
Global Localization of Robot with Sonar
http://www.cs.washington.edu/ai/Mobile_Robotics/mcl/animations/global-floor.gif
• This is the "lost robot" problem
Particles are used for probability density function approximation
Function Approximation
• Particle sets can be used to approximate functions
• The more particles fall into an interval, the higher the probability of that interval
• How to draw samples from a function/distribution?
Importance Sampling Principle
• weight w = f / g
• f is often called the target
• g is often called the proposal
• Pre-condition: wherever f(x) > 0, we need g(x) > 0 as well
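As a quick illustration of the w = f/g idea, the sketch below estimates the mean of a target density f by weighting samples drawn from a proposal g. The specific densities, f = N(2, 1) and g = N(0, 2), are made-up choices for this example, not values from the slides:

```python
import math
import random

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def importance_sampling_mean(n=100_000, seed=0):
    """Estimate E_f[x] for target f = N(2, 1) using proposal g = N(0, 2).
    Every sample x drawn from g gets the weight w = f(x) / g(x)."""
    rng = random.Random(seed)
    total_w = 0.0
    total_wx = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, 2.0)                                # sample from g
        w = normal_pdf(x, 2.0, 1.0) / normal_pdf(x, 0.0, 2.0)  # w = f / g
        total_w += w
        total_wx += w * x
    return total_wx / total_w    # self-normalized weighted mean

print(importance_sampling_mean())  # should come out close to the target mean, 2.0
```

Although every sample is drawn from g, the weighted average recovers a property of f; this is the same trick the particle filter uses when it samples from the motion model but weights by the observation model.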
Importance sampling: another example of calculating sample weights
• How do we formally calculate the f/g value?
Importance Sampling Formulas for f, g and f/g

Target distribution f:
  p(x | z_1, z_2, ..., z_n) = [ ∏_k p(z_k | x) ] p(x) / p(z_1, z_2, ..., z_n)

Sampling distribution g:
  p(x | z_l) = p(z_l | x) p(x) / p(z_l)

Importance weights w = f/g:
  w = p(x | z_1, z_2, ..., z_n) / p(x | z_l)
    = p(z_l) [ ∏_{k≠l} p(z_k | x) ] / p(z_1, z_2, ..., z_n)
History of Monte Carlo Idea and especially Particle Filters
• First attempts – simulations of growing polymers:
  – M. N. Rosenbluth and A. W. Rosenbluth, "Monte Carlo calculation of the average extension of molecular chains," Journal of Chemical Physics, vol. 23, no. 2, pp. 356–359, 1955.
• First application in signal processing – 1993:
  – N. J. Gordon, D. J. Salmond, and A. F. M. Smith, "Novel approach to nonlinear/non-Gaussian Bayesian state estimation," IEE Proceedings-F, vol. 140, no. 2, pp. 107–113, 1993.
• Books:
  – A. Doucet, N. de Freitas, and N. Gordon, Eds., Sequential Monte Carlo Methods in Practice, Springer, 2001.
  – B. Ristic, S. Arulampalam, N. Gordon, Beyond the Kalman Filter: Particle Filters for Tracking Applications, Artech House Publishers, 2004.
• Tutorials:
  – M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, "A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking," IEEE Transactions on Signal Processing, vol. 50, no. 2, pp. 174–188, 2002.
What is the problem that we want to solve?
•The problem is tracking the state of a system as it evolves over time
•Sequentially arriving (noisy or ambiguous) observations
•We want to know: Best possible estimate of the hidden variables
Solution: Sequential Update
• Storing and processing all incoming measurements is inconvenient and may be impossible
• Recursive filtering:
  1. Predict the next state pdf from the current estimate
  2. Update the prediction using sequentially arriving new measurements
• Optimal Bayesian solution: recursively calculate the exact posterior density
These lead to various particle filters.
Particle Filters
1. Sequential Monte Carlo methods for on-line learning within a Bayesian framework.
2. Also known as:
   1. Particle filters
   2. Sequential sampling-importance resampling (SIR)
   3. Bootstrap filters
   4. Condensation trackers
   5. Interacting particle approximations
   6. Survival of the fittest
Particle Filter characteristics
Approaches to Particle Filters
METAPHORS
Particle filters
• Sequential and Monte Carlo properties
• Represent belief by a set of samples, or particles:
  Bel(x_t) ~ S_t = { <x_t^i, w_t^i> : i = 1, ..., n }
• The w_t^i are nonnegative weights called importance factors
• The updating procedure is sequential importance sampling with re-sampling
Tracking in 1D: the blue trajectory is the target. The best of 10 particles is in red.
A short, more formal introduction to Particle Filters and Monte Carlo Localization
Proximity Sensor Model Reminder
Particle filtering ideas
• Recursive Bayesian filter by Monte Carlo sampling
• The idea: represent the posterior density by a set of random particles with associated weights
• Compute estimates based on these samples and weights

(Figure: particles in the sample space approximating the posterior density.)
Particle filtering ideas
1. Particle filters are based on recursive generation of random measures that approximate the distributions of the unknowns.
2. Random measures: particles and importance weights.
3. As new observations become available, the particles and the weights are propagated by exploiting Bayes' theorem.

(Figure: particles in the sample space approximating the posterior density.)
Mathematical tools needed for Particle Filters
• Recall the law of total probability and Bayes' rule:
  p(x_t) = ∫ p(x_t | x_{t-1}) p(x_{t-1}) dx_{t-1}
  p(x_t | z_t) = p(z_t | x_t) p(x_t) / p(z_t)
Recursive Bayesian estimation (I)
• Recursive filter:
  – System model: x_k = f_k(x_{k-1}, v_k), i.e. p(x_k | x_{k-1})
  – Measurement model: y_k = h_k(x_k, n_k), i.e. p(y_k | x_k)
  – Information available: D_k = (y_1, ..., y_k), plus the prior p(x_0)
Recursive Bayesian estimation (II)
• Seek: p(x_{k+i} | D_k)
  – i = 0: filtering
  – i > 0: prediction
  – i < 0: smoothing
• Prediction:
  p(x_k | D_{k-1}) = ∫ p(x_k, x_{k-1} | D_{k-1}) dx_{k-1}
  p(x_k | D_{k-1}) = ∫ p(x_k | x_{k-1}) p(x_{k-1} | D_{k-1}) dx_{k-1}
• since:
  p(x_k, x_{k-1} | D_{k-1}) = p(x_k | x_{k-1}, D_{k-1}) p(x_{k-1} | D_{k-1}) = p(x_k | x_{k-1}) p(x_{k-1} | D_{k-1})
Recursive Bayesian estimation (III)
• Update:
  p(x_k | D_k) = p(y_k | x_k) p(x_k | D_{k-1}) / p(y_k | D_{k-1})
• where:
  p(y_k | D_{k-1}) = ∫ p(y_k, x_k | D_{k-1}) dx_k = ∫ p(y_k | x_k) p(x_k | D_{k-1}) dx_k
• since:
  p(y_k, x_k | D_{k-1}) = p(y_k | x_k, D_{k-1}) p(x_k | D_{k-1}) = p(y_k | x_k) p(x_k | D_{k-1})
Bayes Filters (second pass)
• Estimating system state from noisy observations
• System state dynamics: x_t = f(x_{t-1}, w_t)
• Observation dynamics: z_t = g(x_t, v_t)
• We are interested in the belief, or posterior density:
  Bel(x_t) = p(x_t | z_1, ..., z_t)
• From the above, construct the two steps of Bayes filters:
• Predict:
  p(x_t | z_{1:(t-1)}) = ∫ p(x_t | x_{t-1}, z_{1:(t-1)}) p(x_{t-1} | z_{1:(t-1)}) dx_{t-1}
• Update:
  p(x_t | z_{1:(t-1)}, z_t) = p(z_t | x_t, z_{1:(t-1)}) p(x_t | z_{1:(t-1)}) / p(z_t | z_{1:(t-1)})
where z_{1:(t-1)} = z_1, ..., z_{t-1}
• Assumptions: Markov process
  – replace p(x_t | x_{t-1}, z_{1:(t-1)}) with p(x_t | x_{t-1})
  – replace p(z_t | x_t, z_{1:(t-1)}) with p(z_t | x_t)
• Predict:
  p(x_t | z_{1:(t-1)}) = ∫ p(x_t | x_{t-1}) p(x_{t-1} | z_{1:(t-1)}) dx_{t-1}
• Update:
  p(x_t | z_{1:(t-1)}, z_t) = p(z_t | x_t) p(x_t | z_{1:(t-1)}) / p(z_t | z_{1:(t-1)})
• Bayes Filter:
  p(x_t | z_{1:t}) ∝ p(z_t | x_t) p(x_t | z_{1:(t-1)})
  p(x_t | z_{1:(t-1)}) = ∫ p(x_t | x_{t-1}) p(x_{t-1} | z_{1:(t-1)}) dx_{t-1}
• How to use it? What else do we need to know?
• Motion model: p(x_t | x_{t-1})
• Perceptual model: p(z_t | x_t)
• Start from:
  p(x_0 | z_0) = p(z_0 | x_0) p(x_0) / p(z_0)
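The recursion above can be sketched numerically on a discretized state space. In the sketch below a robot lives in three grid cells; the motion kernel and the sensor likelihood are made-up illustrative numbers, not values from the slides:

```python
# Discrete Bayes filter: predict with the motion model, update with the
# perceptual model. All numbers here are illustrative assumptions.

def predict(belief, motion_kernel):
    """p(x_t | z_{1:t-1}) = sum over x_{t-1} of p(x_t | x_{t-1}) p(x_{t-1} | z_{1:t-1})."""
    n = len(belief)
    return [sum(motion_kernel[j][i] * belief[j] for j in range(n))
            for i in range(n)]

def update(belief, likelihood):
    """p(x_t | z_{1:t}) proportional to p(z_t | x_t) p(x_t | z_{1:t-1}), then normalize."""
    posterior = [l * b for l, b in zip(likelihood, belief)]
    eta = sum(posterior)
    return [p / eta for p in posterior]

# motion[j][i] = probability of moving from cell j to cell i
# (move one cell right with prob 0.8, stay put with prob 0.2, wrap around)
motion = [[0.2, 0.8, 0.0],
          [0.0, 0.2, 0.8],
          [0.8, 0.0, 0.2]]
belief = [1/3, 1/3, 1/3]                  # uniform prior p(x_0)
belief = predict(belief, motion)          # prediction step
belief = update(belief, [0.9, 0.1, 0.1])  # sensor strongly favors cell 0
print(belief)                             # cell 0 is now the most likely
```

The particle filter replaces the explicit grid with a set of weighted samples, but the predict/update cycle is the same.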
Particle Filters: Compare Gaussian and Particle Filters
Example 1: theoretical PDF
• Example 1: theoretical PDF
• Step 0: initialization
  Bel(x_0), i.e. the prior p(x_0)
• Step 1: updating
  Bel(x_0) or p(x_0 | z_0) ∝ p(z_0 | x_0) p(x_0)
Example 2: Particle Filter
• Step 0: initialization. Each particle has the same weight.
• Step 1: updating weights. Weights are proportional to p(z|x).
• Example 1 (continued)
• Step 2: predicting
  Bel(x_1) or p(x_1 | z_0) = ∫ p(x_1 | x_0) p(x_0 | z_0) dx_0
• Step 3: updating
  Bel(x_1) or p(x_1 | z_1) ∝ p(z_1 | x_1) p(x_1 | z_0)
• Step 4: predicting
  Bel(x_2) or p(x_2 | z_1) = ∫ p(x_2 | x_1) p(x_1 | z_1) dx_1
Robot Motion
Example 2: Particle Filter
Example 2: Particle Filter
• Step 2: predicting. Predict the new locations of particles.
• Step 3: updating weights. Weights are proportional to p(z|x).
• Step 4: predicting. Predict the new locations of particles.
• Particles become more concentrated in the region where the person is more likely to be.
Robot Motion
Compare Particle Filter with Bayes Filter with Known Distribution

(Figure: the predicting and updating steps of Example 1 and Example 2, shown side by side.)
Classical approximations
• Analytical methods:
  – Extended Kalman filter
  – Gaussian sums… (Alspach et al. 1971)
• These perform poorly in numerous cases of interest
• Numerical methods:
  – point-mass approximations
  – splines (Bucy 1971, de Figueiro 1974…)
• Very complex to implement, and not flexible
Monte Carlo Localization
Mobile Robot Localization
• Each particle is a potential pose of the robot
• The proposal distribution is the motion model of the robot (prediction step)
• The observation model is used to compute the importance weight (correction step)
Monte Carlo Localization
• Each particle is a potential pose of the robot
• The proposal distribution is the motion model of the robot (prediction step)
• The observation model is used to compute the importance weight (correction step)
Sample-based Localization (sonar)
Random samples and the pdf (I)
• Take p(x) = Gamma(4,1)
• Generate some random samples
• Plot a histogram and a basic approximation to the pdf

(Figure: histogram and pdf approximation from 200 samples.)
Random samples and the pdf (II)

(Figures: histograms and pdf approximations from 500 and 1000 samples.)
Random samples and the pdf (III)

(Figures: pdf approximations from 5000 and 200000 samples.)
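The experiment in these three slides can be sketched in a few lines of Python. A Gamma(4,1) variable is the sum of four independent Exp(1) variables (valid because the shape parameter is an integer); the bin width and sample count below are arbitrary choices:

```python
import random

def gamma41_samples(n, seed=0):
    """Draw n samples from Gamma(shape=4, scale=1) as sums of four Exp(1) draws."""
    rng = random.Random(seed)
    return [sum(rng.expovariate(1.0) for _ in range(4)) for _ in range(n)]

def histogram_pdf(samples, bin_width=0.5, x_max=20.0):
    """Normalized histogram: a piecewise-constant approximation to the pdf."""
    n_bins = int(x_max / bin_width)
    counts = [0] * n_bins
    for s in samples:
        b = int(s / bin_width)
        if b < n_bins:
            counts[b] += 1
    return [c / (len(samples) * bin_width) for c in counts]

samples = gamma41_samples(5000)
pdf_hat = histogram_pdf(samples)
# The sample mean should approach the true mean, shape * scale = 4
print(sum(samples) / len(samples))
```

As the slides illustrate, the histogram approximation tightens around the true pdf as the number of samples grows.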
Importance Sampling
Importance Sampling
• Unfortunately it is often not possible to sample directly from the posterior distribution, but we can use importance sampling.
• Let p(x) be a pdf from which it is difficult to draw samples.
• Let x^i ~ q(x), i = 1, ..., N, be samples that are easily generated from a proposal pdf q, which is called an importance density.
• Then an approximation to the density p is given by
  p(x) ≈ Σ_{i=1}^{N} w^i δ(x − x^i)
• where
  w^i ∝ p(x^i) / q(x^i)
Bayesian Importance Sampling
• By drawing samples x_{0:k}^i from a known, easy-to-sample proposal distribution q(x_{0:k} | D_k) we obtain:
  p(x_{0:k} | D_k) ≈ Σ_{i=1}^{N} w_k^i δ(x_{0:k} − x_{0:k}^i)
• where
  w_k^i ∝ p(x_{0:k}^i | D_k) / q(x_{0:k}^i | D_k)
• are the normalized weights.
Sensor Information: Importance Sampling
Sequential Importance Sampling (I)
• Factorizing the proposal distribution:
  q(x_{0:k} | D_k) = q(x_0) ∏_{j=1}^{k} q(x_j | x_{0:j-1}, D_j)
• The factorization is obtained by recursively applying
  q(x_{0:k} | D_k) = q(x_k | x_{0:k-1}, D_k) q(x_{0:k-1} | D_{k-1})
• and, remembering that the state evolution is modeled as a Markov process,
• we obtain a recursive estimate of the importance weights:
  w_k = w_{k-1} · p(y_k | x_k) p(x_k | x_{k-1}) / q(x_k | x_{0:k-1}, D_k)
Sequential Importance Sampling (SIS) Particle Filter

SIS Particle Filter Algorithm:
  [{x_k^i, w_k^i}_{i=1}^{N}] = SIS[ {x_{k-1}^i, w_{k-1}^i}_{i=1}^{N}, z_k ]
• for i = 1:N
  – Draw a particle:
    x_k^i ~ q(x_k | x_{k-1}^i, z_k)
  – Assign a weight:
    w_k^i = w_{k-1}^i · p(z_k | x_k^i) p(x_k^i | x_{k-1}^i) / q(x_k^i | x_{0:k-1}^i, D_k)
• end
(k is the index over time and i is the particle index)
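A minimal SIS step in Python. It uses the transition prior as the proposal q, so the weight update reduces to multiplying by the likelihood p(z_k | x_k^i); the 1-D random-walk motion model and Gaussian observation model are assumptions made for this sketch, not from the slides:

```python
import math
import random

def likelihood(z, x, obs_sigma=1.0):
    """Observation model p(z_k | x_k): Gaussian noise around the state."""
    return math.exp(-0.5 * ((z - x) / obs_sigma) ** 2)

def sis_step(particles, weights, z, rng, q_sigma=0.5):
    """One SIS step with the transition prior as the proposal:
    x_k^i ~ p(x_k | x_{k-1}^i), hence w_k^i proportional to w_{k-1}^i * p(z_k | x_k^i)."""
    new_particles = [x + rng.gauss(0.0, q_sigma) for x in particles]
    new_weights = [w * likelihood(z, x)
                   for w, x in zip(weights, new_particles)]
    total = sum(new_weights)
    return new_particles, [w / total for w in new_weights]

rng = random.Random(1)
n = 500
particles = [rng.gauss(0.0, 1.0) for _ in range(n)]  # samples from the prior
weights = [1.0 / n] * n
for z in [0.2, 0.5, 0.9, 1.4]:                       # incoming observations
    particles, weights = sis_step(particles, weights, z, rng)
estimate = sum(w * x for w, x in zip(weights, particles))
print(estimate)   # weighted-mean state estimate after four observations
```

Note that plain SIS never resamples, so after many steps most weights collapse toward zero; that degeneracy is exactly what the resampling step in SIR addresses.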
Rejection Sampling
• Let us assume that f(x) < 1 for all x
• Sample x from a uniform distribution
• Sample c from [0,1]
• If f(x) > c, keep the sample; otherwise reject it

(Figure: an accepted sample x with f(x) > c, marked OK, and a rejected sample x' with f(x') < c'.)
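The four steps above, sketched in Python for a toy density f(x) = x on [0, 1] (an arbitrary choice that satisfies f(x) < 1):

```python
import random

def rejection_sample(f, n, seed=0):
    """Rejection sampling as on the slide, assuming 0 <= f(x) <= 1 on [0, 1]:
    sample x uniformly, sample c uniformly from [0, 1],
    keep x if f(x) > c, otherwise reject it."""
    rng = random.Random(seed)
    accepted = []
    while len(accepted) < n:
        x = rng.random()
        c = rng.random()
        if f(x) > c:
            accepted.append(x)
    return accepted

# Toy density f(x) = x on [0, 1]; the normalized pdf is 2x, whose mean is 2/3.
samples = rejection_sample(lambda x: x, 20_000)
print(sum(samples) / len(samples))   # should approach 2/3
```

Rejection sampling throws candidate samples away, whereas importance sampling keeps them all and compensates with weights; the next slides build on the latter.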
Importance Sampling with Resampling: Landmark Detection Example
Distributions
Distributions
• Wanted: samples distributed according to p(x | z1, z2, z3)
This is Easy!
• We can draw samples from p(x | z_l) by adding noise to the detection parameters.
Importance Sampling with Resampling
• After resampling
Particle Filter Algorithm
weight = target distribution / proposal distribution
Particle Filter Algorithm
• Draw x_{t-1}^i from Bel(x_{t-1})
• Draw x_t^i from p(x_t | x_{t-1}^i, u_{t-1})
• Importance factor for x_t^i:
  w_t^i = target distribution / proposal distribution
        = p(z_t | x_t) p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}) / [ p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}) ]
        ∝ p(z_t | x_t)
• Recall:
  Bel(x_t) ∝ p(z_t | x_t) ∫ p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}) dx_{t-1}
Particle Filter Algorithm

1. Algorithm particle_filter(S_{t-1}, u_{t-1}, z_t):
2.   S_t = ∅, η = 0
3.   For i = 1 ... n                  (generate new samples)
4.     Sample index j(i) from the discrete distribution given by w_{t-1}
5.     Sample x_t^i from p(x_t | x_{t-1}, u_{t-1}) using x_{t-1}^{j(i)} and u_{t-1}
6.     Compute importance weight w_t^i = p(z_t | x_t^i)
7.     Update normalization factor η = η + w_t^i
8.     Insert <x_t^i, w_t^i> into S_t
9.   For i = 1 ... n
10.    Normalize weights: w_t^i = w_t^i / η
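Lines 1 to 10 above translate almost directly into Python. The 1-D additive motion model and the Gaussian observation likelihood below are illustrative assumptions, not part of the algorithm itself:

```python
import math
import random

def particle_filter_step(S_prev, u, z, rng, motion_sigma=0.5, obs_sigma=1.0):
    """One call of particle_filter(S_{t-1}, u_{t-1}, z_t) from the slide."""
    n = len(S_prev)
    xs = [x for x, _ in S_prev]
    ws = [w for _, w in S_prev]
    # lines 3-4: sample indices j(i) from the discrete distribution w_{t-1}
    ancestors = rng.choices(xs, weights=ws, k=n)
    S, eta = [], 0.0
    for x_prev in ancestors:
        # line 5: sample x_t^i from p(x_t | x_{t-1}^{j(i)}, u_{t-1})
        x = x_prev + u + rng.gauss(0.0, motion_sigma)
        # line 6: importance weight w_t^i = p(z_t | x_t^i)
        w = math.exp(-0.5 * ((z - x) / obs_sigma) ** 2)
        eta += w              # line 7: update the normalization factor
        S.append((x, w))      # line 8: insert into S_t
    # lines 9-10: normalize the weights
    return [(x, w / eta) for x, w in S]

rng = random.Random(7)
S = [(rng.uniform(-5.0, 5.0), 1.0 / 300) for _ in range(300)]  # uniform prior
for u, z in [(1.0, 1.1), (1.0, 2.0), (1.0, 3.2)]:  # (control, observation)
    S = particle_filter_step(S, u, z, rng)
print(sum(w * x for x, w in S))   # posterior mean, near the last observation
```

Resampling at the start of every step (lines 3-4) is what distinguishes this SIR-style filter from plain SIS: low-weight particles die out and high-weight ones are duplicated.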
Particle Filter for Localization
Particle Filter in Matlab
• Matlab code: truex is a vector of 100 positions to be tracked.
Application: Particle Filter for Localization (Known Map)
Sources
• Longin Jan Latecki • Keith Copsey • Paul E. Rybski • Cyrill Stachniss • Sebastian Thrun • Alex Teichman • Michael Pfeiffer • J. Hightower • L. Liao • D. Schulz • G. Borriello • Honggang Zhang • Wolfram Burgard • Dieter Fox • Giorgio Grisetti • Maren Bennewitz • Christian Plagemann • Dirk Haehnel • Mike Montemerlo • Nick Roy • Kai Arras • Patrick Pfaff • Miodrag Bolic • Haris Baltzakis
Perfect Monte Carlo simulation
• Recall that x_{0:k} = (x_0, ..., x_k)
• Random samples x_{0:k}^i are drawn from the posterior distribution
• Represent the posterior distribution using a set of samples, or particles:
  p(x_{0:k} | D_k) ≈ (1/N) Σ_{i=1}^{N} δ(x_{0:k} − x_{0:k}^i)
• Easy to approximate expectations of the form
  E(g(x_{0:k})) = ∫ g(x_{0:k}) p(x_{0:k} | D_k) dx_{0:k}
• by:
  E(g(x_{0:k})) ≈ (1/N) Σ_{i=1}^{N} g(x_{0:k}^i)
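A sketch of this approximation, pretending (as "perfect" Monte Carlo does) that we can sample directly from the posterior. The posterior used here, N(1, 2²), and the test function g(x) = x² are made-up choices for illustration:

```python
import random

def mc_expectation(g, sampler, n=50_000):
    """Perfect Monte Carlo: E[g(x)] ≈ (1/N) * sum_i g(x^i), with the x^i
    drawn directly from the posterior (possible here only by assumption)."""
    return sum(g(sampler()) for _ in range(n)) / n

rng = random.Random(0)
# Pretend the posterior is a Gaussian N(1, 2^2) we can sample from directly.
posterior_sampler = lambda: rng.gauss(1.0, 2.0)

# For this posterior, E[x^2] = variance + mean^2 = 4 + 1 = 5
print(mc_expectation(lambda x: x ** 2, posterior_sampler))
```

In practice the posterior cannot be sampled directly, which is why the preceding slides fall back on importance sampling with a tractable proposal.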