ICCV 2009: MAP Inference in Discrete Models: Part 3
![Page 1: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/1.jpg)
MAP Inference in Discrete Models
M. Pawan Kumar, Stanford University
![Page 2: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/2.jpg)
The Problem
E(x) = ∑i fi(xi) + ∑ij gij(xi, xj) + ∑c hc(xc)
Unary Pairwise Higher Order
Minimize E(x) ….. Done !!!
Problems worthy of attack
Prove their worth by fighting back
![Page 3: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/3.jpg)
Energy Minimization Problems
Space of Problems
CSP
MAXCUT
NP-hard
![Page 4: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/4.jpg)
The Issues
• Which functions are exactly solvable?
Boros Hammer [1965], Kolmogorov Zabih [ECCV 2002, PAMI 2004], Ishikawa [PAMI 2003], Schlesinger [EMMCVPR 2007], Kohli Kumar Torr [CVPR 2007, PAMI 2008], Ramalingam Kohli Alahari Torr [CVPR 2008], Kohli Ladicky Torr [CVPR 2008, IJCV 2009], Zivny Jeavons [CP 2008]
• Approximate solutions of NP-hard problems
Schlesinger [76], Kleinberg and Tardos [FOCS 99], Chekuri et al. [01], Boykov et al. [PAMI 01], Wainwright et al. [NIPS 01], Werner [PAMI 2007], Komodakis et al. [PAMI 05, 07], Lempitsky et al. [ICCV 2007], Kumar et al. [NIPS 2007], Kumar et al. [ICML 2008], Sontag and Jaakkola [NIPS 2007], Kohli et al. [ICML 2008], Kohli et al. [CVPR 2008, IJCV 2009], Rother et al. [2009]
• Scalability and Efficiency
Kohli Torr [ICCV 2005, PAMI 2007], Juan and Boykov [CVPR 2006], Alahari Kohli Torr [CVPR 2008], Delong and Boykov [CVPR 2008]
![Page 5: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/5.jpg)
The Issues
• Which functions are exactly solvable?Boros Hammer [1965], Kolmogorov Zabih [ECCV 2002, PAMI 2004] , Ishikawa [PAMI
2003], Schlesinger [EMMCVPR 2007], Kohli Kumar Torr [CVPR2007, PAMI 2008] ,
Ramalingam Kohli Alahari Torr [CVPR 2008] , Kohli Ladicky Torr [CVPR 2008, IJCV
2009] , Zivny Jeavons [CP 2008]
• Approximate solutions of NP-hard problemsSchlesinger [76 ], Kleinberg and Tardos [FOCS 99], Chekuri et al. [01], Boykov et al.
[PAMI 01], Wainwright et al. [NIPS01], Werner [PAMI 2007], Komodakis et al. [PAMI, 05
07], Lempitsky et al. [ICCV 2007], Kumar et al. [NIPS 2007], Kumar et al. [ICML 2008],
Sontag and Jakkola [NIPS 2007], Kohli et al. [ICML 2008], Kohli et al. [CVPR 2008,
IJCV 2009], Rother et al. [2009]
• Scalability and Efficiency Kohli Torr [ICCV 2005, PAMI 2007], Juan and Boykov [CVPR 2006], Alahari Kohli Torr
[CVPR 2008] , Delong and Boykov [CVPR 2008]
![Page 6: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/6.jpg)
Popular Inference Methods
Iterated Conditional Modes (ICM)
Simulated Annealing
Dynamic Programming (DP)
Belief Propagation (BP)
Tree-Reweighted (TRW), Diffusion
Graph Cut (GC)
Branch & Bound
Relaxation methods:
…
Classical Move making algorithms
Combinatorial Algorithms
Message passing
Convex Optimization (Linear Programming, ...)
![Page 7: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/7.jpg)
Road Map
9.30-10.00 Introduction (Andrew Blake)
10.00-11.00 Discrete Models in Computer Vision (Carsten Rother)
15min Coffee break
11.15-12.30 Message Passing: DP, TRW, LP relaxation (Pawan
Kumar)
12.30-13.00 Quadratic pseudo-boolean optimization (Pushmeet
Kohli)
1hour Lunch break
14:00-15.00 Transformation and move-making methods (Pushmeet
Kohli)
15:00-15.30 Speed and Efficiency (Pushmeet Kohli)
15min Coffee break
15:45-16.15 Comparison of Methods (Carsten Rother)
16:30-17.30 Recent Advances: Dual-decomposition, higher-order,
etc. (Carsten Rother + Pawan Kumar)
![Page 8: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/8.jpg)
Notation
Labels - li, lj, …
Labeling - f : {a, b, …} → {i, j, …}
Random variables - Va, Vb, …
Unary Potential - θa;f(a)
Pairwise Potential - θab;f(a)f(b)
![Page 9: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/9.jpg)
Notation
MAP Estimation
f* = arg min Q(f; θ)

Energy Function
Q(f; θ) = ∑a θa;f(a) + ∑(a,b) θab;f(a)f(b)

Min-marginals
qa;i = min Q(f; θ) s.t. f(a) = i
![Page 10: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/10.jpg)
Outline
• Reparameterization
• Belief Propagation
• Tree-reweighted Message Passing
![Page 11: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/11.jpg)
Reparameterization
[Figure: Va — Vb with unary potentials θa;0 = 5, θa;1 = 2, θb;0 = 2, θb;1 = 4 and pairwise potentials θab;00 = 0, θab;01 = 1, θab;10 = 1, θab;11 = 0]

f(a) f(b) Q(f; θ)
0 0 7
0 1 10
1 0 5
1 1 6

Add a constant (+2) to all θa;i
Subtract that constant (−2) from all θb;k
![Page 12: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/12.jpg)
Reparameterization
f(a) f(b) Q(f; θ')
0 0 7 + 2 − 2
0 1 10 + 2 − 2
1 0 5 + 2 − 2
1 1 6 + 2 − 2

Add a constant to all θa;i
Subtract that constant from all θb;k

Q(f; θ') = Q(f; θ)

[Figure: the reparameterized model — θ'a;0 = 7, θ'a;1 = 4, θ'b;0 = 0, θ'b;1 = 2, pairwise potentials unchanged]
![Page 13: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/13.jpg)
Reparameterization
[Figure: the same Va — Vb model]

f(a) f(b) Q(f; θ)
0 0 7
0 1 10
1 0 5
1 1 6

Add a constant (+3) to one θb;k (here θb;1)
Subtract that constant (−3) from θab;i1 for all i
![Page 14: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/14.jpg)
Reparameterization
[Figure: the same Va — Vb model]

f(a) f(b) Q(f; θ')
0 0 7
0 1 10 − 3 + 3
1 0 5
1 1 6 − 3 + 3

Q(f; θ') = Q(f; θ)

Add a constant to one θb;k
Subtract that constant from θab;ik for all i
![Page 15: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/15.jpg)
Reparameterization
[Figure: three further examples on Va — Vb, each adding a constant to some potentials and subtracting it from others; every Q(f; θ) is unchanged]

θ'a;i = θa;i + Mba;i
θ'b;k = θb;k + Mab;k
θ'ab;ik = θab;ik − Mab;k − Mba;i

Q(f; θ') = Q(f; θ)
![Page 16: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/16.jpg)
Reparameterization
θ' is a reparameterization of θ (θ' ≡ θ) iff
Q(f; θ') = Q(f; θ), for all f

Equivalently (Kolmogorov, PAMI, 2006):
θ'a;i = θa;i + Mba;i
θ'b;k = θb;k + Mab;k
θ'ab;ik = θab;ik − Mab;k − Mba;i

[Figure: the earlier Va — Vb example with the +2/−2 constants]
![Page 17: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/17.jpg)
Recap
MAP Estimation
f* = arg min Q(f; θ)

Energy Function
Q(f; θ) = ∑a θa;f(a) + ∑(a,b) θab;f(a)f(b)

Min-marginals
qa;i = min Q(f; θ) s.t. f(a) = i

Reparameterization
θ' ≡ θ: Q(f; θ') = Q(f; θ), for all f
![Page 18: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/18.jpg)
Outline
• Reparameterization
• Belief Propagation
– Exact MAP for Chains and Trees
– Approximate MAP for general graphs
– Computational Issues and Theoretical Properties
• Tree-reweighted Message Passing
![Page 19: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/19.jpg)
Belief Propagation
• Belief Propagation gives exact MAP for chains
• Some MAP problems are easy
• Exact MAP for trees
• Clever Reparameterization
![Page 20: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/20.jpg)
Two Variables
Two Variables

[Figure: Va — Vb with θa;0 = 5, θa;1 = 2, θb;0 = 2, θb;1 = 4, θab;00 = 0, θab;01 = 1, θab;10 = 1, θab;11 = 0]

Add a constant to one θb;k
Subtract that constant from θab;ik for all i
Choose the right constant: θ'b;k = qb;k
![Page 21: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/21.jpg)
Two Variables

Choose the right constant: θ'b;k = qb;k

Mab;0 = min { θa;0 + θab;00, θa;1 + θab;10 } = min { 5 + 0, 2 + 1 } = 3
![Page 22: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/22.jpg)
Two Variables

Choose the right constant: θ'b;k = qb;k

[Figure: after adding Mab;0 = 3 — θ'b;0 = 5, θ'ab;00 = −3, θ'ab;10 = −2]
![Page 23: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/23.jpg)
Two Variables

Choose the right constant: θ'b;k = qb;k

θ'b;0 = qb;0, attained by f(a) = 1
Potentials along the red path add up to 0
![Page 24: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/24.jpg)
Two Variables

Choose the right constant: θ'b;k = qb;k

Mab;1 = min { θa;0 + θab;01, θa;1 + θab;11 } = min { 5 + 1, 2 + 0 } = 2
![Page 25: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/25.jpg)
Two Variables

[Figure: after adding Mab;1 = 2 — θ'b;1 = 6, θ'ab;01 = −1, θ'ab;11 = −2]

θ'b;0 = qb;0 (f(a) = 1)
θ'b;1 = qb;1 (f(a) = 1)

Minimum of min-marginals = MAP estimate
![Page 26: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/26.jpg)
Two Variables

θ'b;0 = qb;0 (f(a) = 1)
θ'b;1 = qb;1 (f(a) = 1)

f*(b) = 0, f*(a) = 1
![Page 27: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/27.jpg)
Two Variables

θ'b;0 = qb;0 (f(a) = 1)
θ'b;1 = qb;1 (f(a) = 1)

We get all the min-marginals of Vb
![Page 28: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/28.jpg)
Recap
We only need to know two sets of equations

General form of Reparameterization:
θ'a;i = θa;i + Mba;i
θ'b;k = θb;k + Mab;k
θ'ab;ik = θab;ik − Mab;k − Mba;i

Reparameterization of (a,b) in Belief Propagation:
Mab;k = mini { θa;i + θab;ik }
Mba;i = 0
![Page 29: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/29.jpg)
Three Variables
Three Variables

[Figure: chain Va — Vb — Vc over labels l0, l1, with the earlier Va — Vb potentials plus unary potentials on Vc and pairwise potentials on edge (b,c)]

Reparameterize the edge (a,b) as before
![Page 30: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/30.jpg)
Three Variables

Reparameterize the edge (a,b) as before

[Figure: θ'b;0 = 5 and θ'b;1 = 6, each attained with f(a) = 1]
![Page 31: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/31.jpg)
Three Variables

Reparameterize the edge (a,b) as before

[Figure: as on the previous slide]
Potentials along the red path add up to 0
![Page 32: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/32.jpg)
Three Variables

Reparameterize the edge (b,c) as before

[Figure: as on the previous slide]
Potentials along the red path add up to 0
![Page 33: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/33.jpg)
Three Variables

Reparameterize the edge (b,c) as before

[Figure: after reparameterizing (b,c) — θ'c;0 (via f(b) = 0) and θ'c;1 (via f(b) = 1)]
Potentials along the red path add up to 0
![Page 34: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/34.jpg)
Three Variables

Reparameterize the edge (b,c) as before

θ'c;0 = qc;0 and θ'c;1 = qc;1
Potentials along the red path add up to 0
![Page 35: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/35.jpg)
Three Variables

θ'c;0 = qc;0 and θ'c;1 = qc;1
f*(c) = 0, f*(b) = 0, f*(a) = 1

Generalizes to any length chain
![Page 36: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/36.jpg)
Three Variables

f*(c) = 0, f*(b) = 0, f*(a) = 1

Only Dynamic Programming
![Page 37: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/37.jpg)
Why Dynamic Programming?
3 variables = 2 variables + book-keeping
n variables = (n−1) variables + book-keeping

Start from left, go to right
Reparameterize current edge (a,b):
Mab;k = mini { θa;i + θab;ik }
θ'b;k = θb;k + Mab;k
θ'ab;ik = θab;ik − Mab;k
Repeat
![Page 38: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/38.jpg)
Why Dynamic Programming?
Start from left, go to right
Reparameterize current edge (a,b):
Mab;k = mini { θa;i + θab;ik }
θ'b;k = θb;k + Mab;k
θ'ab;ik = θab;ik − Mab;k
Repeat

Messages → Message Passing
Why stop at dynamic programming?
![Page 39: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/39.jpg)
Three Variables

[Figure: the chain after the forward pass]
Reparameterize the edge (c,b) as before
![Page 40: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/40.jpg)
Three Variables

Reparameterize the edge (c,b) as before
θ'b;i = qb;i
![Page 41: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/41.jpg)
Three Variables

Reparameterize the edge (b,a) as before
![Page 42: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/42.jpg)
Three Variables

Reparameterize the edge (b,a) as before
θ'a;i = qa;i
![Page 43: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/43.jpg)
Three Variables

Forward Pass + Backward Pass
All min-marginals are computed
![Page 44: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/44.jpg)
Belief Propagation on Chains
Start from left, go to right
Reparameterize current edge (a,b):
Mab;k = mini { θa;i + θab;ik }
θ'b;k = θb;k + Mab;k
θ'ab;ik = θab;ik − Mab;k
Repeat till the end of the chain

Start from right, go to left
Repeat till the end of the chain
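The two passes above can be sketched as min-sum message passing over arrays (a sketch, assuming NumPy; `chain_bp` is a name of my own, not the tutorial's code):

```python
import numpy as np

def chain_bp(unary, pairwise):
    """Min-sum belief propagation on a chain.

    unary:    (n, L) array of theta_a;i
    pairwise: (n-1, L, L) array of theta_ab;ik for consecutive edges
    Returns the (n, L) min-marginals q and a MAP labeling."""
    n, L = unary.shape
    fwd = np.zeros((n, L))  # constants accumulated in the forward pass
    bwd = np.zeros((n, L))  # constants accumulated in the backward pass
    # Forward pass: M_ab;k = min_i { theta_a;i + fwd_a;i + theta_ab;ik }
    for a in range(n - 1):
        fwd[a + 1] = np.min((unary[a] + fwd[a])[:, None] + pairwise[a], axis=0)
    # Backward pass: the same update in the reverse direction
    for a in range(n - 1, 0, -1):
        bwd[a - 1] = np.min(pairwise[a - 1] + unary[a] + bwd[a], axis=1)
    q = unary + fwd + bwd       # min-marginals of every variable
    return q, q.argmin(axis=1)  # MAP labeling (per-variable minimizers)

# Two-variable example from the slides: q_b;0 = 5, q_b;1 = 6, f* = (1, 0)
unary = np.array([[5.0, 2.0], [2.0, 4.0]])
pairwise = np.array([[[0.0, 1.0], [1.0, 0.0]]])
q, f_star = chain_bp(unary, pairwise)
print(q.tolist())       # [[7.0, 5.0], [5.0, 6.0]]
print(f_star.tolist())  # [1, 0]
```

Each forward update is the slide's Mab;k with the already-accumulated constants folded into the unary term; taking per-variable minimizers of q recovers the MAP labeling when the minimizers are unique.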
![Page 45: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/45.jpg)
Belief Propagation on Chains
• A way of computing reparam constants
• Generalizes to chains of any length
• Forward Pass - Start to End
• MAP estimate
• Min-marginals of final variable
• Backward Pass - End to start
• All other min-marginals
Won’t need this .. But good to know
![Page 46: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/46.jpg)
Computational Complexity
• Each constant takes O(|L|)
• Number of constants - O(|E||L|)
⇒ Complexity O(|E||L|²)
• Memory required? O(|E||L|)
![Page 47: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/47.jpg)
Belief Propagation on Trees
[Figure: tree with variables Va, Vb, Vc, Vd, Ve, Vg, Vh]

Forward Pass: Leaf → Root
Backward Pass: Root → Leaf
All min-marginals are computed
![Page 48: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/48.jpg)
Outline
• Reparameterization
• Belief Propagation
– Exact MAP for Chains and Trees
– Approximate MAP for general graphs
– Computational Issues and Theoretical Properties
• Tree-reweighted Message Passing
![Page 49: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/49.jpg)
Belief Propagation on Cycles
[Figure: cycle Va — Vb — Vc — Vd with unary potentials θa;i, θb;i, θc;i, θd;i]

Where do we start? Arbitrarily
Reparameterize (a,b)
![Page 50: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/50.jpg)
Belief Propagation on Cycles
[Figure: after reparameterizing (a,b) — θ'b;0, θ'b;1]
Potentials along the red path add up to 0
![Page 51: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/51.jpg)
Belief Propagation on Cycles
[Figure: after reparameterizing (b,c) — θ'c;0, θ'c;1]
Potentials along the red path add up to 0
![Page 52: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/52.jpg)
Belief Propagation on Cycles
[Figure: after reparameterizing (c,d) — θ'd;0, θ'd;1]
Potentials along the red path add up to 0
![Page 53: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/53.jpg)
Belief Propagation on Cycles
[Figure: after reparameterizing (d,a) — θ'a;0, θ'a;1]
Potentials along the red path now add up to − θa;0 (resp. − θa;1), not 0
Did not obtain min-marginals
![Page 54: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/54.jpg)
Belief Propagation on Cycles
[Figure: the reparameterized cycle]
Reparameterize (a,b) again
![Page 55: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/55.jpg)
Belief Propagation on Cycles
[Figure: the cycle with θ''b;0, θ''b;1]
Reparameterize (a,b) again
But doesn't this overcount some potentials?
![Page 56: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/56.jpg)
Belief Propagation on Cycles
Reparameterize (a,b) again
Yes. But we will do it anyway
![Page 57: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/57.jpg)
Belief Propagation on Cycles
Keep reparameterizing edges in some order
No convergence guarantees
![Page 58: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/58.jpg)
Belief Propagation
• Generalizes to any arbitrary random field
• Complexity per iteration? O(|E||L|²)
• Memory required? O(|E||L|)
![Page 59: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/59.jpg)
Outline
• Reparameterization
• Belief Propagation
– Exact MAP for Chains and Trees
– Approximate MAP for general graphs
– Computational Issues and Theoretical Properties
• Tree-reweighted Message Passing
![Page 60: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/60.jpg)
Computational Issues of BP
Complexity per iteration: O(|E||L|²)

Special pairwise potentials: θab;ik = wab d(|i − k|)

[Figure: d vs |i − k| for Potts, Truncated Linear, and Truncated Quadratic]

O(|E||L|) — Felzenszwalb & Huttenlocher, 2004
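For the Potts case the speed-up is easy to see: every message entry is either the entry itself or the global minimum plus wab. A small sketch (assuming NumPy and wab ≥ 0; `potts_message` is a name of my own) computes one message in O(|L|) and checks it against the O(|L|²) definition:

```python
import numpy as np

def potts_message(h, w):
    """Min-sum message for a Potts edge theta_ab;ik = w * [i != k], w >= 0.
    M_ab;k = min_i { h_i + w*[i != k] } = min(h_k, min_i h_i + w): O(|L|)."""
    return np.minimum(h, h.min() + w)

# Check against the direct O(|L|^2) computation.
h = np.array([5.0, 2.0, 7.0, 1.0])   # unary (plus incoming) costs at Va
w = 3.0                              # Potts weight w_ab
L = len(h)
brute = np.array([min(h[i] + (w if i != k else 0.0) for i in range(L))
                  for k in range(L)])
assert np.allclose(potts_message(h, w), brute)
print(potts_message(h, w).tolist())  # [4.0, 2.0, 4.0, 1.0]
```

Truncated linear and truncated quadratic potentials admit a similar O(|L|) message via distance transforms, which is the Felzenszwalb-Huttenlocher construction cited above.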
![Page 61: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/61.jpg)
Computational Issues of BP
Memory requirements: O(|E||L|)

Half of original BP — Kolmogorov, 2006
Some approximations exist, but memory still remains an issue
Yu, Lin, Super and Tan, 2007; Lasserre, Kannan and Winn, 2007
![Page 62: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/62.jpg)
Computational Issues of BP
Order of reparameterization:
• In some fixed order
• Randomly
• The one that results in maximum change
(Residual Belief Propagation — Elidan et al., 2006)
![Page 63: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/63.jpg)
Summary of BP
Exact for chains
Exact for trees
Approximate MAP for general cases
Convergence not guaranteed
So can we do something better?
![Page 64: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/64.jpg)
Outline
• Reparameterization
• Belief Propagation
• Tree-reweighted Message Passing
– Integer Programming Formulation
– Linear Programming Relaxation and its Dual
– Convergent Solution for Dual
– Computational Issues and Theoretical Properties
![Page 65: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/65.jpg)
Integer Programming Formulation
[Figure: Va — Vb over labels l0, l1]

Unary Potentials
θa;0 = 5, θa;1 = 2
θb;0 = 2, θb;1 = 4

Labeling
f(a) = 1, f(b) = 0

ya;0 = 0, ya;1 = 1
yb;0 = 1, yb;1 = 0

Any f(.) has equivalent boolean variables ya;i
![Page 66: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/66.jpg)
Integer Programming Formulation
[Figure: the same Va — Vb model]

Unary Potentials
θa;0 = 5, θa;1 = 2
θb;0 = 2, θb;1 = 4

Labeling
f(a) = 1, f(b) = 0

ya;0 = 0, ya;1 = 1
yb;0 = 1, yb;1 = 0

Find the optimal variables ya;i
![Page 67: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/67.jpg)
Integer Programming Formulation
[Figure: the same Va — Vb model]

Unary Potentials
θa;0 = 5, θa;1 = 2
θb;0 = 2, θb;1 = 4

Sum of Unary Potentials
∑a ∑i θa;i ya;i

ya;i ∈ {0,1}, for all Va, li
∑i ya;i = 1, for all Va
![Page 68: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/68.jpg)
Integer Programming Formulation
[Figure: the same Va — Vb model]

Pairwise Potentials
θab;00 = 0, θab;01 = 1
θab;10 = 1, θab;11 = 0

Sum of Pairwise Potentials
∑(a,b) ∑ik θab;ik ya;i yb;k

ya;i ∈ {0,1}
∑i ya;i = 1
![Page 69: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/69.jpg)
Integer Programming Formulation
[Figure: the same Va — Vb model]

Pairwise Potentials
θab;00 = 0, θab;01 = 1
θab;10 = 1, θab;11 = 0

Sum of Pairwise Potentials
∑(a,b) ∑ik θab;ik yab;ik

ya;i ∈ {0,1}
∑i ya;i = 1
yab;ik = ya;i yb;k
![Page 70: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/70.jpg)
Integer Programming Formulation
min ∑a ∑i θa;i ya;i + ∑(a,b) ∑ik θab;ik yab;ik

ya;i ∈ {0,1}
∑i ya;i = 1
yab;ik = ya;i yb;k
![Page 71: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/71.jpg)
Integer Programming Formulation
min θᵀy

ya;i ∈ {0,1}
∑i ya;i = 1
yab;ik = ya;i yb;k

θ = [ … θa;i … ; … θab;ik … ]
y = [ … ya;i … ; … yab;ik … ]
![Page 72: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/72.jpg)
Integer Programming Formulation
min θᵀy

ya;i ∈ {0,1}
∑i ya;i = 1
yab;ik = ya;i yb;k
Solve to obtain MAP labeling y*
![Page 73: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/73.jpg)
Integer Programming Formulation
min θᵀy

ya;i ∈ {0,1}
∑i ya;i = 1
yab;ik = ya;i yb;k
But we can’t solve it in general
![Page 74: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/74.jpg)
Outline
• Reparameterization
• Belief Propagation
• Tree-reweighted Message Passing
– Integer Programming Formulation
– Linear Programming Relaxation and its Dual
– Convergent Solution for Dual
– Computational Issues and Theoretical Properties
![Page 75: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/75.jpg)
Linear Programming Relaxation
min θᵀy

ya;i ∈ {0,1}
∑i ya;i = 1
yab;ik = ya;i yb;k
Two reasons why we can’t solve this
![Page 76: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/76.jpg)
Linear Programming Relaxation
min θᵀy

ya;i ∈ [0,1]
∑i ya;i = 1
yab;ik = ya;i yb;k
One reason why we can’t solve this
![Page 77: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/77.jpg)
Linear Programming Relaxation
min θᵀy

ya;i ∈ [0,1]
∑i ya;i = 1
∑k yab;ik = ∑k ya;i yb;k
One reason why we can’t solve this
![Page 78: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/78.jpg)
Linear Programming Relaxation
min θᵀy

ya;i ∈ [0,1]
∑i ya;i = 1

One reason why we can’t solve this
∑k yab;ik = ya;i ∑k yb;k = ya;i (since ∑k yb;k = 1)
![Page 79: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/79.jpg)
Linear Programming Relaxation
min θᵀy

ya;i ∈ [0,1]
∑i ya;i = 1
∑k yab;ik = ya;i
One reason why we can’t solve this
![Page 80: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/80.jpg)
Linear Programming Relaxation
min θᵀy

ya;i ∈ [0,1]
∑i ya;i = 1
∑k yab;ik = ya;i
No reason why we can’t solve this*
*memory requirements, time complexity
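On the slides' two-variable example the relaxation can be solved directly. Here is a sketch assuming SciPy's `linprog` (the variable ordering and constraint matrix are my own construction); since the graph is a tree, the relaxation is tight and the optimum equals the MAP energy:

```python
import numpy as np
from scipy.optimize import linprog

# Variables: [y_a;0, y_a;1, y_b;0, y_b;1, y_ab;00, y_ab;01, y_ab;10, y_ab;11]
# Costs are the slides' potentials: theta_a = (5, 2), theta_b = (2, 4),
# theta_ab = (0, 1, 1, 0).
c = np.array([5.0, 2.0, 2.0, 4.0, 0.0, 1.0, 1.0, 0.0])

A_eq = np.array([
    [1, 1, 0, 0, 0, 0, 0, 0],   # sum_i y_a;i = 1
    [0, 0, 1, 1, 0, 0, 0, 0],   # sum_i y_b;i = 1
    [-1, 0, 0, 0, 1, 1, 0, 0],  # sum_k y_ab;0k = y_a;0
    [0, -1, 0, 0, 0, 0, 1, 1],  # sum_k y_ab;1k = y_a;1
    [0, 0, -1, 0, 1, 0, 1, 0],  # sum_i y_ab;i0 = y_b;0
    [0, 0, 0, -1, 0, 1, 0, 1],  # sum_i y_ab;i1 = y_b;1
], dtype=float)
b_eq = np.array([1.0, 1.0, 0.0, 0.0, 0.0, 0.0])

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, 1), method="highs")
print(res.fun)  # 5.0 -- the MAP energy (f(a) = 1, f(b) = 0)
print(res.x)    # integral here: y_a;1 = y_b;0 = y_ab;10 = 1
```

On loopy graphs the same LP can return fractional solutions, which is exactly where the dual and TRW machinery of the next sections come in.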
![Page 81: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/81.jpg)
Dual of the LP Relaxation (Wainwright et al., 2001)

[Figure: 3×3 grid of variables Va … Vi]

min θᵀy
ya;i ∈ [0,1]
∑i ya;i = 1
∑k yab;ik = ya;i
![Page 82: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/82.jpg)
Dual of the LP Relaxation (Wainwright et al., 2001)

[Figure: the 3×3 grid decomposed into row trees (1, 2, 3) and column trees (4, 5, 6)]

∑i ρi θi = θ
ρi ≥ 0
![Page 83: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/83.jpg)
Dual of the LP Relaxation (Wainwright et al., 2001)

[Figure: the same tree decomposition, with optimal tree energies q*(θ1) … q*(θ6)]

∑i ρi θi = θ
ρi ≥ 0

Dual of LP: max ∑i ρi q*(θi)
Dual of the LP Relaxation (Wainwright et al., 2001)

[Figure: the same tree decomposition]

∑i ρi θi ≡ θ (the equality is relaxed to a reparameterization)
ρi ≥ 0

Dual of LP: max ∑i ρi q*(θi)
Dual of the LP Relaxation (Wainwright et al., 2001)

∑i ρi θi ≡ θ
max ∑i ρi q*(θi)

I can easily compute q*(θi)
I can easily maintain the reparameterization constraint
![Page 86: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/86.jpg)
Outline
• Reparameterization
• Belief Propagation
• Tree-reweighted Message Passing
– Integer Programming Formulation
– Linear Programming Relaxation and its Dual
– Convergent Solution for Dual
– Computational Issues and Theoretical Properties
![Page 87: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/87.jpg)
TRW Message Passing (Kolmogorov, 2006)

[Figure: the 3×3 grid decomposed into row trees (1, 2, 3) and column trees (4, 5, 6)]

∑i ρi θi ≡ θ
∑i ρi q*(θi)

Pick a variable Va
TRW Message Passing (Kolmogorov, 2006)

∑i ρi θi ≡ θ
∑i ρi q*(θi)

[Figure: tree 1 contains the chain Vc — Vb — Va with potentials θ1c;i, θ1b;i, θ1a;i; tree 4 contains the chain Va — Vd — Vg with potentials θ4a;i, θ4d;i, θ4g;i]
TRW Message Passing (Kolmogorov, 2006)

ρ1 θ1 + ρ4 θ4 + rest
ρ1 q*(θ1) + ρ4 q*(θ4) + K

Reparameterize to obtain min-marginals of Va
![Page 90: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/90.jpg)
TRW Message Passing (Kolmogorov, 2006)
ρ1 θ'1 + ρ4 θ'4 + rest
[Figure: the reparameterized chains with unary potentials θ'1 and θ'4]
One pass of Belief Propagation
ρ1 q*(θ'1) + ρ4 q*(θ'4) + K
![Page 91: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/91.jpg)
TRW Message Passing (Kolmogorov, 2006)
ρ1 θ'1 + ρ4 θ'4 + rest
ρ1 q*(θ'1) + ρ4 q*(θ'4) + K
Remain the same
[Figure: the reparameterized chains with unary potentials θ'1 and θ'4]
![Page 92: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/92.jpg)
TRW Message Passing (Kolmogorov, 2006)
ρ1 θ'1 + ρ4 θ'4 + rest
ρ1 min{θ'1a;0, θ'1a;1} + ρ4 min{θ'4a;0, θ'4a;1} + K
[Figure: the reparameterized chains with unary potentials θ'1 and θ'4]
![Page 93: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/93.jpg)
TRW Message Passing (Kolmogorov, 2006)
ρ1 θ'1 + ρ4 θ'4 + rest
ρ1 min{θ'1a;0, θ'1a;1} + ρ4 min{θ'4a;0, θ'4a;1} + K
Compute weighted average of min-marginals of Va
[Figure: the reparameterized chains with unary potentials θ'1 and θ'4]
![Page 94: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/94.jpg)
TRW Message Passing (Kolmogorov, 2006)
ρ1 θ'1 + ρ4 θ'4 + rest
ρ1 min{θ'1a;0, θ'1a;1} + ρ4 min{θ'4a;0, θ'4a;1} + K
θ''a;0 = (ρ1 θ'1a;0 + ρ4 θ'4a;0) / (ρ1 + ρ4)
θ''a;1 = (ρ1 θ'1a;1 + ρ4 θ'4a;1) / (ρ1 + ρ4)
[Figure: the reparameterized chains with unary potentials θ'1 and θ'4]
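The averaging step is just a per-label convex combination of the two trees' min-marginals at the shared node. A tiny illustration (the function name and numbers are mine):

```python
def node_average(mm1, mm4, rho1=1.0, rho4=1.0):
    """Weighted average of two trees' min-marginals at a shared node:
    theta''a;l = (rho1 * theta'1a;l + rho4 * theta'4a;l) / (rho1 + rho4)."""
    return [(rho1 * a + rho4 * b) / (rho1 + rho4) for a, b in zip(mm1, mm4)]

# Illustrative min-marginals for labels 0 and 1 in trees 1 and 4:
print(node_average([-3.0, 1.0], [-3.0, 10.0]))  # [-3.0, 5.5]
```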
![Page 95: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/95.jpg)
TRW Message Passing (Kolmogorov, 2006)
ρ1 θ''1 + ρ4 θ''4 + rest
ρ1 min{θ'1a;0, θ'1a;1} + ρ4 min{θ'4a;0, θ'4a;1} + K
θ''a;0 = (ρ1 θ'1a;0 + ρ4 θ'4a;0) / (ρ1 + ρ4)
θ''a;1 = (ρ1 θ'1a;1 + ρ4 θ'4a;1) / (ρ1 + ρ4)
[Figure: both chains now carry the averaged unaries θ''a;0, θ''a;1 at Va]
![Page 96: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/96.jpg)
![Page 97: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/97.jpg)
TRW Message Passing (Kolmogorov, 2006)
ρ1 θ''1 + ρ4 θ''4 + rest
ρ1 min{θ''a;0, θ''a;1} + ρ4 min{θ''a;0, θ''a;1} + K
θ''a;0 = (ρ1 θ'1a;0 + ρ4 θ'4a;0) / (ρ1 + ρ4)
θ''a;1 = (ρ1 θ'1a;1 + ρ4 θ'4a;1) / (ρ1 + ρ4)
[Figure: both chains with the averaged unaries at Va]
![Page 98: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/98.jpg)
TRW Message Passing (Kolmogorov, 2006)
ρ1 θ''1 + ρ4 θ''4 + rest
(ρ1 + ρ4) min{θ''a;0, θ''a;1} + K
θ''a;0 = (ρ1 θ'1a;0 + ρ4 θ'4a;0) / (ρ1 + ρ4)
θ''a;1 = (ρ1 θ'1a;1 + ρ4 θ'4a;1) / (ρ1 + ρ4)
[Figure: both chains with the averaged unaries at Va]
![Page 99: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/99.jpg)
TRW Message Passing (Kolmogorov, 2006)
ρ1 θ''1 + ρ4 θ''4 + rest
(ρ1 + ρ4) min{θ''a;0, θ''a;1} + K
min{p1 + p2, q1 + q2} ≥ min{p1, q1} + min{p2, q2}
[Figure: both chains with the averaged unaries at Va]
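This inequality is what guarantees the dual cannot drop: whichever sum attains the left-hand minimum, each of its terms is at least the corresponding term-wise minimum. A quick randomized check (a sketch of mine, not from the slides):

```python
import random

def lhs(p, q):  # the combined problem, solved jointly
    return min(p[0] + p[1], q[0] + q[1])

def rhs(p, q):  # the two coordinates minimized independently
    return min(p[0], q[0]) + min(p[1], q[1])

random.seed(0)
for _ in range(1000):
    p = [random.uniform(-5, 5) for _ in range(2)]
    q = [random.uniform(-5, 5) for _ in range(2)]
    assert lhs(p, q) >= rhs(p, q) - 1e-12  # never violated
print("inequality holds")
```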
![Page 100: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/100.jpg)
TRW Message Passing (Kolmogorov, 2006)
ρ1 θ''1 + ρ4 θ''4 + rest
(ρ1 + ρ4) min{θ''a;0, θ''a;1} + K
Objective function increases or remains constant
[Figure: both chains with the averaged unaries at Va]
![Page 101: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/101.jpg)
TRW Message Passing (Kolmogorov, 2006)
Initialize θi, taking care of the reparameterization constraint
REPEAT
  Choose a random variable Va
  Compute min-marginals of Va for all trees
  Node-average the min-marginals
Can also do edge-averaging
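One node-averaging step can be sketched end to end on a toy pair of chains that share a node. Everything below (potentials, helper names, unit tree weights) is illustrative; the point is only that the lower bound Σi ρi q*(θi) does not decrease after averaging:

```python
import numpy as np

def min_marginals_at_root(unary, pairwise):
    """Min-marginals of the last node of a chain via a min-sum BP sweep."""
    msg = unary[0].copy()
    for k, g in enumerate(pairwise):
        # minimize over the label of node k for every label of node k+1
        msg = unary[k + 1] + (msg[:, None] + g).min(axis=0)
    return msg  # msg[l] = min chain energy with the root fixed to label l

rho = [1.0, 1.0]  # unit tree weights, as in the slides' examples
trees = [  # two 3-node chains whose last node is the shared variable Va
    ([np.array([2., 5.]), np.array([0., 1.]), np.array([1., 4.])],
     [np.array([[0., 2.], [2., 0.]])] * 2),
    ([np.array([3., 1.]), np.array([2., 2.]), np.array([6., 0.])],
     [np.array([[0., 1.], [1., 0.]])] * 2),
]

mm = [min_marginals_at_root(u, g) for u, g in trees]
bound_before = sum(r * m.min() for r, m in zip(rho, mm))
avg = sum(r * m for r, m in zip(rho, mm)) / sum(rho)  # node-averaging at Va
bound_after = sum(rho) * avg.min()
print(bound_before, bound_after)  # 6.0 11.0 — the bound went up
```

Here the two trees disagree on the best label of Va (their min-marginals are minimized at different labels), so averaging strictly increases the bound; with agreement it stays constant, matching the slide.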
![Page 102: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/102.jpg)
Example 1
[Figure: three two-label chains Va – Vb, Vb – Vc, Vc – Va (labels l0, l1) with their unary and pairwise potentials; tree weights ρ1 = ρ2 = ρ3 = 1; per-tree minima 5, 6, 7]
Pick variable Va. Reparameterize.
![Page 103: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/103.jpg)
Example 1
[Figure: the three chains after reparameterizing at Va; ρ1 = ρ2 = ρ3 = 1; per-tree minima 5, 6, 7]
Average the min-marginals of Va
Example 1
[Figure: the chains after averaging the min-marginals of Va; ρ1 = ρ2 = ρ3 = 1; per-tree minima 7, 6, 7]
Pick variable Vb. Reparameterize.
![Page 105: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/105.jpg)
Example 1
[Figure: the chains after reparameterizing at Vb; ρ1 = ρ2 = ρ3 = 1; per-tree minima 7, 6, 7]
Average the min-marginals of Vb
![Page 106: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/106.jpg)
Example 1
[Figure: the chains after averaging the min-marginals of Vb; ρ1 = ρ2 = ρ3 = 1; per-tree minima 6.5, 6.5, 7]
Value of dual does not increase
![Page 107: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/107.jpg)
Example 1
[Figure: the same chains; per-tree minima 6.5, 6.5, 7]
Maybe it will increase for Vc
NO
![Page 108: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/108.jpg)
Example 1
[Figure: the same chains at convergence]
Strong Tree Agreement
Exact MAP Estimate
f1(a) = 0, f1(b) = 0; f2(b) = 0, f2(c) = 0; f3(c) = 0, f3(a) = 0
![Page 109: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/109.jpg)
Example 2
[Figure: three two-label chains Va – Vb, Vb – Vc, Vc – Va (labels l0, l1) with their unary and pairwise potentials; tree weights ρ1 = ρ2 = ρ3 = 1; per-tree minima 4, 0, 4]
Pick variable Va. Reparameterize.
![Page 110: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/110.jpg)
Example 2
[Figure: the chains after reparameterizing at Va; ρ1 = ρ2 = ρ3 = 1; per-tree minima 4, 0, 4]
Average the min-marginals of Va
![Page 111: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/111.jpg)
Example 2
[Figure: the chains after averaging the min-marginals of Va; ρ1 = ρ2 = ρ3 = 1; per-tree minima 4, 0, 4]
Value of dual does not increase
![Page 112: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/112.jpg)
Example 2
[Figure: the same chains; per-tree minima 4, 0, 4]
Maybe it will increase for Vb or Vc
NO
![Page 113: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/113.jpg)
Example 2
[Figure: the same chains at convergence]
f1(a) = 1, f1(b) = 1; f2(b) = 1, f2(c) = 0; f3(c) = 1, f3(a) = 1
f2(b) = 0, f2(c) = 1
Weak Tree Agreement
Not Exact MAP Estimate
![Page 114: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/114.jpg)
Example 2
[Figure: the same chains at convergence]
f1(a) = 1, f1(b) = 1; f2(b) = 1, f2(c) = 0; f3(c) = 1, f3(a) = 1
f2(b) = 0, f2(c) = 1
Weak Tree Agreement
Convergence point of TRW
![Page 115: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/115.jpg)
Obtaining the Labeling
Only solves the dual. Primal solutions?
θ' = Σi ρi θi
[Figure: the 3×3 grid Va … Vi]
Fix the label of Va
![Page 116: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/116.jpg)
Obtaining the Labeling (Meltzer et al., 2006)
Only solves the dual. Primal solutions?
θ' = Σi ρi θi
[Figure: the 3×3 grid Va … Vi]
Fix the label of Vb
Continue in some fixed order
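The fixed-order rounding above can be sketched as a greedy pass over θ' = Σi ρi θi: fix each variable to its best label given the neighbours fixed so far. Variable names and potentials below are illustrative, assuming a two-label model:

```python
import numpy as np

def greedy_labeling(unary, pairwise, order):
    """Greedy primal rounding: fix variables one at a time in a fixed
    order, each conditioned on already-fixed neighbours."""
    labels = {}
    for v in order:
        cost = unary[v].copy()
        for (u, w), g in pairwise.items():
            if w == v and u in labels:      # fixed neighbour on the left
                cost += g[labels[u], :]
            elif u == v and w in labels:    # fixed neighbour on the right
                cost += g[:, labels[w]]
        labels[v] = int(np.argmin(cost))
    return labels

# Toy 3-variable model (illustrative numbers):
unary = {"a": np.array([0., 2.]), "b": np.array([1., 0.]), "c": np.array([0., 0.])}
pairwise = {("a", "b"): np.array([[0., 3.], [3., 0.]]),
            ("b", "c"): np.array([[0., 3.], [3., 0.]])}
print(greedy_labeling(unary, pairwise, ["a", "b", "c"]))
```

Here the strong smoothness terms pull b and c to agree with a even though their own unaries mildly prefer other labels.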
![Page 117: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/117.jpg)
Outline
• Problem Formulation
• Reparameterization
• Belief Propagation
• Tree-reweighted Message Passing
– Integer Programming Formulation
– Linear Programming Relaxation and its Dual
– Convergent Solution for Dual
– Computational Issues and Theoretical Properties
![Page 118: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/118.jpg)
Computational Issues of TRW
• Speed-ups for some pairwise potentials
Basic Component is Belief Propagation
Felzenszwalb & Huttenlocher, 2004
• Memory requirements cut down by half
Kolmogorov, 2006
• Further speed-ups using monotonic chains
Kolmogorov, 2006
![Page 119: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/119.jpg)
Theoretical Properties of TRW
• Always converges, unlike BP
Kolmogorov, 2006
• Strong tree agreement implies exact MAP
Wainwright et al., 2001
• Optimal MAP for two-label submodular problems
Kolmogorov and Wainwright, 2005
θab;00 + θab;11 ≤ θab;01 + θab;10
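The submodularity condition can be checked mechanically for any two-label pairwise term; a trivial sketch (function name mine):

```python
def is_submodular(theta_ab):
    """Check theta_ab;00 + theta_ab;11 <= theta_ab;01 + theta_ab;10
    for a two-label pairwise potential given as a 2x2 nested list."""
    return theta_ab[0][0] + theta_ab[1][1] <= theta_ab[0][1] + theta_ab[1][0]

print(is_submodular([[0, 1], [1, 0]]))  # Potts-like smoothness term: True
print(is_submodular([[2, 0], [0, 2]]))  # rewards disagreement: False
```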
![Page 120: ICCV2009: MAP Inference in Discrete Models: Part 3](https://reader034.vdocuments.us/reader034/viewer/2022051323/54859c815806b590588b4772/html5/thumbnails/120.jpg)
Summary
• Trees can be solved exactly - BP
• No guarantee of convergence otherwise - BP
• Strong Tree Agreement - TRW-S
• Submodular energies solved exactly - TRW-S
• TRW-S solves an LP relaxation of MAP estimation