Non-Linear Least Squares and Sparse Matrix Techniques:
Fundamentals
Richard Szeliski, Microsoft Research
UW-MSR Course on Vision Algorithms
CSE/EE 577, 590CV, Spring 2004
4/30/2004 NLS and Sparse Matrix Techniques 2
Readings
• Press et al., Numerical Recipes, Chapter 15 (Modeling of Data)
• Nocedal and Wright, Numerical Optimization, Chapter 10 (Nonlinear Least-Squares Problems, pp. 250-273)
• Shewchuk, J. R., An Introduction to the Conjugate Gradient Method Without the Agonizing Pain
• Bathe and Wilson, Numerical Methods in Finite Element Analysis, pp. 695-717 (sec. 8.1-8.2) and pp. 979-987 (sec. 12.2)
• Golub and Van Loan, Matrix Computations, Chapters 4, 5, 10
• Nocedal and Wright, Numerical Optimization, Chapters 4 and 5
• Triggs et al., Bundle Adjustment – A Modern Synthesis, Workshop on Vision Algorithms, 1999
Outline
Nonlinear Least Squares
• simple application (motivation)
• linear (approx.) solution and least squares
• normal equations and pseudo-inverse
• LDLT, QR, and SVD decompositions
• correct linearization and Jacobians
• iterative solution, Levenberg-Marquardt
• robust measurements
Outline
Sparse matrix techniques
• simple application (structure from motion)
• sparse matrix storage (skyline)
• direct solution: LDLT with minimal fill-in
• larger application (surface/image fitting)
• iterative solution: gradient descent
• conjugate gradient
• preconditioning
Non-linear Least Squares
Triangulation – a simple example
Problem: Given some image points {(uj, vj)} in correspondence across two or more images (taken from calibrated cameras cj), compute the 3D location X.
Image formation equations
u = f Xc / Zc, v = f Yc / Zc
for a point (Xc, Yc, Zc) in the camera's coordinate frame.
Simplified model
Let R = I (known rotation), f = 1, Y = vj = 0 (flatland). Each camera at (xj, zj) then measures
uj = (X - xj) / (Z - zj)
How do we solve this set of equations (constraints) to find the best (X, Z)?
“Linearized” model
Bring the denominator over to the LHS:
uj (Z - zj) = X - xj, or X - uj Z = xj - uj zj
(Measures the horizontal distance to each line.) How do we solve this set of equations (constraints)?
Linear regression
Overconstrained set of linear equations:
X - uj Z = xj - uj zj, one equation per camera, or J x = r
where Jj0 = 1, Jj1 = -uj is the Jacobian and rj = xj - uj zj is the residual.
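As a concrete sketch, the stacked system J x = r can be built and solved with NumPy; the camera positions and the 3D point below are made-up illustrative values:

```python
import numpy as np

# Hypothetical flatland setup: point at (X, Z) = (1, 1), cameras at (xj, zj)
cam_x = np.array([0.0, 2.0, -1.0])
cam_z = np.array([-1.0, -1.0, -2.0])
X_true, Z_true = 1.0, 1.0
u = (X_true - cam_x) / (Z_true - cam_z)       # noise-free measurements uj

# One row per camera: Jj = [1, -uj], rj = xj - uj zj
J = np.column_stack([np.ones_like(u), -u])
r = cam_x - u * cam_z
x_hat = np.linalg.lstsq(J, r, rcond=None)[0]  # least-squares estimate of (X, Z)
```

With noise-free measurements the linear system is exactly consistent, so the estimate recovers (1, 1).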
Normal Equations
How do we solve J x = r? Least squares: arg minx ║Jx - r║2

E = ║Jx - r║2 = (Jx - r)T(Jx - r) = xTJTJx - 2xTJTr + rTr

∂E/∂x = 2(JTJ)x - 2JTr = 0
(JTJ)x = JTr: the normal equations A x = b (A is the Hessian)
x = [(JTJ)-1JT]r: the pseudoinverse solution
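A minimal numeric check that the normal equations and the pseudoinverse give the same answer; J, r, and the seed are arbitrary illustrative values:

```python
import numpy as np

rng = np.random.default_rng(0)
J = rng.normal(size=(8, 2))       # overconstrained system, m > n
r = rng.normal(size=8)

# Normal equations: (J^T J) x = J^T r
A = J.T @ J                       # the Hessian of the quadratic cost
b = J.T @ r
x = np.linalg.solve(A, b)

# Same answer via the pseudoinverse x = (J^T J)^{-1} J^T r
x_pinv = np.linalg.pinv(J) @ r
assert np.allclose(x, x_pinv)
```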
LDLT factorization
Factor A = LDLT, where L is lower triangular with 1s on the diagonal and D is diagonal.

How? L is formed from the columns of Gaussian elimination.

Then solve by forward and backward elimination/substitution:
LDLTx = b, DLTx = L-1b, LTx = D-1L-1b, x = L-TD-1L-1b
LDLT factorization – details
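A dense sketch of the factorization; the `ldlt` helper is illustrative (not a library routine), and `np.linalg.solve` stands in for the triangular forward/back substitutions:

```python
import numpy as np

def ldlt(A):
    """Factor symmetric A = L D L^T: L unit lower triangular, D diagonal."""
    n = A.shape[0]
    L = np.eye(n)
    d = np.zeros(n)
    for j in range(n):                      # columns of Gaussian elimination
        d[j] = A[j, j] - (L[j, :j] ** 2) @ d[:j]
        for i in range(j + 1, n):
            L[i, j] = (A[i, j] - (L[i, :j] * L[j, :j]) @ d[:j]) / d[j]
    return L, d

A = np.array([[4.0, 2.0], [2.0, 3.0]])      # small S.P.D. example
L, d = ldlt(A)
assert np.allclose(L @ np.diag(d) @ L.T, A)

# Solve A x = b: forward substitution, diagonal scale, back substitution
b = np.array([1.0, 2.0])
y = np.linalg.solve(L, b)                   # y = L^{-1} b
x = np.linalg.solve(L.T, y / d)             # x = L^{-T} D^{-1} y
assert np.allclose(A @ x, b)
```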
LDLT and Cholesky
Variant: Cholesky: A = GGT, where G = LD1/2 (involves scalar square roots).
Advantage: more stable than plain Gaussian elimination.
Disadvantage: less stable than QR, since forming JTJ squares the condition number.
Complexity: (m + n/3) n2 flops.
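The relation G = LD1/2 can be checked numerically; the 2x2 matrix is an arbitrary S.P.D. example:

```python
import numpy as np

A = np.array([[4.0, 2.0], [2.0, 3.0]])  # S.P.D. example
G = np.linalg.cholesky(A)               # A = G G^T, G lower triangular
assert np.allclose(G @ G.T, A)

# G = L D^{1/2}: the diagonal of G holds the square roots of D
d = np.diag(G) ** 2                     # recover D
L = G / np.sqrt(d)                      # recover unit-diagonal L (column scaling)
assert np.allclose(L @ np.diag(d) @ L.T, A)
```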
QR decomposition
Alternative solution for J x = r: find an orthogonal matrix Q s.t. J = QR, where R is upper triangular. Then
Q R x = r, so R x = QTr: solve for x using back substitution.
Q is usually computed using Householder matrices, Q = Q1…Qm, with Qj = I - βvjvjT.
Advantage: better sensitivity / condition number (works on J directly).
Complexity: 2n2(m - n/3) flops.
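A sketch of the QR route using NumPy's reduced QR; J and r are random illustrative values:

```python
import numpy as np

rng = np.random.default_rng(1)
J = rng.normal(size=(8, 3))              # overconstrained, m > n
r = rng.normal(size=8)

Q, R = np.linalg.qr(J)                   # reduced QR: J = Q R
x = np.linalg.solve(R, Q.T @ r)          # solve upper-triangular R x = Q^T r
assert np.allclose(J.T @ (J @ x - r), 0) # residual is orthogonal to columns of J
```

The final assertion is exactly the normal equations (JTJ)x = JTr, reached here without ever forming JTJ.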
SVD
Most stable way to solve the system J x = r.
J = U Σ VT, where U and V are orthogonal and Σ is diagonal (the singular values).
Advantage: most stable (handles very ill-conditioned problems).
Disadvantage: slowest (iterative computation).
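A sketch of an SVD-based solve; `svd_solve` is a hypothetical helper, and the `rcond` threshold for dropping tiny singular values is an assumed value:

```python
import numpy as np

def svd_solve(J, r, rcond=1e-12):
    """Least-squares solve via J = U diag(s) V^T, dropping tiny singular values."""
    U, s, Vt = np.linalg.svd(J, full_matrices=False)
    s_inv = np.where(s > rcond * s[0], 1.0 / s, 0.0)  # guard against ill-conditioning
    return Vt.T @ (s_inv * (U.T @ r))

rng = np.random.default_rng(2)
J = rng.normal(size=(10, 3))
r = rng.normal(size=10)
assert np.allclose(svd_solve(J, r), np.linalg.lstsq(J, r, rcond=None)[0])
```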
“Linearized” model – revisited
Does the “linearized” model
X - uj Z = xj - uj zj,
which measures the horizontal distance to each line, give the optimal estimate?
No!
Properly weighted model
We want to minimize errors in the measured quantities, i.e. the image measurements uj.
Closer cameras (smaller denominators) have more weight / influence.
Weight each “linearized” equation by the current denominator?
Optimal estimation
Feature measurement equations:
uj = (X - xj) / (Z - zj) + nj, with nj ~ N(0, σ2)
Likelihood of (X, Z) given {uj, xj, zj}:
L = ∏j p(uj | X, Z) ∝ ∏j exp(-(uj - ûj)2 / (2σ2)), where ûj = (X - xj) / (Z - zj)
Non-linear least squares
Negative log likelihood of (X, Z) given {uj, xj, zj}:
E = Σj (uj - ûj)2 / (2σ2)
How do we minimize E? This is non-linear regression (least squares), because the predictions ûj are non-linear functions of the unknowns (X, Z).
Levenberg-Marquardt
Iterative non-linear least squares:
• Linearize the measurement equations: ûj(X + Δx, Z + Δz) ≈ ûj + (∂ûj/∂X) Δx + (∂ûj/∂Z) Δz
• Substitute into the log-likelihood equation: quadratic cost function in (Δx, Δz)
Levenberg-Marquardt
Linear regression (sub-)problem:
J Δp = r, with rj = uj - ûj
and Jacobian entries ∂ûj/∂X = 1/(Z - zj), ∂ûj/∂Z = -(X - xj)/(Z - zj)2.
Similar to weighted regression, but not quite the same.
Levenberg-Marquardt

What if it doesn’t converge?
• Multiply the diagonal by (1 + λ), increasing λ until it does
• Halve the step size (my favorite)
• Use line search
• Other trust region methods [Nocedal & Wright]
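The iteration can be sketched for the flatland triangulation problem; `lm_triangulate`, the camera layout, the initial guess, and the damping schedule (halve λ on success, multiply by 10 on failure) are all illustrative choices, with the diagonal multiplied by (1 + λ) as above:

```python
import numpy as np

def lm_triangulate(u, cx, cz, p0, n_iter=50, lam=1e-3):
    """Levenberg-Marquardt for the flatland triangulation sketch.
    u: measurements, (cx, cz): camera centres, p0: initial (X, Z) guess."""
    p = np.asarray(p0, dtype=float)
    res = lambda p: u - (p[0] - cx) / (p[1] - cz)     # rj = uj - ûj
    err = np.sum(res(p) ** 2)
    for _ in range(n_iter):
        X, Z = p
        d = Z - cz
        # Jacobian of the predicted uj w.r.t. (X, Z)
        J = np.column_stack([1.0 / d, -(X - cx) / d ** 2])
        A = J.T @ J
        A += lam * np.diag(np.diag(A))                # multiply diagonal by (1 + lam)
        dp = np.linalg.solve(A, J.T @ res(p))
        new_err = np.sum(res(p + dp) ** 2)
        if new_err < err:
            p, err, lam = p + dp, new_err, lam * 0.5  # accept step, relax damping
        else:
            lam *= 10.0                               # reject step, increase damping
    return p

# Cameras at (0,-1), (2,-1), (-1,-2) observing the point (1, 1)
cx = np.array([0.0, 2.0, -1.0]); cz = np.array([-1.0, -1.0, -2.0])
u = (1.0 - cx) / (1.0 - cz)
p = lm_triangulate(u, cx, cz, p0=(0.5, 0.5))
```

With exact measurements the accepted steps behave like Gauss-Newton and converge to the true point.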
Levenberg-Marquardt

Other issues:
• Uncertainty analysis: covariance Σ = A-1
• Is maximum likelihood the best idea?
• How do we start in the vicinity of the global minimum?
• What about outliers?
Robust regression
Data often have outliers (bad measurements)
• Use a robust penalty applied to each set of joint measurements [Black & Rangarajan, IJCV’96]
• For extremely bad data, use random sampling [RANSAC, Fischler & Bolles, CACM’81]
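One common way to apply such a robust penalty is iteratively reweighted least squares; the sketch below uses a Huber penalty with an assumed threshold `delta` and made-up line-fit data:

```python
import numpy as np

def irls_huber(J, r, delta=1.0, n_iter=30):
    """Robust fit of J x ≈ r via iteratively reweighted least squares
    with a Huber penalty (illustrative sketch only)."""
    x = np.linalg.lstsq(J, r, rcond=None)[0]      # ordinary least-squares start
    for _ in range(n_iter):
        e = np.maximum(np.abs(J @ x - r), 1e-12)  # residual magnitudes
        w = np.where(e <= delta, 1.0, delta / e)  # Huber weights downweight outliers
        x = np.linalg.solve(J.T @ (w[:, None] * J), J.T @ (w * r))
    return x

# Line fit y = a t + b with one gross outlier (made-up data)
t = np.arange(10.0)
y = 2.0 * t + 1.0
y[5] += 50.0                                      # outlier
J = np.column_stack([t, np.ones_like(t)])
a, b = irls_huber(J, y)
```

The outlier receives a tiny weight, so the recovered slope and intercept stay close to (2, 1) where ordinary least squares would be pulled away.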
Sparse Matrix Techniques
Direct methods
Structure from motion
Given many points in correspondence across several images, {(uij, vij)}, simultaneously compute the 3D locations Xi and the camera (or motion) parameters (K, Rj, tj).
Two main variants: calibrated and uncalibrated (sometimes associated with Euclidean and projective reconstructions).
Bundle Adjustment
Simultaneous adjustment of bundles of rays (photogrammetry).

What makes this non-linear minimization hard?
• many more parameters: potentially slow
• poorer conditioning (high correlation)
• potentially lots of outliers
• gauge (coordinate) freedom
Simplified model
Again, R = I (known rotation), f = 1, Y = vij = 0 (flatland):
uij = (Xi - xj) / (Zi - zj)
This time, we have to solve for all of the parameters {(Xi, Zi), (xj, zj)}.
Lots of parameters: sparsity
Only a few entries in the Jacobian are non-zero: each measurement uij depends only on point i and camera j.
Sparse LDLT / Cholesky
First used in finite element analysis [Bathe & Wilson]. Applied to SfM by [Szeliski & Kang 1994].
Ordering the structure parameters before the motion parameters confines the fill-in to the motion block.
Skyline storage [Bathe & Wilson]
Sparse matrices – common shapes

Banded (tridiagonal), arrowhead, multi-banded (each with its own fill-in pattern).
Computational complexity: O(n b2) for bandwidth b.
Applications in computer vision:
• snakes (tri-diagonal)
• surface interpolation (multi-banded)
• deformable models (sparse)
Sparse matrices – variable reordering
See Triggs et al., Bundle Adjustment – A Modern Synthesis.
Sparse Matrix Techniques
Iterative methods
Two-dimensional problems
Surface interpolation and Poisson blending
Poisson blending
→ multi-banded (sparse) system
One-dimensional example
Simplified 1-D height/slope interpolation leads to a tri-diagonal system (generalized snakes).
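A sketch of the O(n) tridiagonal solve (the classic Thomas algorithm, the 1-D special case of sparse elimination); the grid size, data weight, and height constraints are made-up values:

```python
import numpy as np

def thomas(lower, diag, upper, b):
    """O(n) solve of a tridiagonal system by elimination + back substitution."""
    n = len(b)
    c = upper.astype(float)
    m = diag.astype(float)
    d = b.astype(float)
    for i in range(1, n):                    # forward elimination
        f = lower[i - 1] / m[i - 1]
        m[i] -= f * c[i - 1]
        d[i] -= f * d[i - 1]
    x = np.zeros(n)
    x[-1] = d[-1] / m[-1]
    for i in range(n - 2, -1, -1):           # back substitution
        x[i] = (d[i] - c[i] * x[i + 1]) / m[i]
    return x

# 1-D interpolation: smoothness sum (f_{i+1} - f_i)^2 plus weighted data terms
n, w = 7, 100.0                              # grid size and data weight (assumed)
main = np.full(n, 2.0); main[0] = main[-1] = 1.0   # 1-D Laplacian diagonal
off = np.full(n - 1, -1.0)
b = np.zeros(n)
for i, h in [(0, 0.0), (6, 3.0)]:            # two height constraints
    main[i] += w; b[i] += w * h
f = thomas(off, main, off, b)
```

With only two endpoint constraints the smoothness term makes the solution a near-linear ramp between them.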
Direct solution of 2D problems
Multi-banded Hessian (with fill-in inside the band).
Computational complexity for an n x m image: O(nm m2)
… too slow!
Iterative techniques
Gauss-Seidel and Jacobi
Gradient descent
Conjugate gradient
Non-linear conjugate gradient
Preconditioning
… see Shewchuk’s TR
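The plain conjugate gradient loop from Shewchuk's tutorial can be sketched in a few lines; the 2x2 system is an arbitrary S.P.D. example:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10):
    """Plain CG for S.P.D. A, following Shewchuk's tutorial (sketch)."""
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x                        # residual (steepest-descent direction)
    d = r.copy()                         # first search direction
    delta = r @ r
    for _ in range(len(b)):              # converges in <= n exact-arithmetic steps
        q = A @ d
        alpha = delta / (d @ q)          # exact line search along d
        x += alpha * d
        r -= alpha * q
        delta, delta_old = r @ r, delta
        if delta < tol ** 2:
            break
        d = r + (delta / delta_old) * d  # A-conjugate direction update
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # small S.P.D. test system
b = np.array([5.0, 5.0])
x = conjugate_gradient(A, b)
```

Only matrix-vector products with A are needed, which is why CG pairs so well with sparse multi-banded systems.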
Conjugate gradient
… see Shewchuk’s TR for the rest of the notes …
Iterative vs. direct
Direct methods are better for 1D problems and relatively sparse general structures
• SfM where #points >> #frames
Iterative methods are better for 2D problems
• More amenable to parallel (GPU?) implementation
• Preconditioning helps a lot (next lecture)
Monday’s lecture (Applications)
Preconditioning
• Hierarchical basis functions (wavelets)
• 2D applications: interpolation, shape-from-shading, HDR, Poisson blending, others (rotoscoping?)
Monday’s lecture (Applications)
Structure from motion
• Alternative parameterizations (object-centered)
• Conditioning and linearization problems
• Ambiguities and uncertainties
• New research: map correlation