Numerical Analysis – Linear Equations (II)
Hanyang University
Jong-Il Park
Singular Value Decomposition (SVD)
Why SVD?
- Gaussian elimination and LU decomposition fail to give satisfactory results for matrices that are singular or numerically near-singular.
- SVD can cope with over- or under-determined problems.
- SVD constructs orthonormal basis vectors.
What is SVD?
Any M×N matrix A whose number of rows M is greater than or equal to its number of columns N can be written as the product of an M×N column-orthogonal matrix U, an N×N diagonal matrix W with positive or zero elements (the singular values), and the transpose of an N×N orthogonal matrix V:

A = U W V^T
Properties of SVD
Orthonormality: U^T U = I (so U^{-1} = U^T when U is square); V^T V = I, so V^{-1} = V^T.
Existence and uniqueness: the decomposition can always be done, no matter how singular the matrix is, and it is almost unique (up to permuting the singular values and corresponding sign changes in the columns of U and V).
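The decomposition and its orthonormality properties can be checked numerically; here is a minimal sketch in Python/NumPy, with a made-up 4×3 matrix purely for illustration:

```python
import numpy as np

# Hypothetical 4x3 matrix (M >= N), values chosen only for illustration.
A = np.array([[1.0,  2.0,  3.0],
              [4.0,  5.0,  6.0],
              [7.0,  8.0,  9.0],
              [10.0, 11.0, 12.0]])

# full_matrices=False gives the "thin" SVD described above:
# U is MxN column-orthogonal, w holds the N singular values, Vt = V^T.
U, w, Vt = np.linalg.svd(A, full_matrices=False)

# A is recovered as U * diag(w) * V^T.
A_rec = U @ np.diag(w) @ Vt
assert np.allclose(A, A_rec)

# Orthonormality: U^T U = I and V^T V = I (so V^{-1} = V^T).
assert np.allclose(U.T @ U, np.eye(3))
assert np.allclose(Vt @ Vt.T, np.eye(3))
```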
SVD of a Square Matrix
The columns of U form an orthonormal set of basis vectors spanning the range of A.
The columns of V whose corresponding w_j's are zero form an orthonormal basis for the nullspace of A.
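These range and nullspace bases can be read directly off the factors; the sketch below uses a deliberately singular 3×3 matrix (values made up) to show that the columns of V with w_j ≈ 0 span the nullspace:

```python
import numpy as np

# Hypothetical singular 3x3 matrix: its second column is twice the first,
# so the rank is 2 and the nullspace is one-dimensional.
A = np.array([[1.0, 2.0,  3.0],
              [2.0, 4.0,  7.0],
              [3.0, 6.0, 10.0]])

U, w, Vt = np.linalg.svd(A)
tol = 1e-10 * w.max()

# Columns of V (rows of Vt) whose w_j is (numerically) zero span the nullspace.
null_basis = Vt[w < tol].T
assert null_basis.shape == (3, 1)
assert np.allclose(A @ null_basis, 0.0)

# Columns of U whose w_j is nonzero span the range of A.
range_basis = U[:, w >= tol]
```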
Review of basic concepts in linear algebra
Homogeneous equation
Homogeneous equations (b = 0) with a singular A:
any column of V whose corresponding w_j is zero yields a solution.
Nonhomogeneous equation
Nonhomogeneous equation with singular A:

x = V · diag(1/w_j) · (U^T · b)

where we replace 1/w_j by zero if w_j = 0.
※ The solution x obtained by this method is the solution vector of smallest length.
※ If b is not in the range of A, SVD finds the solution x in the least-squares sense, i.e. the x which minimizes the residual r = |Ax - b|.
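The zeroing of 1/w_j for vanishing singular values is easy to implement; a minimal sketch (the matrix and right-hand side are made up, with b chosen to lie in the range of the singular A):

```python
import numpy as np

def svd_solve(A, b, rcond=1e-12):
    """Minimum-norm / least-squares solve via SVD:
    x = V * diag(1/w_j) * U^T b, with 1/w_j replaced by 0 when w_j ~ 0."""
    U, w, Vt = np.linalg.svd(A, full_matrices=False)
    w_inv = np.zeros_like(w)
    keep = w > rcond * w.max()      # treat tiny singular values as zero
    w_inv[keep] = 1.0 / w[keep]
    return Vt.T @ (w_inv * (U.T @ b))

# Singular test matrix (second column = 2 * first column, rank 2).
A = np.array([[1.0, 2.0,  3.0],
              [2.0, 4.0,  7.0],
              [3.0, 6.0, 10.0]])
b = A @ np.array([1.0, 1.0, 1.0])   # b is in the range of A by construction

x = svd_solve(A, b)
assert np.allclose(A @ x, b)
# The SVD solution is no longer than any other solution of A x = b.
assert np.linalg.norm(x) <= np.linalg.norm([1.0, 1.0, 1.0])
```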
SVD solution - concept
SVD – under/over-determined problems
SVD for fewer equations than unknowns (underdetermined).
SVD for more equations than unknowns (overdetermined): SVD yields the least-squares solution; in general, the least-squares problem is non-singular.
Applications of SVD
Application 1: Constructing an orthonormal basis
Problem: given N vectors in an M-dimensional vector space, find an orthonormal basis for the subspace they span.
Solution: the columns of the matrix U are the desired orthonormal basis.
Application 2: Approximation of matrices
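For the matrix-approximation application, the standard recipe is to keep only the k largest singular values (the best rank-k approximation in the 2-norm, by the Eckart–Young theorem). A minimal sketch with a random test matrix:

```python
import numpy as np

def low_rank(A, k):
    """Best rank-k approximation of A: keep the k largest singular
    values and the corresponding columns of U and V."""
    U, w, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(w[:k]) @ Vt[:k]

A = np.random.default_rng(0).normal(size=(6, 4))
A1 = low_rank(A, 1)
assert np.linalg.matrix_rank(A1) == 1

# The 2-norm approximation error equals the first discarded singular value.
w = np.linalg.svd(A, compute_uv=False)
assert np.isclose(np.linalg.norm(A - A1, 2), w[1])
```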
Vector norm
l2 norm
l∞ norm
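The l2 norm is the square root of the sum of squared components; the l∞ norm is the largest absolute component. A quick numerical check (example vector made up):

```python
import numpy as np

x = np.array([3.0, -4.0, 1.0])

# l2 (Euclidean) norm: sqrt(sum of squares).
l2 = np.linalg.norm(x, 2)          # sqrt(9 + 16 + 1) = sqrt(26)
# l-infinity norm: largest absolute component.
linf = np.linalg.norm(x, np.inf)

assert np.isclose(l2, np.sqrt(26.0))
assert linf == 4.0
```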
Distance between vectors
Natural matrix norm
Eg.
l∞ norm of a matrix
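The natural (induced) l∞ norm of a matrix is the maximum absolute row sum, which is easy to verify numerically (example matrix made up):

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [3.0,  4.0]])

# Induced l-infinity norm = maximum absolute row sum.
row_sums = np.abs(A).sum(axis=1)   # row sums: 3 and 7
assert np.linalg.norm(A, np.inf) == row_sums.max() == 7.0
```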
Eigenvalues and eigenvectors
* To be discussed later in detail.
Spectral radius
Convergent matrix equivalences
Convergence of a sequence
There is an important connection between the eigenvalues of the iteration matrix T and the expectation that the iterative method will converge: the iteration x^(k+1) = T x^(k) + c converges for every initial guess if and only if the spectral radius ρ(T) < 1.
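The spectral radius is the largest eigenvalue magnitude, and ρ(T) < 1 is exactly the condition for powers of T to die out. A minimal sketch with a made-up 2×2 iteration matrix:

```python
import numpy as np

def spectral_radius(T):
    """Spectral radius: largest absolute eigenvalue."""
    return max(abs(np.linalg.eigvals(T)))

# Hypothetical iteration matrix with rho(T) < 1 (eigenvalues 0.6 and 0.3),
# so x_{k+1} = T x_k + c converges for any initial guess.
T = np.array([[0.5, 0.2],
              [0.1, 0.4]])
assert spectral_radius(T) < 1.0

# Powers of a convergent matrix tend to the zero matrix.
assert np.allclose(np.linalg.matrix_power(T, 100), 0.0)
```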
Iterative Methods - Jacobi Iteration
Jacobi Iteration
Initial guess: x^(0).
Iteration formula: x_i^(k+1) = (b_i - Σ_{j≠i} a_ij x_j^(k)) / a_ii.
Convergence condition: the Jacobi iteration converges, irrespective of the initial guess, if the matrix A is strictly diagonally dominant: |a_ii| > Σ_{j≠i} |a_ij| for every row i.
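A minimal Jacobi implementation following the formula above (the test system is made up, chosen strictly diagonally dominant so convergence is guaranteed):

```python
import numpy as np

def jacobi(A, b, x0=None, tol=1e-10, max_iter=500):
    """Jacobi iteration: every component of x is updated from the
    previous iterate only: x_i = (b_i - sum_{j!=i} a_ij x_j) / a_ii."""
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    D = np.diag(A)                    # diagonal of A
    R = A - np.diagflat(D)            # off-diagonal part
    for _ in range(max_iter):
        x_new = (b - R @ x) / D
        if np.linalg.norm(x_new - x, np.inf) < tol:
            return x_new
        x = x_new
    return x

# Strictly diagonally dominant system: |a_ii| > sum of other |a_ij|.
A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [1.0, 2.0, 6.0]])
b = np.array([6.0, 8.0, 9.0])
x = jacobi(A, b)
assert np.allclose(A @ x, b, atol=1e-8)
```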
Eg. Jacobi iteration
Gauss-Seidel Iteration
Idea: utilize the most recently updated components within each sweep.
Iteration formula: x_i^(k+1) = (b_i - Σ_{j<i} a_ij x_j^(k+1) - Σ_{j>i} a_ij x_j^(k)) / a_ii.
Convergence condition: the same as for the Jacobi iteration.
Advantage over Jacobi iteration: faster convergence.
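A minimal Gauss-Seidel sketch; the only change from Jacobi is that the inner loop reads components already updated in the current sweep (same made-up diagonally dominant test system as before):

```python
import numpy as np

def gauss_seidel(A, b, tol=1e-10, max_iter=500):
    """Gauss-Seidel: like Jacobi, but x[:i] holds values already
    updated in the current sweep when component i is computed."""
    n = len(b)
    x = np.zeros(n)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            break
    return x

A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [1.0, 2.0, 6.0]])
b = np.array([6.0, 8.0, 9.0])
assert np.allclose(A @ gauss_seidel(A, b), b, atol=1e-8)
```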
Eg. Gauss-Seidel iteration
Jacobi vs. Gauss-Seidel
Comparison: Eg. 1 vs. Eg. 2
Eg. 1: Jacobi
Eg. 2: Gauss-Seidel (faster convergence)
Variation of Gauss-Seidel Iteration
Successive Over-Relaxation (SOR): fast convergence; well suited for linear problems.
Successive Under-Relaxation (SUR): slower but more stable convergence; well suited for nonlinear problems.
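Both variants are the Gauss-Seidel update blended with the old value through a relaxation factor ω (ω > 1 over-relaxes, ω < 1 under-relaxes). A minimal sketch, reusing the made-up test system from above:

```python
import numpy as np

def sor(A, b, omega=1.2, tol=1e-10, max_iter=500):
    """Relaxed Gauss-Seidel: x_i <- (1-omega)*x_i + omega*(GS update).
    omega > 1 gives SOR, omega < 1 gives SUR, omega = 1 is plain GS."""
    n = len(b)
    x = np.zeros(n)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
            gs = (b[i] - s) / A[i, i]          # plain Gauss-Seidel value
            x[i] = (1.0 - omega) * x[i] + omega * gs
        if np.linalg.norm(x - x_old, np.inf) < tol:
            break
    return x

A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [1.0, 2.0, 6.0]])
b = np.array([6.0, 8.0, 9.0])
assert np.allclose(A @ sor(A, b, omega=1.1), b, atol=1e-8)
```

For symmetric positive-definite systems like this one, SOR converges for any 0 < ω < 2; the best ω is problem-dependent.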
Eg. Gauss-Seidel vs. SOR
(cont.)
Faster convergence
Iterative Improvement of a Solution
Let x be the exact solution of A x = b, and x + δx the contaminated (computed) solution, which produces the wrong product b + δb.
Algorithm derivation: A(x + δx) = b + δb; since A x = b, we get A δx = δb = A(x + δx) - b.
Solve this equation for the improvement δx and subtract it from the current solution.
Repeat this procedure until convergence.
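The loop above can be sketched as follows (test system and perturbation made up; in a real implementation, as noted on the next slide, one would reuse an LU factorization of A and compute the residual in higher precision):

```python
import numpy as np

def iterative_improve(A, x, b, n_iter=3):
    """Iterative improvement: solve A*dx = A x - b for the error dx
    in the current solution x, then subtract it off."""
    for _ in range(n_iter):
        db = A @ x - b                # residual of the contaminated solution
        dx = np.linalg.solve(A, db)   # sketch: re-solve; NR reuses the LU factors
        x = x - dx
    return x

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 4.0])
x_exact = np.linalg.solve(A, b)
x_bad = x_exact + 1e-3                # hypothetical contaminated solution
x_ref = iterative_improve(A, x_bad, b)
assert np.allclose(x_ref, x_exact)
```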
Numerical Aspect of Iterative Improvement
Using LU decomposition: once we have the decomposition of A, a single substitution pass with the residual as the right-hand side gives the incremental correction δx cheaply.
(Refer to mprove() on p. 56 of Numerical Recipes in C.)
Measure of ill-conditioning: let A' be the normalized matrix; if A'^{-1} is large, the system is ill-conditioned.
Read Box 10.1.
Application: Circuit analysis
Kirchhoff's current and voltage laws
Current rule: 4 nodes. Voltage rule: 2 meshes.
6 unknowns, 6 equations.
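Once Kirchhoff's laws have been written out, the circuit reduces to one linear solve. As a small hedged illustration (the original slide's circuit is not reproduced here; the resistor and source values below are invented), mesh analysis of a two-mesh resistor network gives a 2×2 system in the mesh currents:

```python
import numpy as np

# Hypothetical two-mesh circuit: a 10 V source in mesh 1, resistors
# R1 (mesh 1 only), R2 (mesh 2 only), and R3 shared between the meshes.
# Kirchhoff's voltage law around each mesh gives one linear equation.
R1, R2, R3, V = 2.0, 3.0, 4.0, 10.0

A = np.array([[R1 + R3, -R3     ],
              [-R3,      R2 + R3]])
b = np.array([V, 0.0])

i1, i2 = np.linalg.solve(A, b)      # mesh currents in amperes
# KVL must hold for each mesh.
assert np.allclose(A @ np.array([i1, i2]), b)
```

The full circuit on the slide works the same way, just with a 6×6 system from 4 node equations and 2 mesh equations.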