Source: qiao/courses/cs4xo3/slides/eig.pdf (lecture-slide transcript)
Eigenproblem SVD Software Packages
Eigenproblem and SVD
Sanzheng Qiao
Department of Computing and Software, McMaster University
January, 2009
Outline

1. Eigenvalue Problem: Sensitivity; Computing Eigenvalues; Two Orthogonal Transformations; QR Decomposition; Tridiagonalization; Symmetric QR Method
2. Singular Value Decomposition
3. Software Packages
Introduction

Eigenvalue problem:
    Ax = λx,   yᵀA = λyᵀ
λ: eigenvalue; x, y: right and left eigenvectors, respectively.

Different methods apply depending on:
- Is A real or complex?
- Is A dense, or large and sparse?
- Is A structured, e.g., symmetric?
- Are all of the eigenvalues needed, or just some extreme ones?
- Are the eigenvectors needed as well?
Special case

A: real and symmetric, Aᵀ = A.
Spectral radius: ρ(A) = max |λᵢ|.
Properties:
- All eigenvalues are real (not necessarily distinct).
- Given eigenpairs (λ1, x1), ..., (λn, xn), the xᵢ can be chosen orthogonal (hence linearly independent): xᵢᵀxⱼ = 0 for i ≠ j.
- A is diagonalizable by an orthogonal transformation: XᵀAX = diag(λ1, ..., λn), X orthogonal.
Sensitivity of eigenvalue

Suppose
    Ax = λx,   yᵀA = λyᵀ,
where ‖x‖₂ = ‖y‖₂ = 1. That is, x and y are respectively the normalized right and left eigenvectors corresponding to the eigenvalue λ. Let λ̂ be an eigenvalue of the perturbed problem:
    (A + E)x̂ = λ̂x̂.
What is the change in the eigenvalue in terms of the perturbation E?
Sensitivity of eigenvalue (cont.)

Change in eigenvalue:
    |λ − λ̂| ≤ ‖E‖F / |yᵀx|,
so the absolute condition number is |yᵀx|⁻¹.
Note that this is an upper bound on the absolute error. Thus small eigenvalues may have large relative errors, even when the absolute errors are small.
When A is symmetric, x = y, so the condition number is one (well conditioned). More precisely, with the eigenvalues of A and A + E sorted in the same order,
    Σᵢ |λᵢ − λ̂ᵢ|² ≤ ‖E‖F².
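The symmetric bound above (the Wielandt-Hoffman theorem) can be checked numerically. A minimal sketch, assuming NumPy is available; the random matrices are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random symmetric A and a small symmetric perturbation E
A = rng.standard_normal((5, 5)); A = (A + A.T) / 2
E = 1e-3 * rng.standard_normal((5, 5)); E = (E + E.T) / 2

lam = np.sort(np.linalg.eigvalsh(A))          # eigenvalues of A, ascending
lam_hat = np.sort(np.linalg.eigvalsh(A + E))  # eigenvalues of A + E, ascending

lhs = np.sum((lam - lam_hat) ** 2)       # sum of squared eigenvalue changes
rhs = np.linalg.norm(E, 'fro') ** 2      # squared Frobenius norm of E
print(lhs <= rhs)
```

The inequality holds for every symmetric perturbation, not just small ones, which is what makes the symmetric eigenproblem well conditioned.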
Sensitivity of Eigenvector

The sensitivity of an eigenvector xᵢ depends on
    gap(λᵢ) := min_{j≠i} |λᵢ − λⱼ|,
the distance between λᵢ and its closest neighbor. The smaller the gap, the more sensitive the eigenvector xᵢ is to perturbations in A.
- The eigenvectors corresponding to clustered eigenvalues can be sensitive to perturbations in A.
- Eigenvectors corresponding to different eigenvalues can have different condition numbers.
Example

    A = diag(0.999, 1.001, 2),   E = [ 0    .01  .01
                                       .01  0    .01
                                       .01  .01  0  ].

The two eigenvalues λ1 and λ2 of A are close. The perturbed matrix A + E is symmetric; its eigenvector matrix is

    Q̂ = [ −.7418   .6706  .0101
           .6708   .7417  .0101
           .0007  −.0143  .9999 ].

Note that the eigenvector matrix of A is the identity matrix.
Example (cont.)

Compare the eigenvectors xᵢ of A with those of A + E:
    |sin(x1, x̂1)| = |sin(x2, x̂2)| = 0.67,
however,
    dist(span(x1, x2), span(x̂1, x̂2)) = 0.01,
i.e., the space spanned by x1 and x2 is close to that spanned by x̂1 and x̂2. Note that
    x3 = [0, 0, 1]ᵀ   and   x̂3 = [.0101, .0101, .9999]ᵀ
are close, since λ3 is well separated from the other two.
Characteristic polynomial method

Find the roots of the characteristic polynomial
    det(A − λI).
Problems:
- Computing the coefficients is substantial work when n is large.
- The coefficients can be highly sensitive to perturbations in the matrix.
- The roots can be highly sensitive to perturbations in the coefficients.
This method is practical only for small n (e.g., n = 2).
Companion matrix

In fact, one good way of finding the roots of a polynomial
    p(λ) = c0 + c1λ + · · · + c_{n−1}λⁿ⁻¹ + λⁿ
is to compute the eigenvalues of the companion matrix

    Cn = [ 0 0 · · · 0  −c0
           1 0 · · · 0  −c1
           0 1 · · · 0  −c2
           ·  ·  · · ·  ·
           0 0 · · · 1  −c_{n−1} ]

using the methods discussed in the following.
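A small sketch of this idea, assuming NumPy (the `companion` helper is illustrative, not from the slides). For p(λ) = (λ − 1)(λ − 2) = 2 − 3λ + λ², the companion matrix has eigenvalues exactly at the roots:

```python
import numpy as np

def companion(c):
    """Companion matrix of p(x) = c[0] + c[1]x + ... + c[n-1]x^(n-1) + x^n."""
    n = len(c)
    C = np.zeros((n, n))
    C[1:, :-1] = np.eye(n - 1)   # ones on the subdiagonal
    C[:, -1] = -np.asarray(c)    # last column holds -c_0, ..., -c_{n-1}
    return C

# p(x) = x^2 - 3x + 2 = (x - 1)(x - 2): c0 = 2, c1 = -3
C = companion([2.0, -3.0])
roots = np.sort(np.linalg.eigvals(C).real)
print(roots)  # ≈ [1, 2]
```

This is exactly how `numpy.roots` works internally: root finding is reduced to an eigenvalue computation, not the other way around.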
Power method

An arbitrary vector u can be expressed as
    u = µ1x1 + µ2x2 + · · · + µnxn.
If µ1 ≠ 0 and |λ1| > |λ2| ≥ · · · ≥ |λn|, then Aᵏu has almost the same direction as x1 when (λ2/λ1)ᵏ is small. Thus the Rayleigh quotient gives
    λ1 ≈ (Aᵏu)ᵀA(Aᵏu) / ((Aᵏu)ᵀ(Aᵏu)).
Problems:
- Computes only (λ1, x1).
- Converges slowly when |λ1| ≈ |λ2|.
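A minimal power-iteration sketch, assuming NumPy (`power_method` and the test matrix are illustrative). Normalizing at each step keeps Aᵏu from overflowing; the Rayleigh quotient then estimates λ1:

```python
import numpy as np

def power_method(A, iters=200, seed=0):
    """Largest-magnitude eigenpair of A by power iteration (a sketch)."""
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(A.shape[0])  # random start: mu_1 != 0 almost surely
    for _ in range(iters):
        u = A @ u
        u /= np.linalg.norm(u)           # rescale to avoid overflow/underflow
    lam = u @ A @ u / (u @ u)            # Rayleigh quotient
    return lam, u

# Symmetric test matrix with eigenvalues 1, 2, 4
A = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 2.]])
lam1, x1 = power_method(A)
print(lam1)  # ≈ 4, the dominant eigenvalue
```

With ratio λ2/λ1 = 1/2, the error shrinks by half per step, illustrating why the method stalls when |λ1| ≈ |λ2|.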
Inverse power method

Idea: Suppose that µ is an estimate of λk. Then the λᵢ − µ are the eigenvalues of A − µI, the (λᵢ − µ)⁻¹ are the eigenvalues of (A − µI)⁻¹, and (λk − µ)⁻¹ is the dominant eigenvalue of (A − µI)⁻¹ when µ is closest to λk. Applying the power method to (A − µI)⁻¹, we can compute xk and λk.
Example

Eigenvalues of A: −1, −0.2, 0.5, 1.5.
Shift µ: 0.8, an estimate of 0.5.
Eigenvalues of (A − µI)⁻¹: −3.3, −1, −0.6, 1.4.
[figure: the eigenvalues of A and of (A − µI)⁻¹ marked on the real line]
Check the convergence ratio λ1/λ2: in magnitude it is 3.3/1.4 ≈ 2.4 for (A − µI)⁻¹, much better than 1.5/1 for A itself.
Very effective when we have a good estimate of an eigenvalue.
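A sketch of the inverse power method, assuming NumPy (the diagonal test matrix mirrors the example's eigenvalues). Each step solves a shifted linear system rather than forming (A − µI)⁻¹ explicitly:

```python
import numpy as np

def inverse_power(A, mu, iters=50):
    """Eigenpair of A nearest the shift mu, via the inverse power method (a sketch)."""
    n = A.shape[0]
    q = np.ones(n) / np.sqrt(n)
    M = A - mu * np.eye(n)
    for _ in range(iters):
        q = np.linalg.solve(M, q)    # apply (A - mu I)^{-1} by solving a system
        q /= np.linalg.norm(q)
    lam = q @ A @ q                  # Rayleigh quotient (q has unit length)
    return lam, q

# Diagonal matrix with the example's eigenvalues; shift 0.8 targets 0.5
A = np.diag([-1.0, -0.2, 0.5, 1.5])
lam, q = inverse_power(A, mu=0.8)
print(lam)  # ≈ 0.5, the eigenvalue closest to the shift
```

In practice one factorizes A − µI once (e.g., an LU factorization) and reuses the factors in every iteration.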
Givens rotation

Rotate a vector clockwise by an angle θ:
    G = [  c  s
          −s  c ],   c = cos θ,  s = sin θ.
A plane rotation can introduce a zero into a vector by rotating it onto the x-axis:
    G [x1; x2] = [×; 0],   c = x1/√(x1² + x2²),   s = x2/√(x1² + x2²).
Algorithm: Givens rotation

    [ c  s ; −s  c ] [x1; x2] = [×; 0].

    if x(2) == 0
        c = 1.0; s = 0.0;
    elseif abs(x(2)) >= abs(x(1))
        ct = x(1)/x(2);
        s = 1/sqrt(1 + ct*ct); c = s*ct;
    else
        t = x(2)/x(1);
        c = 1/sqrt(1 + t*t); s = c*t;
    end

Working with the tangent t (or cotangent ct) of the smaller-over-larger ratio avoids overflow in forming x1² + x2².
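The same algorithm transcribed into Python (standard library only; `givens` is an illustrative name):

```python
import math

def givens(x1, x2):
    """Return (c, s) with [[c, s], [-s, c]] @ [x1, x2]^T = [r, 0]^T (the slide's algorithm)."""
    if x2 == 0.0:
        return 1.0, 0.0
    if abs(x2) >= abs(x1):
        ct = x1 / x2                        # cotangent of the rotation angle
        s = 1.0 / math.sqrt(1.0 + ct * ct)
        c = s * ct
    else:
        t = x2 / x1                         # tangent of the rotation angle
        c = 1.0 / math.sqrt(1.0 + t * t)
        s = c * t
    return c, s

c, s = givens(3.0, 4.0)
r = c * 3.0 + s * 4.0        # first component: the rotated length
zero = -s * 3.0 + c * 4.0    # second component is annihilated
print(r, zero)  # ≈ 5.0 and 0.0
```

For (3, 4) this gives c = 0.6, s = 0.8, confirming that G rotates the vector onto the x-axis with its length 5 preserved.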
Householder reflection

Reflect a vector across the hyperplane perpendicular to u:
    H = I − 2uuᵀ,   ‖u‖₂ = 1.
Note: H = Hᵀ = H⁻¹.
Introducing zeros: reflect a vector onto the x1-axis,
    Hx = [α 0 ... 0]ᵀ = αe1.
Since ‖Hx‖₂ = ‖x‖₂, we have |α| = ‖x‖₂.
The calculation of u

    u1 = x1 ± ‖x‖₂;
    u_{2:n} = x_{2:n};
    normalize u;

Question: What is the geometric interpretation of u? What is u − x?
Algorithm: Householder reflection

    (I − σ⁻¹uuᵀ)x = −αe1

    alpha = sign(x(1))*norm(x);
    u(1) = x(1) + alpha;
    u(2:n) = x(2:n);
    sigma = alpha*u(1);

Question: Why is sigma equal to ‖u‖₂²/2, so that I − σ⁻¹uuᵀ is a genuine reflection?
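A numerical check of this recipe, assuming NumPy (`householder` is an illustrative helper). With x = (3, 4, 0)ᵀ: α = 5, u = (8, 4, 0)ᵀ, σ = 40 = ‖u‖₂²/2, and (I − σ⁻¹uuᵀ)x = (−5, 0, 0)ᵀ:

```python
import numpy as np

def householder(x):
    """u, sigma, alpha with (I - uu^T/sigma) x = -alpha*e1, following the slide's recipe."""
    x = np.asarray(x, dtype=float)
    alpha = np.copysign(np.linalg.norm(x), x[0])  # sign(x1)*||x||_2 avoids cancellation in u1
    u = x.copy()
    u[0] = x[0] + alpha
    sigma = alpha * u[0]                          # equals ||u||_2^2 / 2
    return u, sigma, alpha

x = np.array([3.0, 4.0, 0.0])
u, sigma, alpha = householder(x)
Hx = x - u * (u @ x) / sigma   # apply (I - uu^T/sigma) x without forming H
print(Hx)  # ≈ [-5, 0, 0]
```

Note that H is never formed explicitly: applying it to a vector costs only a dot product and a vector update.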
Using Givens rotations

Triangularization using one-sided orthogonal transformations. Each rotation Gij combines rows i and j to annihilate one subdiagonal entry (⊗); entries are eliminated column by column, in the order (2,1), (3,1), (4,1), (3,2), (4,2), (4,3):

    × × × ×        × × × ×        × × × ×        × × × ×
    ⊗ × × ×  G12   0 × × ×  G13   0 × × ×   ...  0 × × ×
    × × × ×  −→    ⊗ × × ×  −→    0 × × ×   −→   0 0 × ×
    × × × ×        × × × ×        ⊗ × × ×        0 0 0 ×
Using Householder reflections

Each reflection Hk annihilates all the entries (⊗) below the diagonal of column k at once:

    × × × ×       × × × ×       × × × ×       × × × ×
    ⊗ × × ×  H1   0 × × ×  H2   0 × × ×  H3   0 × × ×
    ⊗ × × ×  −→   0 ⊗ × ×  −→   0 0 × ×  −→   0 0 × ×
    ⊗ × × ×       0 ⊗ × ×       0 0 ⊗ ×       0 0 0 ×
Using Givens rotations

Tridiagonalize a symmetric matrix by applying two-sided symmetric orthogonal transformations: each Gij acts on rows i and j and, by symmetry, on columns i and j:

    × × ⊗ ×        × × 0 ⊗        × × 0 0        × × 0 0
    × × × ×  G23   × × × ×  G24   × × × ⊗  G34   × × × 0
    ⊗ × × ×  −→    0 × × ×  −→    0 × × ×  −→    0 × × ×
    × × × ×        ⊗ × × ×        0 ⊗ × ×        0 0 × ×
Using Householder transformations

    × × ⊗ ⊗       × × 0 0       × × 0 0
    × × × ×  H1   × × × ⊗  H2   × × × 0
    ⊗ × × ×  −→   0 × × ×  −→   0 × × ×
    ⊗ × × ×       0 ⊗ × ×       0 0 × ×
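The two-sided Householder reduction can be sketched as follows, assuming NumPy (`tridiagonalize` is an illustrative name). Applied to the 4-by-4 matrix of the example a few slides ahead, it reproduces β1 = −5.3852 = −√29 and preserves the eigenvalues:

```python
import numpy as np

def tridiagonalize(A):
    """Reduce a symmetric matrix to tridiagonal T = Q^T A Q by Householder reflections (a sketch)."""
    T = np.array(A, dtype=float)
    n = T.shape[0]
    for k in range(n - 2):
        x = T[k + 1:, k]                          # entries below the subdiagonal in column k
        alpha = np.copysign(np.linalg.norm(x), x[0])
        u = x.copy()
        u[0] += alpha
        sigma = alpha * u[0]                      # ||u||^2 / 2
        if sigma == 0.0:                          # column already reduced
            continue
        H = np.eye(n - k - 1) - np.outer(u, u) / sigma
        T[k + 1:, k:] = H @ T[k + 1:, k:]         # apply from the left...
        T[:, k + 1:] = T[:, k + 1:] @ H           # ...and from the right (H is symmetric)
    return T

A = np.array([[1., 2., 3., 4.],
              [2., 1., 2., 3.],
              [3., 2., 1., 2.],
              [4., 3., 2., 1.]])
T = tridiagonalize(A)
print(np.round(T, 4))  # tridiagonal, with T[1,0] = beta1 ≈ -5.3852
```

The similarity transformation leaves the eigenvalues unchanged, so all later QR iterations can work on the cheap tridiagonal T instead of the full A.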
Basic idea

Generate a sequence
    A0 = A, A1, ..., A_{k+1},

    A_{k+1} = QkᵀAQk = [ B   s
                         sᵀ  µ ],

where s is small and Qk is orthogonal, i.e., Qkᵀ = Qk⁻¹ (so A_{k+1} and A have the same eigenvalues).
- Since s is small, µ is an approximation of an eigenvalue of A_{k+1};
- since A_{k+1} is similar to A, µ is an approximation of an eigenvalue of A;
- deflate A_{k+1} and repeat the procedure on B. The size is reduced by one.
Structure of Qk

What does Qk look like? If the last column of Qk is an eigenvector x of A, Qk = [Pk  x], then

    QkᵀAQk = [ Pkᵀ ; xᵀ ] A [Pk  x]
           = [ Pkᵀ ; xᵀ ] [APk  λx]
           = [ B    0
               0ᵀ   λ ],

using Pkᵀx = 0 and xᵀAPk = λxᵀPk = 0ᵀ (A symmetric).
So: construct Qk so that its last column is an approximation of an eigenvector of A.
Constructing Qk

How do we get an approximation q of an eigenvector x of A (Ax = λx)? One step of the inverse power method: solve for q in (A − µI)q = en, where µ is an estimate of an eigenvalue of A. (How to get µ? Later.)
How do we construct an orthogonal Q whose last column is q?
Constructing Qk (cont.)

If A − µI = QR is the QR decomposition and q is the last column of Q, then
    qᵀ(A − µI) = qᵀQR = R_nn enᵀ.
Since A is symmetric, transposing gives, after normalizing,
    (A − µI)q = en.
Method: get an approximation µ of an eigenvalue; compute the QR decomposition A − µI = QR.
Generating Ai

But we want
- similarity transformations of A, not of A − µI;
- to carry on and improve accuracy.
Idea:
    A − µI = QR;
    RQ = Qᵀ(A − µI)Q = QᵀAQ − µI;
    RQ + µI = QᵀAQ is similar to A.
QR Algorithm (one step)

    repeat
        choose a shift mu;
        QR decomposition: A - mu*I = QR;
        A = RQ + mu*I;
    until converged (A(n,1:n-1) small)

Problem: if A is full, this is very expensive (one QR decomposition per iteration).
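A sketch of the full iteration with deflation, assuming NumPy (`qr_algorithm` is illustrative; it uses the simple shift µ = A(n,n) rather than the Wilkinson shift discussed later):

```python
import numpy as np

def qr_algorithm(A, tol=1e-12, max_iter=500):
    """Eigenvalues of a symmetric matrix by the shifted QR iteration with deflation (a sketch)."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    eigs = []
    for _ in range(max_iter):
        if n == 1:
            break
        mu = A[n - 1, n - 1]                    # simple shift
        Q, R = np.linalg.qr(A[:n, :n] - mu * np.eye(n))
        A[:n, :n] = R @ Q + mu * np.eye(n)      # similar to the previous iterate
        if abs(A[n - 1, n - 2]) < tol:          # last row decoupled: deflate
            eigs.append(A[n - 1, n - 1])
            n -= 1
    eigs.extend(np.diag(A)[:n])                 # remaining 1-by-1 block on convergence
    return np.sort(np.array(eigs))

# Symmetric test matrix with eigenvalues 1, 2, 4
A = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 2.]])
print(qr_algorithm(A))  # ≈ [1, 2, 4]
```

This naive version recomputes a full QR decomposition every step, which is exactly the cost problem the tridiagonal preprocessing below removes.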
Improving efficiency

Solution: preprocess A, reducing it to tridiagonal QᵀAQ = T using Householder transformations or Givens rotations (first step shown):

    × × × ×       × × 0 0
    × × × ×  −→   × × × ×
    × × × ×       0 × × ×
    × × × ×       0 × × ×
Improving efficiency (cont.)

Can we maintain the tridiagonal structure?
- T0 is tridiagonal;
- T0 − µI is tridiagonal;
- T0 − µI = QR, where R is upper triangular and Q is upper Hessenberg (one subdiagonal);
- T1 = RQ + µI is upper Hessenberg;
- since T1 = QᵀT0Q is symmetric, T1 is tridiagonal!
Implicit QR method

Improve efficiency even more: compute T_{k+1} from T_k without explicitly computing the QR decomposition and the product RQ.
Find a rotation G such that
    G [α1 − µ ; β1] = [× ; 0].
Note that this is the first rotation in the QR decomposition of T − µI.
Implicit QR method (cont.)

We know that the new T is QᵀTQ. Apply G to both sides of T:

    α1 β1 0  0         × × × 0
    β1 α2 β2 0    −→   × × × 0
    0  β2 α3 β3        × × α3 β3
    0  0  β3 α4        0 0 β3 α4

We also know that QᵀTQ is tridiagonal. Restore the tridiagonal structure by chasing the bulge (⊗) down the matrix with further rotations:

    × × ⊗ 0         × × 0 0
    × × × 0    −→   × × × ⊗
    ⊗ × α3 β3       0 × × ×
    0 0 β3 α4       0 ⊗ × α4

This Q, a product of Givens rotations, is essentially the same as the Q in the QR decomposition, since Q is determined by its first column.
Choosing the shift

Take the eigenvalue of the trailing 2-by-2 submatrix

    [ α_{n−1}  β_{n−1}
      β_{n−1}  α_n ]

that is closer to α_n (the Wilkinson shift).
As the iteration continues, some βᵢ, say β_{n−1}, becomes small; then α_n is a good approximation of an eigenvalue.
Why the 2-by-2 submatrix instead of α_n itself? Heuristically, it is more effective, especially in the beginning.
Example

    A = [ 1 2 3 4
          2 1 2 3
          3 2 1 2
          4 3 2 1 ]

After tridiagonalization:

    [  1.0000  −5.3852   0        0
      −5.3852   5.1379  −1.9952   0
       0       −1.9952  −1.3745   0.2895
       0        0        0.2895  −0.7634 ]
Example (cont.)

    k    µ          β1        β2         β3
    0               −5.3852   −1.9952     0.2895
    1    −0.6480     3.8161    0.2222    −0.0494
    2    −0.5859     1.2271    0.0385     10⁻⁵
    3    −0.5858     0.3615    0.0070    (converged)
    4    −1.0990     0.0821    10⁻¹⁰
    5    −1.0990     0.0186   (converged)

Once a trailing βᵢ converges to zero, the corresponding eigenvalue is deflated and the iteration continues on the leading block.
Outline

1. Eigenvalue Problem: Sensitivity; Computing Eigenvalues; Two Orthogonal Transformations; QR Decomposition; Tridiagonalization; Symmetric QR Method
2. Singular Value Decomposition
3. Software Packages
Definition

    A = UΣVᵀ

A: m-by-n real matrix (m ≥ n)
U: m-by-m orthogonal
V: n-by-n orthogonal
Σ: m-by-n diagonal, diag(σᵢ), with σ1 ≥ · · · ≥ σr > σ_{r+1} = · · · = σn = 0

Singular values: the σᵢ.
Left singular vectors: the columns of U.
Right singular vectors: the columns of V.
Another form:
    Avk = σkuk,   k = 1, ..., n,
that is, uk is the normalized Avk (when σk > 0).
Properties

- rank(A) = r.
- u1, ..., ur form an orthonormal basis for the column space of A.
- v1, ..., vr form an orthonormal basis for the row space of A.
- u_{r+1}, ..., um form an orthonormal basis for the null space of Aᵀ.
- v_{r+1}, ..., vn form an orthonormal basis for the null space of A.
- If σn > 0 (A is of full rank), cond(A) = σ1/σn.
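These properties can be checked with NumPy on the 5-by-3 matrix of the example that follows (a sketch; note that `np.linalg.svd` returns σ3 as a roundoff-level number rather than an exact zero):

```python
import numpy as np

A = np.array([[1.,  6., 11.],
              [2.,  7., 12.],
              [3.,  8., 13.],
              [4.,  9., 14.],
              [5., 10., 15.]])
U, s, Vt = np.linalg.svd(A)
print(np.round(s, 3))             # singular values ≈ 35.127, 2.465, 0
print(np.linalg.matrix_rank(A))   # r = 2: A is rank deficient, so sigma1/sigma_n is not finite
```

Since σ3 = 0, A is not of full rank and cond(A) = σ1/σn is infinite; the ratio σ1/σ2 conditions only the rank-2 part of A.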
Example

    A = [ 1  6 11
          2  7 12
          3  8 13
          4  9 14
          5 10 15 ]

    U = [ 0.355 −0.689  0.541  0.193  0.265
          0.399 −0.376 −0.802 −0.113  0.210
          0.443 −0.062  0.160 −0.587 −0.656
          0.487  0.251 −0.079  0.742 −0.378
          0.531  0.564  0.180 −0.235  0.559 ]
Example (cont.)

    Σ = [ 35.127  0      0
           0      2.465  0
           0      0      0
           0      0      0
           0      0      0 ]

    V = [ 0.202  0.890  0.408
          0.517  0.257 −0.816
          0.832 −0.376  0.408 ]
A compact form

    U = [ 0.355 −0.689
          0.399 −0.376
          0.443 −0.062
          0.487  0.251
          0.531  0.564 ]

    Σ = [ 35.127  0
           0      2.465 ]

    V = [ 0.202  0.890
          0.517  0.257
          0.832 −0.376 ]
Geometric interpretation

Transformation A: x → Ax. For any x ≠ 0,
    σ1 ≥ ‖Ax‖₂/‖x‖₂ ≥ σn.
Application: Linear least-squares problem

    min_x ‖Ax − b‖₂²

(also called the linear regression problem in statistics). With the SVD A = UΣVᵀ,
    ‖Ax − b‖₂² = ‖Σz − d‖₂²,
where d = Uᵀb and z = Vᵀx.
Application: LS problem (cont.)

Solution:
    zj = dj/σj    if σj ≠ 0,
    zj = anything if σj = 0.
Usually we set zj = 0 when σj = 0, which gives the minimum-norm solution.
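A sketch of this recipe, assuming NumPy (`svd_lstsq` is an illustrative helper), applied to the example that follows; it agrees with the minimum-norm solution returned by `np.linalg.lstsq`:

```python
import numpy as np

def svd_lstsq(A, b, tol=1e-10):
    """Minimum-norm least-squares solution via the SVD: z_j = d_j/sigma_j, z_j = 0 when sigma_j = 0."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    d = U.T @ b
    z = np.zeros_like(s)
    nz = s > tol                 # treat tiny singular values as zero
    z[nz] = d[nz] / s[nz]
    return Vt.T @ z

A = np.array([[1.,  6., 11.],
              [2.,  7., 12.],
              [3.,  8., 13.],
              [4.,  9., 14.],
              [5., 10., 15.]])
b = np.array([4., 5., 5., 5., 5.])
x = svd_lstsq(A, b)
print(np.round(x, 3))  # ≈ [-0.253, 0.067, 0.387], as on the example slide
```

The tolerance `tol` makes the σj = 0 rule robust in floating point, where "zero" singular values come back as roundoff-sized numbers.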
Example

b = [4 5 5 5 5]ᵀ. The system Σz ≈ d reads
    35.127 z1 ≈ 10.716
     2.465 z2 ≈ −0.872
     0 · z3 ≈ −0.541
     0 ≈ −0.193
     0 ≈ −0.265
Setting z3 = 0, we get
    x = [−0.253, 0.067, 0.387]ᵀ,   Ax = [4.406, 4.607, 4.808, 5.009, 5.210]ᵀ.
Computing the SVD

Relation with the eigenvalue decomposition:
    AᵀAvk = σk²vk   and   AAᵀuk = σk²uk.
One could compute the eigenvalue decomposition of AᵀA to obtain V and the σᵢ². In practice, instead:
1. Bidiagonalize A using Householder transformations (A → B upper bidiagonal, so that BᵀB is tridiagonal);
2. Compute the eigenvalue decomposition of BᵀB using the QR method without explicitly multiplying Bᵀ and B, keeping B upper bidiagonal and computing both U and V.
Outline

1. Eigenvalue Problem: Sensitivity; Computing Eigenvalues; Two Orthogonal Transformations; QR Decomposition; Tridiagonalization; Symmetric QR Method
2. Singular Value Decomposition
3. Software Packages
Software packages
EISPACK rg, rs, svd
IMSL evcrg, evcsf, lsvrr
LAPACK sgeev, ssyev, sgesvd
LINPACK ssvdc
MATLAB eig, svd
NAG f02agf, f02abf, f02wef
Octave eig, svd
Summary
- Problem setting: the symmetric eigenvalue problem
- Power method: finding the largest eigenvalue and the corresponding eigenvector
- Inverse power method: finding an eigenvalue given an estimate of it
- QR method: finding eigenvalues using the QR factorization; preprocessing (tridiagonalization), the implicit method, and choosing the shift
- Singular value decomposition: definition and applications