Linear Algebra. Session 9 (MATH 304, Texas A&M University)
Abstract Linear Algebra I / Singular Value Decomposition (SVD)
Linear Algebra. Session 9
Dr. Marco A Roque Sol
10/25/2018
Dr. Marco A Roque Sol Linear Algebra. Session 9
Complex Eigenvalues / Repeated Eigenvalues / Diagonalization
Complex Eigenvalues
In this section we consider again a system of n linear homogeneous equations with constant coefficients

X′ = AX

where the coefficient matrix A is real-valued. If we seek solutions of the form x = v e^(λt), then it follows that λ must be an eigenvalue and v a corresponding eigenvector of the coefficient matrix A.

In the case where λ is complex, the eigenvalues and eigenvectors always appear in complex-conjugate pairs. Thus, if we have that

λ±_k = μ ± iν;   v±_k = a ± ib
are two complex-conjugate eigenvalues and eigenvectors of the matrix A, then

X±(t) = e^((μ ± iν)t) (a ± ib)

are complex-valued solutions. Taking into account that

e^((μ ± iν)t) = e^(μt) (cos(νt) ± i sin(νt))

and the principle of superposition, we have that

X1(t) = (1/2) (X+(t) + X−(t)),   X2(t) = (1/(2i)) (X+(t) − X−(t))
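The Euler identity above, and the fact that the half-sum and the half-difference-over-2i of a conjugate pair are real, can be checked numerically. A minimal sketch; the values of μ, ν, t are arbitrary illustrations, not taken from the lecture:

```python
import cmath
import math

# Arbitrary illustrative values (not from the lecture)
mu, nu, t = 0.5, 2.0, 1.3

# Left-hand side: e^{(mu + i nu) t}
lhs = cmath.exp((mu + 1j * nu) * t)

# Right-hand side: e^{mu t} (cos(nu t) + i sin(nu t))
rhs = math.exp(mu * t) * (math.cos(nu * t) + 1j * math.sin(nu * t))

assert abs(lhs - rhs) < 1e-12  # Euler's formula holds

# The conjugate exponential, and the two combinations from the slide:
x_plus = lhs
x_minus = lhs.conjugate()            # e^{(mu - i nu) t}
x1 = (x_plus + x_minus) / 2          # = Re(x_plus), real
x2 = (x_plus - x_minus) / (2 * 1j)   # = Im(x_plus), real
assert abs(x1.imag) < 1e-12 and abs(x2.imag) < 1e-12
```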
are two (real) solutions!

X1(t) = e^(μt) (a cos(νt) − b sin(νt))

X2(t) = e^(μt) (a sin(νt) + b cos(νt))
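These two real solutions can be sanity-checked numerically: build them from a computed complex eigenpair and verify that each satisfies X′ = AX. A sketch using NumPy and a hypothetical 2×2 matrix with eigenvalues 1 ± i (not the lecture's example):

```python
import numpy as np

# Hypothetical system (not from the lecture): eigenvalues are 1 +/- i
A = np.array([[1.0, -1.0],
              [1.0,  1.0]])

lam, V = np.linalg.eig(A)
k = int(np.argmax(lam.imag))      # pick the eigenvalue with positive imaginary part
mu, nu = lam[k].real, lam[k].imag # lambda = mu + i nu
a, b = V[:, k].real, V[:, k].imag # v = a + i b

def X1(t):
    # real solution from the real part of e^{lambda t} v
    return np.exp(mu * t) * (a * np.cos(nu * t) - b * np.sin(nu * t))

def X2(t):
    # real solution from the imaginary part of e^{lambda t} v
    return np.exp(mu * t) * (a * np.sin(nu * t) + b * np.cos(nu * t))

# Verify X' = A X at a sample time via a centered finite difference
t, h = 0.7, 1e-6
for X in (X1, X2):
    dX = (X(t + h) - X(t - h)) / (2 * h)
    assert np.allclose(dX, A @ X(t), atol=1e-5)
```

The check works for any eigenvector normalization, since real and imaginary parts of any complex solution of a real system are themselves solutions.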
Example 9.6

Solve the following ODE

x′ = Ax =
[ 3    1    1 ]
[ 0    2    1 ] x
[ 0   −1    2 ]

Solution

Let's find the eigenvalues of the matrix A.
|A − λI| =
| 3−λ    1      1   |
|  0    2−λ     1   | = 0
|  0    −1     2−λ  |

Expanding along the first column:

(3−λ) | 2−λ    1   |
      | −1    2−λ  | = (3−λ)(λ² − 4λ + 5) = 0 ⟹
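The roots of the characteristic polynomial can be cross-checked numerically; a minimal sketch using NumPy:

```python
import numpy as np

# The coefficient matrix from Example 9.6
A = np.array([[3.0,  1.0, 1.0],
              [0.0,  2.0, 1.0],
              [0.0, -1.0, 2.0]])

# Eigenvalues should be 3 and 2 +/- i, the roots of (3-l)(l^2 - 4l + 5).
lams = np.sort_complex(np.linalg.eigvals(A))
expected = np.sort_complex(np.array([3.0, 2.0 + 1.0j, 2.0 - 1.0j]))
assert np.allclose(lams, expected)
```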
λ1 = 3,   λ2,3 = (4 ± √(16 − (4)(5)))/2 = 2 ± i

If λ1 = 3, then

(A − λ1I)v =
[ 3−λ    1      1   ] [v1]
[  0    2−λ     1   ] [v2] =
[  0    −1     2−λ  ] [v3]

[ 0    1    1 ] [v1]
[ 0   −1    1 ] [v2] =
[ 0   −1   −1 ] [v3]

[ 0   1   1 ] [v1]   [0]
[ 0   0   2 ] [v2] = [0]
[ 0   0   0 ] [v3]   [0]
and a corresponding eigenvector is

v(1) = [1, 0, 0]ᵀ

If λ2 = 2 + i, then

(A − λ2I)v =
[ 3−λ    1      1   ] [v1]
[  0    2−λ     1   ] [v2] =
[  0    −1     2−λ  ] [v3]
[ 3−(2+i)     1          1      ] [v1]
[   0      2−(2+i)       1      ] [v2] =
[   0        −1       2−(2+i)   ] [v3]

[ 1−i    1    1  ] [v1]
[  0    −i    1  ] [v2] =
[  0    −1   −i  ] [v3]

[ 1−i    1    1 ] [v1]
[  0    −i    1 ] [v2] =
[  0     0    0 ] [v3]

[ 1−i    0   1−i ] [v1]   [0]
[  0    −i    1  ] [v2] = [0]
[  0     0    0  ] [v3]   [0]
and a corresponding eigenvector is

v(2) = [1, 0, −1]ᵀ + i [0, 1, 0]ᵀ

The corresponding solutions of the differential equation are

x(1) = [1, 0, 0]ᵀ e^{3t};   x(2) = e^{2t}( [1, 0, −1]ᵀ cos(t) − [0, 1, 0]ᵀ sin(t) )

x(3) = e^{2t}( [1, 0, −1]ᵀ sin(t) + [0, 1, 0]ᵀ cos(t) )
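Both eigenpairs can be checked directly by matrix-vector multiplication; a numerical sketch:

```python
import numpy as np

# The coefficient matrix from Example 9.6
A = np.array([[3.0,  1.0, 1.0],
              [0.0,  2.0, 1.0],
              [0.0, -1.0, 2.0]])

v1 = np.array([1.0, 0.0, 0.0])       # eigenvector for lambda_1 = 3
v2 = np.array([1.0, 1.0j, -1.0])     # (1, 0, -1) + i(0, 1, 0) for lambda_2 = 2 + i

assert np.allclose(A @ v1, 3.0 * v1)
assert np.allclose(A @ v2, (2.0 + 1.0j) * v2)
```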
The Wronskian of these solutions is

W[x(1), x(2), x(3)](t) =
| e^{3t}    e^{2t} cos(t)     e^{2t} sin(t)  |
|   0      −e^{2t} sin(t)     e^{2t} cos(t)  | =
|   0      −e^{2t} cos(t)    −e^{2t} sin(t)  |

e^{3t} e^{2t} e^{2t}
| 1    cos(t)    sin(t) |
| 0   −sin(t)    cos(t) | = e^{3t} e^{2t} e^{2t} (sin²(t) + cos²(t)) = e^{7t} ≠ 0
| 0   −cos(t)   −sin(t) |
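The identity W(t) = e^{7t} can be spot-checked numerically at a few sample times; a minimal sketch:

```python
import numpy as np

def wronskian(t):
    # Columns are x(1), x(2), x(3) evaluated at time t.
    M = np.array([
        [np.exp(3*t),  np.exp(2*t)*np.cos(t),  np.exp(2*t)*np.sin(t)],
        [0.0,         -np.exp(2*t)*np.sin(t),  np.exp(2*t)*np.cos(t)],
        [0.0,         -np.exp(2*t)*np.cos(t), -np.exp(2*t)*np.sin(t)],
    ])
    return np.linalg.det(M)

# The determinant should equal e^{7t}, which never vanishes.
for t in (0.0, 0.5, 1.3):
    assert np.isclose(wronskian(t), np.exp(7*t))
```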
Hence, the solutions x(1), x(2) and x(3) form a fundamental set, and the general solution of the system is

X = c1 x(1) + c2 x(2) + c3 x(3) ⟹

X = c1 [1, 0, 0]ᵀ e^{3t} + c2 ( [1, 0, −1]ᵀ e^{2t} cos(t) − [0, 1, 0]ᵀ e^{2t} sin(t) )
  + c3 ( [1, 0, −1]ᵀ e^{2t} sin(t) + [0, 1, 0]ᵀ e^{2t} cos(t) )
X =
[x1]   [ c1 e^{3t} + e^{2t}(c2 cos(t) + c3 sin(t)) ]
[x2] = [ e^{2t}(−c2 sin(t) + c3 cos(t))            ]
[x3]   [ −e^{2t}(c2 cos(t) + c3 sin(t))            ]
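For arbitrary constants, the component form can be verified against the system numerically; a sketch with randomly chosen c1, c2, c3:

```python
import numpy as np

A = np.array([[3.0,  1.0, 1.0],
              [0.0,  2.0, 1.0],
              [0.0, -1.0, 2.0]])
c1, c2, c3 = 0.4, -1.1, 2.0   # arbitrary constants

def X(t):
    # The general solution in component form
    e2, e3 = np.exp(2*t), np.exp(3*t)
    return np.array([
        c1*e3 + e2*(c2*np.cos(t) + c3*np.sin(t)),
        e2*(-c2*np.sin(t) + c3*np.cos(t)),
        -e2*(c2*np.cos(t) + c3*np.sin(t)),
    ])

# Centered finite difference: X'(t) should equal A X(t).
t, h = 0.6, 1e-6
deriv = (X(t + h) - X(t - h)) / (2 * h)
assert np.allclose(deriv, A @ X(t), rtol=1e-5, atol=1e-6)
```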
Here is the direction field associated with the system

[x1′]   [ 3    1    1 ] [x1]
[x2′] = [ 0    2    1 ] [x2]
[x3′]   [ 0   −1    2 ] [x3]
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Complex Eigenvalues
Example 9.7

Solve the following ODE

X' = AX = \begin{pmatrix} -1/2 & 1 \\ -1 & -1/2 \end{pmatrix} X

Solution

Let's find the eigenvalues of the matrix A:

|A - \lambda I| = \begin{vmatrix} -1/2-\lambda & 1 \\ -1 & -1/2-\lambda \end{vmatrix} = 0

(-1/2-\lambda)^2 + 1 = \lambda^2 + \lambda + \frac{5}{4} = 0
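The characteristic polynomial above can be double-checked numerically. This is a minimal sketch (not part of the original slides) using NumPy; the matrix A is the one from Example 9.7:

```python
import numpy as np

# Matrix A from Example 9.7 (entries -1/2 written as -0.5)
A = np.array([[-0.5,  1.0],
              [-1.0, -0.5]])

# numpy.linalg.eigvals returns the eigenvalues of A; here they are the
# complex-conjugate pair -1/2 + i and -1/2 - i
evs = sorted(np.linalg.eigvals(A), key=lambda z: z.imag)
assert abs(evs[0] - (-0.5 - 1j)) < 1e-12
assert abs(evs[1] - (-0.5 + 1j)) < 1e-12

# Each eigenvalue is a root of lambda^2 + lambda + 5/4
for lam in evs:
    assert abs(lam**2 + lam + 1.25) < 1e-12
```

The assertions confirm that both eigenvalues satisfy \lambda^2 + \lambda + 5/4 = 0, matching the hand computation.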
\lambda_1 = -\frac{1}{2} + i, \quad \lambda_2 = -\frac{1}{2} - i

If \lambda_1 = -\frac{1}{2} + i, then

(A - \lambda_1 I)\,v = \begin{pmatrix} -1/2-(-1/2+i) & 1 \\ -1 & -1/2-(-1/2+i) \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \end{pmatrix} = \begin{pmatrix} -i & 1 \\ -1 & -i \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}
and a corresponding eigenvector is

v^{(1)} = \begin{pmatrix} 1 \\ i \end{pmatrix}

If \lambda_2 = -\frac{1}{2} - i, then

(A - \lambda_2 I)\,v = \begin{pmatrix} -1/2-(-1/2-i) & 1 \\ -1 & -1/2-(-1/2-i) \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \end{pmatrix} =
\begin{pmatrix} i & 1 \\ -1 & i \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}

and a corresponding eigenvector is

v^{(2)} = \begin{pmatrix} 1 \\ -i \end{pmatrix}

The corresponding solutions of the differential equation are

x^{(1)} = \begin{pmatrix} 1 \\ i \end{pmatrix} e^{(-1/2+i)t}, \quad x^{(2)} = \begin{pmatrix} 1 \\ -i \end{pmatrix} e^{(-1/2-i)t}
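The two eigenpairs can also be verified directly: each should satisfy A v = \lambda v, which is exactly why v e^{\lambda t} solves the system. A minimal NumPy sketch (illustrative, not from the original slides):

```python
import numpy as np

A = np.array([[-0.5,  1.0],
              [-1.0, -0.5]], dtype=complex)

# Eigenpairs computed by hand above: v(1) = (1, i), v(2) = (1, -i)
lam1, v1 = -0.5 + 1j, np.array([1, 1j])
lam2, v2 = -0.5 - 1j, np.array([1, -1j])

# A v = lambda v holds for both pairs, so x(t) = v e^{lambda t} solves x' = A x
assert np.allclose(A @ v1, lam1 * v1)
assert np.allclose(A @ v2, lam2 * v2)
```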
To obtain a set of real-valued solutions, we can take the real and imaginary parts of either x^{(1)} or x^{(2)}. In fact,

x^{(1)} = \begin{pmatrix} 1 \\ i \end{pmatrix} e^{(-1/2+i)t} = \begin{pmatrix} 1 \\ i \end{pmatrix} e^{-t/2}(\cos t + i\,\sin t) = \begin{pmatrix} e^{-t/2}\cos t \\ -e^{-t/2}\sin t \end{pmatrix} + i \begin{pmatrix} e^{-t/2}\sin t \\ e^{-t/2}\cos t \end{pmatrix}

Hence, a set of real-valued solutions is

u(t) = e^{-t/2} \begin{pmatrix} \cos t \\ -\sin t \end{pmatrix}, \quad v(t) = e^{-t/2} \begin{pmatrix} \sin t \\ \cos t \end{pmatrix}
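One can check by hand (or numerically) that u and v really do satisfy x' = Ax. The sketch below (illustrative, not from the original slides) writes out u'(t) and v'(t) via the product rule and compares them with A u(t) and A v(t) at several sample times:

```python
import numpy as np

A = np.array([[-0.5,  1.0],
              [-1.0, -0.5]])

def u(t):
    # u(t) = e^{-t/2} (cos t, -sin t)
    return np.exp(-t/2) * np.array([np.cos(t), -np.sin(t)])

def v(t):
    # v(t) = e^{-t/2} (sin t, cos t)
    return np.exp(-t/2) * np.array([np.sin(t), np.cos(t)])

def du(t):
    # derivative of u, by the product rule
    return np.exp(-t/2) * np.array([-0.5*np.cos(t) - np.sin(t),
                                     0.5*np.sin(t) - np.cos(t)])

def dv(t):
    # derivative of v, by the product rule
    return np.exp(-t/2) * np.array([-0.5*np.sin(t) + np.cos(t),
                                    -0.5*np.cos(t) - np.sin(t)])

# u and v satisfy x' = A x at every sample point
for t in np.linspace(0.0, 5.0, 11):
    assert np.allclose(du(t), A @ u(t))
    assert np.allclose(dv(t), A @ v(t))
```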
The Wronskian of these two real-valued solutions is

W[u, v](t) = \begin{vmatrix} e^{-t/2}\cos t & e^{-t/2}\sin t \\ -e^{-t/2}\sin t & e^{-t/2}\cos t \end{vmatrix} = e^{-t/2}\,e^{-t/2} \begin{vmatrix} \cos t & \sin t \\ -\sin t & \cos t \end{vmatrix} = e^{-t} \neq 0

Hence, the solutions u(t), v(t) form a fundamental set,
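The Wronskian computation can be confirmed numerically: the determinant of the fundamental matrix whose columns are u(t) and v(t) should equal e^{-t} for every t, and in particular never vanish. A short illustrative check (not from the original slides):

```python
import numpy as np

def Psi(t):
    # Fundamental matrix with columns u(t) and v(t)
    return np.exp(-t/2) * np.array([[ np.cos(t), np.sin(t)],
                                    [-np.sin(t), np.cos(t)]])

# W(t) = det Psi(t) = e^{-t} (cos^2 t + sin^2 t) = e^{-t}, nonzero for all t
for t in np.linspace(0.0, 4.0, 9):
    W = np.linalg.det(Psi(t))
    assert abs(W - np.exp(-t)) < 1e-12
```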
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Complex Eigenvalues
The Wronskian
of these two real-valued solutions is
W [x(1), x(2)](t) =
∣∣∣∣ e−t/2cos(t) e−t/2sin(t)
−e−t/2sin(t) e−t/2cos(t)
∣∣∣∣ =
e−t/2e−t/2∣∣∣∣ cos(t) sin(t)−sin(t) cos(t)
∣∣∣∣ = e−t 6= 0
Hence, the solutions x(1), x(2) form a fundamental set,
and the general solution of the system is

$$\mathbf{X} = c_1\mathbf{x}^{(1)} + c_2\mathbf{x}^{(2)}
= c_1 e^{-t/2}\begin{pmatrix}\cos(t)\\ -\sin(t)\end{pmatrix}
+ c_2 e^{-t/2}\begin{pmatrix}\sin(t)\\ \cos(t)\end{pmatrix}$$

Here is the direction field associated with the system

$$\begin{pmatrix}x_1'\\ x_2'\end{pmatrix}
= \begin{pmatrix}-1/2 & 1\\ -1 & -1/2\end{pmatrix}
\begin{pmatrix}x_1\\ x_2\end{pmatrix}$$
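As a quick sanity check (not part of the original slides), the sketch below verifies numerically, using NumPy and a central-difference derivative, that this general solution satisfies $\mathbf{X}' = A\mathbf{X}$ for arbitrary constants $c_1$, $c_2$:

```python
import numpy as np

# coefficient matrix of the system on the slide
A = np.array([[-0.5,  1.0],
              [-1.0, -0.5]])

def solution(t, c1, c2):
    # general solution X = c1 x1 + c2 x2
    x1 = np.exp(-t / 2) * np.array([np.cos(t), -np.sin(t)])
    x2 = np.exp(-t / 2) * np.array([np.sin(t), np.cos(t)])
    return c1 * x1 + c2 * x2

# central-difference approximation of X'(t) at a sample point
t, h = 0.8, 1e-6
lhs = (solution(t + h, 2.0, -3.0) - solution(t - h, 2.0, -3.0)) / (2 * h)
rhs = A @ solution(t, 2.0, -3.0)
assert np.allclose(lhs, rhs, atol=1e-6)

# the eigenvalues of A are -1/2 ± i, matching the decaying spiral
assert np.allclose(sorted(np.linalg.eigvals(A)), [-0.5 - 1j, -0.5 + 1j])
```

The real part $-1/2$ of the eigenvalues produces the $e^{-t/2}$ decay; the imaginary part $\pm 1$ produces the rotation visible in the direction field.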
[Figure: direction field of the system, showing trajectories spiraling in toward the origin. Not reproduced in this extraction.]
Repeated Eigenvalues
We conclude our consideration of the linear homogeneous system with constant coefficients

$$\mathbf{x}' = A\mathbf{x}$$

with a discussion of the case in which the matrix A has a repeated eigenvalue. Suppose that λ is a repeated root of the characteristic equation

$$\left| A - \lambda I \right| = 0$$

Then λ is an eigenvalue of algebraic multiplicity 2 of the matrix A. In this event, there are two possibilities: the matrix A is non-defective, and there is still a fundamental set of solutions of the form $\{\mathbf{v}e^{\lambda t}, \mathbf{w}e^{\lambda t}\}$.
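The non-defective/defective distinction can be illustrated with two small example matrices (my own choices, not from the slides), both having λ = 2 with algebraic multiplicity 2. The geometric multiplicity, $n - \operatorname{rank}(A - \lambda I)$, tells the two cases apart:

```python
import numpy as np

def geometric_multiplicity(A, lam):
    # dimension of the eigenspace of lam = n - rank(A - lam I)
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n))

# non-defective: lam = 2 repeated, with TWO independent eigenvectors
A_nondefective = np.array([[2.0, 0.0],
                           [0.0, 2.0]])

# defective: lam = 2 repeated, but only ONE independent eigenvector
A_defective = np.array([[2.0, 1.0],
                        [0.0, 2.0]])

assert geometric_multiplicity(A_nondefective, 2.0) == 2
assert geometric_multiplicity(A_defective, 2.0) == 1
```

In the non-defective case the two independent eigenvectors v, w supply the fundamental set $\{\mathbf{v}e^{\lambda t}, \mathbf{w}e^{\lambda t}\}$ directly; the defective case is the one treated next.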
However, if the matrix A is defective, there is just one solution of the form v e^{λt} associated with this eigenvalue. Therefore, to construct the general solution, it is necessary to find another solution of a different form.

Recall that a similar situation occurred for the linear equation ay″ + by′ + cy = 0 when the characteristic equation had a double root r. In that case we found one exponential solution y₁(t) = e^{rt}, but a second independent solution had the form y₂(t) = t e^{rt}.

In this way, it is natural to attempt to find a second independent solution of the form

x = w t e^{λt}
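A short numerical check of what "defective" means here (the upper-triangular matrix below is an illustrative choice, not one from the lecture): the double eigenvalue admits only a one-dimensional eigenspace, so a single exponential solution.

```python
import numpy as np

# A defective matrix: the characteristic equation (1 - lam)^2 = 0
# has the double root lam = 1.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
lam = 1.0

# Geometric multiplicity = dim null(A - lam I) = n - rank(A - lam I).
B = A - lam * np.eye(2)
geo_mult = 2 - np.linalg.matrix_rank(B)
print(geo_mult)  # 1: only one independent eigenvector, v = (1, 0)
```

Here the algebraic multiplicity is 2 but the geometric multiplicity is 1, so v e^{λt} is the only solution of purely exponential form.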
But doing this and substituting x = w t e^{λt} into the system, matching the coefficient of e^{λt} forces w = 0. Thus, we propose instead

x = w t e^{λt} + u e^{λt}

and substituting this new x into the system, we obtain the pair of equations

(A − λI) w = 0
(A − λI) u = w

The first equation is already solved by w = v, and only the second one remains to be solved.
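The remaining step can be sketched numerically. Below, for the same illustrative defective matrix as above (an assumption, not a lecture example), we solve (A − λI)u = v for a generalized eigenvector u and verify that x(t) = v t e^{λt} + u e^{λt} really satisfies x′ = Ax:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
lam = 1.0
v = np.array([1.0, 0.0])  # eigenvector: (A - lam I) v = 0

# Solve (A - lam I) u = v. The matrix is singular, so use least squares;
# here the system is consistent and u = (0, 1) is a solution.
B = A - lam * np.eye(2)
u, *_ = np.linalg.lstsq(B, v, rcond=None)

# Check the second solution x(t) = v t e^{lam t} + u e^{lam t}:
# its derivative x'(t) must equal A x(t) for every t.
t = 0.7
x = (v * t + u) * np.exp(lam * t)
xp = (v + lam * (v * t + u)) * np.exp(lam * t)  # product rule on x(t)
print(np.allclose(xp, A @ x))  # True
```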
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Repeated Eigenvalues
but,
doing this and substituting x in the system we find thatw = 0. Thus, we propose
x = wteλt + ueλt
and substituting this new x in the system we find the system
(A− λI) w = 0
(A− λI) u = w
The first equation is already solved with w = v and only thesecond one is remaining to be solved.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Repeated Eigenvalues
Example 9.8

Find the solution of the system

x′ = Ax = [ 1  −1 ; 1  3 ] x

Solution

Let's find the eigenvalues of the matrix A:

|A − λI| = | 1−λ  −1 ; 1  3−λ | = 0

(λ − 1)(λ − 3) + 1 = 0 ⟹ (λ − 2)² = 0
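As a quick numerical cross-check (a sketch assuming NumPy is available), the repeated eigenvalue can be confirmed directly:

```python
import numpy as np

# Coefficient matrix of Example 9.8.
A = np.array([[1.0, -1.0],
              [1.0, 3.0]])

# det(A - lambda I) = (lambda - 2)^2, so both eigenvalues should be 2.
eigvals = np.sort(np.linalg.eigvals(A).real)
print(eigvals)  # both close to 2
```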
λ1 = 2, λ2 = 2

If λ1,2 = 2, then

(A − λ1,2 I) v = [ 1−λ  −1 ; 1  3−λ ] [ v1 ; v2 ] = [ −1  −1 ; 1  1 ] [ v1 ; v2 ] = [ 0 ; 0 ]

and a corresponding eigenvector is

v(1) = [ 1 ; −1 ]
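A one-line check (assuming NumPy) that v(1) really is an eigenvector for λ = 2:

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [1.0, 3.0]])
v = np.array([1.0, -1.0])   # candidate eigenvector v(1)

# A v should equal 2 v, confirming v is an eigenvector for lambda = 2.
print(A @ v)   # [ 2. -2.]
```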
and the solution is

x(1) = [ 1 ; −1 ] e^{2t}

Now, for the second solution we propose

x(2) = v t e^{2t} + u e^{2t}

where u satisfies

(A − λI) u = (A − 2I) u = v

[ −1  −1 ; 1  1 ] [ u1 ; u2 ] = [ v1 ; v2 ]
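The matrix A − 2I is singular (that is exactly why λ = 2 is repeated), but the system for u is still consistent, so a least-squares solve can pick out one particular u. A sketch assuming NumPy; `lstsq` happens to return the minimum-norm solution here:

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [1.0, 3.0]])
v = np.array([1.0, -1.0])
B = A - 2.0 * np.eye(2)          # A - 2I, a singular matrix

# lstsq handles the singular but consistent system (A - 2I) u = v
# and returns one particular solution u.
u, *_ = np.linalg.lstsq(B, v, rcond=None)
print(B @ u)   # recovers v
```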
we have

−u1 − u2 = 1

so if u1 = k, where k is arbitrary, then u2 = −k − 1. If we write

u = [ k ; −1 − k ] = [ 0 ; −1 ] + k [ 1 ; −1 ]

then, by substituting for w and u, we obtain

x(2) = [ 1 ; −1 ] t e^{2t} + [ 0 ; −1 ] e^{2t} + k e^{2t} [ 1 ; −1 ]
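To check that this x(2) (taking k = 0) really solves x′ = Ax, a finite-difference sketch (assuming NumPy) compares the derivative against A x(2) at a sample point:

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [1.0, 3.0]])
v = np.array([1.0, -1.0])
u = np.array([0.0, -1.0])        # the k = 0 choice of u

def x2(t):
    return v * t * np.exp(2.0 * t) + u * np.exp(2.0 * t)

# Central-difference approximation of x2'(t), compared with A x2(t).
t, h = 0.3, 1e-6
deriv = (x2(t + h) - x2(t - h)) / (2.0 * h)
print(np.max(np.abs(deriv - A @ x2(t))))   # near zero
```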
The last term above is merely a multiple of the first solution $x^{(1)}(t)$ and may be ignored, but the first two terms constitute a new solution:

$$x^{(2)} = \begin{pmatrix} 1 \\ -1 \end{pmatrix} t e^{2t} + \begin{pmatrix} 0 \\ -1 \end{pmatrix} e^{2t}$$

An elementary calculation shows that $W[x^{(1)}, x^{(2)}](t) = -e^{4t} \neq 0$, and therefore $\{x^{(1)}, x^{(2)}\}$ form a fundamental set of solutions of the system. The general solution is

$$x = c_1 x^{(1)} + c_2 x^{(2)} = c_1 \begin{pmatrix} 1 \\ -1 \end{pmatrix} e^{2t} + c_2 \left( \begin{pmatrix} 1 \\ -1 \end{pmatrix} t e^{2t} + \begin{pmatrix} 0 \\ -1 \end{pmatrix} e^{2t} \right)$$
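Both claims — that $x^{(2)}$ solves the system and that the Wronskian equals $-e^{4t}$ — can be checked numerically. A minimal sketch, again assuming the example matrix $A = \begin{pmatrix} 1 & -1 \\ 1 & 3 \end{pmatrix}$ from earlier in the session:

```python
import numpy as np

# Assumed example matrix (double eigenvalue 2); not shown in this excerpt.
A = np.array([[1.0, -1.0],
              [1.0,  3.0]])

def x1(t):
    # first solution: v e^{2t} with v = (1, -1)
    return np.array([1.0, -1.0]) * np.exp(2 * t)

def x2(t):
    # second solution: v t e^{2t} + (0, -1) e^{2t}
    return (np.array([1.0, -1.0]) * t + np.array([0.0, -1.0])) * np.exp(2 * t)

def dx2(t):
    # derivative of x2, computed by the product rule
    return (np.array([1.0, -1.0]) * (1 + 2 * t)
            + np.array([0.0, -2.0])) * np.exp(2 * t)

for t in [-1.0, 0.0, 0.3, 2.0]:
    # x2 really solves x' = A x
    assert np.allclose(dx2(t), A @ x2(t))
    # Wronskian W[x1, x2](t) = det([x1 | x2]) = -e^{4t}, never zero
    W = np.linalg.det(np.column_stack([x1(t), x2(t)]))
    assert np.isclose(W, -np.exp(4 * t))
```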
Consider again the system

$$x' = Ax$$

and suppose that $r = \lambda$ is a double eigenvalue of $A$ but there is only one corresponding eigenvector $v$. Then one solution is

$$x^{(1)}(t) = v e^{\lambda t}$$

where $v$ satisfies

$$(A - \lambda I)\, v = 0$$

and a second solution is given by

$$x^{(2)}(t) = v t e^{\lambda t} + u e^{\lambda t}$$

where $u$ satisfies

$$(A - \lambda I)\, u = v$$
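The two-step recipe — find $v$ from $(A - \lambda I)v = 0$, then $u$ from $(A - \lambda I)u = v$ — can be sketched in code. The helper name is hypothetical, and the test matrix is the assumed example from this session; `np.linalg.lstsq` is used only as a convenient way to pick one particular $u$ from the singular but consistent linear system:

```python
import numpy as np

def repeated_eigenvalue_solutions(A, lam, v):
    """For a double eigenvalue lam of A with a single eigenvector v, build
    x1(t) = v e^{lam t} and x2(t) = v t e^{lam t} + u e^{lam t},
    where u is any particular solution of (A - lam I) u = v."""
    M = A - lam * np.eye(A.shape[0])
    # (A - lam I) is singular, but the system is consistent; lstsq returns
    # one particular (minimum-norm) solution u.
    u, *_ = np.linalg.lstsq(M, v, rcond=None)
    x1 = lambda t: v * np.exp(lam * t)
    x2 = lambda t: (v * t + u) * np.exp(lam * t)
    return x1, x2, u

# Hypothetical example matrix consistent with the slides above.
A = np.array([[1.0, -1.0],
              [1.0,  3.0]])
x1, x2, u = repeated_eigenvalue_solutions(A, 2.0, np.array([1.0, -1.0]))
assert np.allclose((A - 2.0 * np.eye(2)) @ u, [1.0, -1.0])
```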
Even though $|A - \lambda I| = 0$, it can be shown that it is always possible to solve $(A - \lambda I)u = v$ for $u$ (actually, there are infinitely many solutions). Now, applying $(A - \lambda I)$ to both sides of this equation and using the equation for $v$, we get

$$(A - \lambda I)^2 u = (A - \lambda I)\, v$$

$$(A - \lambda I)^2 u = 0$$

The vector $u$ is known as a generalized eigenvector.
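The chain can be verified directly: $v$ is annihilated by $(A - \lambda I)$, while the generalized eigenvector $u$ is annihilated only by $(A - \lambda I)^2$. A small sketch, assuming the same example matrix as before (with $u = (0, -1)^T$, the $k = 0$ member of the solution family):

```python
import numpy as np

# Assumed example matrix (double eigenvalue 2, eigenvector v = (1, -1)).
A = np.array([[1.0, -1.0],
              [1.0,  3.0]])
M = A - 2.0 * np.eye(2)
v = np.array([1.0, -1.0])
u = np.array([0.0, -1.0])        # one particular solution of (A - 2I) u = v

assert np.allclose(M @ v, 0)     # v is an ordinary eigenvector
assert np.allclose(M @ u, v)     # u maps onto v under (A - 2I)
assert np.allclose(M @ M @ u, 0) # hence (A - 2I)^2 u = 0: u is generalized
assert not np.allclose(M @ u, 0) # but u is not itself an eigenvector
```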
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear (algebraic or differential) equations presents some difficulty is that the equations are usually coupled.

Hence, the equations in the system must be solved simultaneously. By contrast, if the system is uncoupled, then each equation can be solved independently of all the others.

Transforming the coupled system into an equivalent uncoupled system (in which each equation contains only one unknown variable) corresponds to transforming the coefficient matrix A into a diagonal matrix. Eigenvectors are useful in accomplishing such a transformation.
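A minimal sketch of this idea, assuming an illustrative symmetric matrix with distinct eigenvalues: the matrix P whose columns are eigenvectors of A satisfies P⁻¹AP = D with D diagonal, so in the new variables y = P⁻¹x the system x′ = Ax becomes y′ = Dy, and each equation y_k′ = λ_k y_k stands alone.

```python
import numpy as np

# Illustrative coupled system x' = A x (example matrix, distinct
# eigenvalues 4 and 2, hence diagonalizable).
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

eigvals, P = np.linalg.eig(A)     # columns of P are eigenvectors of A
D = np.diag(eigvals)

# P^{-1} A P = D: the transformed system y' = D y is uncoupled.
assert np.allclose(np.linalg.inv(P) @ A @ P, D)

# Each transformed equation y_k' = lam_k * y_k involves only y_k,
# so it can be solved on its own: y_k(t) = y_k(0) * e^(lam_k * t).
```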
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear (algebraic or differential)equations presents some difficulty is that the equations are usuallycoupled.
Hence, the equations in the system must be solved simultaneously.On the contrary, if the system is uncoupled, then each equationcan be solved independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason
why a system of linear (algebraic or differential)equations presents some difficulty is that the equations are usuallycoupled.
Hence, the equations in the system must be solved simultaneously.On the contrary, if the system is uncoupled, then each equationcan be solved independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system
of linear (algebraic or differential)equations presents some difficulty is that the equations are usuallycoupled.
Hence, the equations in the system must be solved simultaneously.On the contrary, if the system is uncoupled, then each equationcan be solved independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear
(algebraic or differential)equations presents some difficulty is that the equations are usuallycoupled.
Hence, the equations in the system must be solved simultaneously.On the contrary, if the system is uncoupled, then each equationcan be solved independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear (algebraic or differential)
equations presents some difficulty is that the equations are usuallycoupled.
Hence, the equations in the system must be solved simultaneously.On the contrary, if the system is uncoupled, then each equationcan be solved independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear (algebraic or differential)equations
presents some difficulty is that the equations are usuallycoupled.
Hence, the equations in the system must be solved simultaneously.On the contrary, if the system is uncoupled, then each equationcan be solved independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear (algebraic or differential)equations presents some difficulty
is that the equations are usuallycoupled.
Hence, the equations in the system must be solved simultaneously.On the contrary, if the system is uncoupled, then each equationcan be solved independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear (algebraic or differential)equations presents some difficulty is that
the equations are usuallycoupled.
Hence, the equations in the system must be solved simultaneously.On the contrary, if the system is uncoupled, then each equationcan be solved independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear (algebraic or differential)equations presents some difficulty is that the equations
are usuallycoupled.
Hence, the equations in the system must be solved simultaneously.On the contrary, if the system is uncoupled, then each equationcan be solved independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear (algebraic or differential)equations presents some difficulty is that the equations are usually
coupled.
Hence, the equations in the system must be solved simultaneously.On the contrary, if the system is uncoupled, then each equationcan be solved independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear (algebraic or differential)equations presents some difficulty is that the equations are usuallycoupled.
Hence, the equations in the system must be solved simultaneously.On the contrary, if the system is uncoupled, then each equationcan be solved independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear (algebraic or differential)equations presents some difficulty is that the equations are usuallycoupled.
Hence,
the equations in the system must be solved simultaneously.On the contrary, if the system is uncoupled, then each equationcan be solved independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear (algebraic or differential)equations presents some difficulty is that the equations are usuallycoupled.
Hence, the equations
in the system must be solved simultaneously.On the contrary, if the system is uncoupled, then each equationcan be solved independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear (algebraic or differential)equations presents some difficulty is that the equations are usuallycoupled.
Hence, the equations in the system
must be solved simultaneously.On the contrary, if the system is uncoupled, then each equationcan be solved independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear (algebraic or differential)equations presents some difficulty is that the equations are usuallycoupled.
Hence, the equations in the system must be solved
simultaneously.On the contrary, if the system is uncoupled, then each equationcan be solved independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear (algebraic or differential)equations presents some difficulty is that the equations are usuallycoupled.
Hence, the equations in the system must be solved simultaneously.
On the contrary, if the system is uncoupled, then each equationcan be solved independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear (algebraic or differential)equations presents some difficulty is that the equations are usuallycoupled.
Hence, the equations in the system must be solved simultaneously.On the contrary,
if the system is uncoupled, then each equationcan be solved independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear (algebraic or differential)equations presents some difficulty is that the equations are usuallycoupled.
Hence, the equations in the system must be solved simultaneously.On the contrary, if the system
is uncoupled, then each equationcan be solved independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear (algebraic or differential)equations presents some difficulty is that the equations are usuallycoupled.
Hence, the equations in the system must be solved simultaneously.On the contrary, if the system is uncoupled,
then each equationcan be solved independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear (algebraic or differential)equations presents some difficulty is that the equations are usuallycoupled.
Hence, the equations in the system must be solved simultaneously.On the contrary, if the system is uncoupled, then each equation
can be solved independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear (algebraic or differential)equations presents some difficulty is that the equations are usuallycoupled.
Hence, the equations in the system must be solved simultaneously.On the contrary, if the system is uncoupled, then each equationcan be solved
independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear (algebraic or differential)equations presents some difficulty is that the equations are usuallycoupled.
Hence, the equations in the system must be solved simultaneously.On the contrary, if the system is uncoupled, then each equationcan be solved independently
of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Diagonalizable Matrices.
The basic reason why a system of linear (algebraic or differential)equations presents some difficulty is that the equations are usuallycoupled.
Hence, the equations in the system must be solved simultaneously.On the contrary, if the system is uncoupled, then each equationcan be solved independently of all the others.
Transforming the coupled system into an equivalent uncoupledsystem ( in which each equation contains only one unknownvariable ) corresponds to transforming the coefficient matrix A intoa diagonal matrix. Eigenvectors are useful in accomplishing such atransformation.
Let us assume that the matrix $A$ has $n$ linearly independent eigenvectors $x^{(1)}, x^{(2)}, \ldots, x^{(n)}$. Then

$$Ax^{(1)} = \lambda_1 x^{(1)}; \quad Ax^{(2)} = \lambda_2 x^{(2)}; \quad \ldots; \quad Ax^{(n)} = \lambda_n x^{(n)}$$

and consider the matrix whose columns are these eigenvectors,

$$U = \begin{pmatrix} x_1^{(1)} & \cdots & x_1^{(n)} \\ \vdots & & \vdots \\ x_n^{(1)} & \cdots & x_n^{(n)} \end{pmatrix}$$
We have

$$AU = \begin{pmatrix} Ax^{(1)} & \cdots & Ax^{(n)} \end{pmatrix} = \begin{pmatrix} \lambda_1 x_1^{(1)} & \cdots & \lambda_n x_1^{(n)} \\ \vdots & & \vdots \\ \lambda_1 x_n^{(1)} & \cdots & \lambda_n x_n^{(n)} \end{pmatrix} = UD$$
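The relation $AU = UD$ can be checked numerically. A minimal sketch, assuming NumPy is available; the 2×2 matrix is an illustrative choice, not one from the lecture:

```python
import numpy as np

# Minimal sketch (illustrative 2x2 matrix): the columns of U are the
# eigenvectors of A, so multiplying A by U scales each column by its
# eigenvalue -- exactly the relation A U = U D.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])        # symmetric, eigenvalues 3 and 1
lam, U = np.linalg.eig(A)         # eigenvalues and eigenvector columns
D = np.diag(lam)                  # diagonal matrix of eigenvalues

assert np.allclose(A @ U, U @ D)  # each column satisfies A x = lambda x
```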
where $D$ is the diagonal matrix

$$D = \begin{pmatrix} \lambda_1 & & & \\ & \lambda_2 & & \\ & & \ddots & \\ & & & \lambda_n \end{pmatrix}$$

whose diagonal elements are the eigenvalues of $A$. From the last equations it follows that

$$U^{-1}AU = D \iff A = UDU^{-1}$$

Thus, if the eigenvalues and eigenvectors of A are known, A can be transformed into a diagonal matrix by the process shown in the above equation.
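As an illustration of the decoupling described at the start of this section, the factorization A = UDU⁻¹ lets a coupled system x′ = Ax be solved one scalar equation at a time. A minimal NumPy sketch; the matrix, initial condition, and time are illustrative choices:

```python
import numpy as np

# Sketch: with A = U D U^{-1}, the substitution x = U y turns the coupled
# system x' = A x into y' = D y, i.e. n independent scalar equations
# y_k' = lambda_k y_k with solutions y_k(t) = y_k(0) exp(lambda_k t).
A = np.array([[1.0, 1.0],
              [4.0, 1.0]])            # coupled coefficient matrix
lam, U = np.linalg.eig(A)             # eigenvalues (3 and -1) and eigenvectors

x0 = np.array([2.0, 0.0])             # initial condition x(0)
y0 = np.linalg.solve(U, x0)           # y(0) = U^{-1} x(0)

t = 0.5
y_t = y0 * np.exp(lam * t)            # each component solved independently
x_t = U @ y_t                         # back to the original variables

# Consistency check: x(t) still satisfies x' = A x, since
# x'(t) = U (lam * y_t) while A x_t = U D y_t, and these agree.
assert np.allclose(U @ (lam * y_t), A @ x_t)
```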
This process is known as a similarity transformation. Alternatively, we may say that $A$ is diagonalizable.

If $A$ is Hermitian ($A = (A^*)^T$), then the determination of $U^{-1}$ is very simple. The eigenvectors $v^{(1)}, \ldots, v^{(n)}$ of $A$ are known to be mutually orthogonal, so let us choose them so that they are also normalized by $\langle v^{(i)}, v^{(i)} \rangle = 1$ for each $i$. It is easy to verify that $U^{-1} = U^*$; in other words, the inverse of $U$ is the same as its adjoint (the transpose of its complex conjugate).

Finally, we note that if $A$ has fewer than $n$ linearly independent eigenvectors, then there is no matrix $U$ such that $U^{-1}AU = D$. In this case, $A$ is not diagonalizable.
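Both remarks can be illustrated numerically. A minimal NumPy sketch; the Hermitian matrix H and the defective matrix J below are illustrative choices:

```python
import numpy as np

# Hermitian case: the normalized eigenvector matrix is unitary,
# so U^{-1} = U* (the conjugate transpose).
H = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])
assert np.allclose(H, H.conj().T)                      # H is Hermitian

lam, U = np.linalg.eigh(H)             # eigh returns orthonormal eigenvectors
assert np.allclose(U.conj().T @ U, np.eye(2))          # U^{-1} = U*
assert np.allclose(U.conj().T @ H @ U, np.diag(lam))   # diagonalized

# Defective case: the repeated eigenvalue 1 admits only one independent
# eigenvector, so the eigenvector matrix is singular and no U with
# U^{-1} J U = D exists -- J is not diagonalizable.
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])
w, V = np.linalg.eig(J)
assert np.linalg.matrix_rank(V) < 2    # fewer than n independent eigenvectors
```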
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively,
we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say
that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A
is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If
A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A
is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian
( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ),
then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination
of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1
isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple.
The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n)
of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A
are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known
to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to be
mutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal,
so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us
choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them
so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that
they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are
alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized
by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1
for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i .
It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy
verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify that
U−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗.
In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words,
the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse
of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U
is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same
as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint
(the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose
of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its
complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
This process is known as a similarity transformation.Alternatively, we may say that A is diagonalizable.
If A is Hermitian ( A = (A∗)T ), then the determination of U−1 isvery simple. The eigenvectors v(1), ..., v(n) of A are known to bemutually orthogonal, so let us choose them so that they are alsonormalized by < v(i), v(i) >= 1 for each i . It is easy verify thatU−1 = U∗. In other words, the inverse of U is the same as itsadjoint (the transpose of its complex conjugate).
Finally, we note that if A has fewer than n linearly independenteigenvectors, then there is no matrix U such that U−1AU = D. Inthis case, A is not diagonalizable.
Fundamental Matrices
Let’s start with the system
x′ = P(t)x
Suppose that x(1)(t), ..., x(n)(t) form a fundamental set of solutions on some interval α < t < β. Then the matrix

Ψ(t) = \begin{pmatrix} x_1^{(1)}(t) & \cdots & x_1^{(n)}(t) \\ \vdots & & \vdots \\ x_n^{(1)}(t) & \cdots & x_n^{(n)}(t) \end{pmatrix}

whose columns are the vectors x(1)(t), ..., x(n)(t), is said to be a fundamental matrix for the linear system. Since the set of solutions is linearly independent, the matrix is nonsingular.
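To make the definition concrete, here is a minimal sketch (an illustrative system of my choosing, not one from the slides): for the constant-coefficient system x′ = Px with P = diag(1, 2), the solutions eᵗ(1, 0)ᵀ and e²ᵗ(0, 1)ᵀ form a fundamental set, so Ψ(t) = diag(eᵗ, e²ᵗ) is a fundamental matrix, and it is nonsingular for every t.

```python
import numpy as np

# Hypothetical example system: x' = P x with P = diag(1, 2)
P = np.diag([1.0, 2.0])

def Psi(t):
    # fundamental matrix whose columns are the solutions e^t (1,0)^T, e^{2t} (0,1)^T
    return np.diag([np.exp(t), np.exp(2 * t)])

t, h = 0.3, 1e-6
# central-difference derivative of Psi, compared against P @ Psi(t)
dPsi = (Psi(t + h) - Psi(t - h)) / (2 * h)
assert np.allclose(dPsi, P @ Psi(t), atol=1e-4)   # each column solves the system
assert abs(np.linalg.det(Psi(t))) > 0             # Psi(t) is nonsingular
```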
Thus, for example, a fundamental matrix for the system
x′ = Ax = \begin{pmatrix} 1 & -1 \\ 1 & 3 \end{pmatrix} x

can be formed from the solutions x(1)(t) and x(2)(t):

x(1) = \begin{pmatrix} 1 \\ -1 \end{pmatrix} e^{2t}; \qquad x(2) = \begin{pmatrix} 1 \\ -1 \end{pmatrix} t e^{2t} + \begin{pmatrix} 0 \\ -1 \end{pmatrix} e^{2t}
then
Ψ(t) = \begin{pmatrix} e^{2t} & t e^{2t} \\ -e^{2t} & -t e^{2t} - e^{2t} \end{pmatrix} = e^{2t} \begin{pmatrix} 1 & t \\ -1 & -1 - t \end{pmatrix}

Recall that each column of the fundamental matrix Ψ(t) is a solution of the ODE. It follows that Ψ(t) satisfies the matrix differential equation
Ψ′ = P(t)Ψ
In particular, the fundamental matrix Φ(t) that satisfies Φ(0) = I is called the special fundamental matrix and can also be found from the relation Φ(t) = Ψ(t)Ψ⁻¹(0).
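The worked example can be verified numerically. The sketch below (a check I am adding, using the slides' matrix A and fundamental matrix Ψ) confirms that Ψ′ = AΨ, that Φ(t) = Ψ(t)Ψ⁻¹(0) satisfies Φ(0) = I, and that for this constant-coefficient system Φ(t) coincides with the matrix exponential e^{At}.

```python
import numpy as np
from scipy.linalg import expm

# matrix A from the example above
A = np.array([[1.0, -1.0],
              [1.0, 3.0]])

def Psi(t):
    # fundamental matrix Psi(t) = e^{2t} [[1, t], [-1, -1 - t]] from the example
    return np.exp(2 * t) * np.array([[1.0, t],
                                     [-1.0, -1.0 - t]])

t, h = 0.7, 1e-6
# central-difference check of the matrix ODE Psi' = A Psi
dPsi = (Psi(t + h) - Psi(t - h)) / (2 * h)
assert np.allclose(dPsi, A @ Psi(t), atol=1e-4)

# special fundamental matrix Phi(t) = Psi(t) Psi^{-1}(0), with Phi(0) = I
Phi = Psi(t) @ np.linalg.inv(Psi(0.0))
assert np.allclose(Psi(0.0) @ np.linalg.inv(Psi(0.0)), np.eye(2))

# for constant A, Phi(t) equals the matrix exponential e^{At}
assert np.allclose(Phi, expm(A * t))
```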
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
then
Ψ(t) =
(e2t te2t
−e2t −te2t − e2t
)= e2t
(1 t−1 −1− t
)Recall that each column of the fundamental matrix Ψ(t) is asolution of the ODE. It follows that Ψ(t) satisfies the matrixdifferential equation
Ψ′ = P(t)Ψ
In particular, the fundamental matrix Φ(t) that satisfies Φ(0) = Iis called the special fundamental matrix and can also be foundfrom the relation Φ(t) = Ψ(t)Ψ−1(0).
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
then
Ψ(t) =
(e2t te2t
−e2t −te2t − e2t
)= e2t
(1 t−1 −1− t
)Recall that each column of the fundamental matrix Ψ(t) is asolution of the ODE. It follows that Ψ(t) satisfies the matrixdifferential equation
Ψ′ = P(t)Ψ
In particular, the fundamental matrix Φ(t) that satisfies Φ(0) = Iis called the special fundamental matrix and can also be foundfrom the relation Φ(t) = Ψ(t)Ψ−1(0).
Thus, in this case,

$$\Psi(0) = \begin{pmatrix} 1 & 0 \\ -1 & -1 \end{pmatrix} \implies \Psi^{-1}(0) = \begin{pmatrix} 1 & 0 \\ -1 & -1 \end{pmatrix}$$

$$\Phi(t) = \Psi(t)\Psi^{-1}(0) = e^{2t}\begin{pmatrix} 1 & t \\ -1 & -1-t \end{pmatrix}\begin{pmatrix} 1 & 0 \\ -1 & -1 \end{pmatrix}$$

$$\Phi(t) = e^{2t}\begin{pmatrix} 1-t & -t \\ t & 1+t \end{pmatrix}$$

The latter matrix is also known as the exponential matrix e^{At}.
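Since Φ(t) here equals e^{At}, the closed form can be checked against a library matrix exponential. The coefficient matrix is not restated on this slide; A = [[1, −1], [1, 3]] is recovered here as Φ′(0) (it has the repeated eigenvalue λ = 2), so treat it as an assumption of this sketch (SciPy assumed available):

```python
import numpy as np
from scipy.linalg import expm

# Assumed coefficient matrix of the example, recovered as Phi'(0);
# it has the repeated eigenvalue lambda = 2.
A = np.array([[1.0, -1.0],
              [1.0,  3.0]])

t = 0.7
Phi_closed = np.exp(2 * t) * np.array([[1 - t, -t],
                                       [t, 1 + t]])   # slide formula
Phi_expm = expm(A * t)                                # library e^{At}

assert np.allclose(Phi_closed, Phi_expm)
```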
The Matrix e^{At}

Recall that the solution of the scalar initial value problem

$$x' = ax, \qquad x(0) = x_0, \qquad a = \text{constant}$$

is given by

$$x(t) = x_0 e^{at}$$

Now, consider the corresponding initial value problem for an n × n system

$$\mathbf{x}' = A\mathbf{x}, \qquad \mathbf{x}(0) = \mathbf{x}_0$$

where A is a constant matrix.
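A minimal numerical sketch of the matrix analogue, with example values assumed for A and x₀: the candidate solution x(t) = e^{At}x₀ satisfies the initial condition, and a centered finite difference confirms x′ ≈ Ax:

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[1.0, -1.0],
              [1.0,  3.0]])          # example constant matrix
x0 = np.array([2.0, -1.0])           # example initial condition

def x(t):
    return expm(A * t) @ x0          # candidate solution x(t) = e^{At} x0

# the initial condition holds ...
assert np.allclose(x(0.0), x0)

# ... and x'(t) = A x(t), checked with a centered difference
t, h = 0.5, 1e-6
dx = (x(t + h) - x(t - h)) / (2 * h)
assert np.allclose(dx, A @ x(t), atol=1e-4)
```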
Applying the results already obtained, we can write its solution as

$$\mathbf{x} = \Phi(t)\mathbf{x}_0$$

where Φ(0) = I. Thus Φ(t) plays the role of e^{at}. Let us look at this in more detail.

The scalar exponential function e^{at} can be represented by the power series

$$e^{at} = 1 + \sum_{n=1}^{\infty} \frac{a^n t^n}{n!}$$

which converges for all t. Let us now replace the scalar a by the n × n constant matrix A and consider the corresponding series.
$$I + \sum_{n=1}^{\infty} \frac{A^n t^n}{n!} = I + At + \frac{A^2 t^2}{2!} + \cdots + \frac{A^n t^n}{n!} + \cdots$$

Each term in the series is an n × n matrix. It is possible to show that each element of this matrix sum converges for all t as n → ∞. Thus we have a well-defined n × n matrix, which will be denoted by e^{At}:

$$e^{At} = I + \sum_{n=1}^{\infty} \frac{A^n t^n}{n!}$$

By differentiating the series term by term, we obtain

$$\frac{d}{dt}\left[e^{At}\right] = \sum_{n=1}^{\infty} \frac{A^n t^{n-1}}{(n-1)!} = A\left[I + \sum_{n=1}^{\infty} \frac{A^n t^n}{n!}\right] = Ae^{At}$$
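Both facts above — the series defines e^{At}, and term-by-term differentiation gives (e^{At})′ = Ae^{At} — can be checked numerically. The sketch below builds a partial sum of the series term by term and compares it with SciPy's expm, using an assumed example 2 × 2 matrix:

```python
import numpy as np
from scipy.linalg import expm

def expm_series(A, t, N=40):
    """Partial sum I + sum_{n=1}^{N} A^n t^n / n!, each term built
    from the previous one: term_n = term_{n-1} * A * t / n."""
    S = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for n in range(1, N + 1):
        term = term @ A * (t / n)
        S = S + term
    return S

A = np.array([[1.0, -1.0],
              [1.0,  3.0]])          # example constant matrix
t = 0.9

# the truncated series already matches the library matrix exponential
assert np.allclose(expm_series(A, t), expm(A * t))

# term-by-term differentiation: d/dt e^{At} = A e^{At}
h = 1e-6
deriv = (expm(A * (t + h)) - expm(A * (t - h))) / (2 * h)
assert np.allclose(deriv, A @ expm(A * t), atol=1e-4)
```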
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
I +∞∑n=1
Antn
n!=
I + At +A2t2
2!+ ...+
Ant2
n!+ ...
Each term in the series is an n × n matrix. It is possible to showthat each element of this matrix sum converges for all t asn→∞. Thus, we have a well defined n × n matrix, which will bedenote by eAt
eAt = I +∞∑n=1
Antn
n!
By differentiating the above series term by term, we obtain
d
dt
[eAt]
=∞∑n=1
Antn−1
(n − 1)!= A
[I +
∞∑n=1
Antn
n!
]= AeAt
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
I +∞∑n=1
Antn
n!= I +
At +A2t2
2!+ ...+
Ant2
n!+ ...
Each term in the series is an n × n matrix. It is possible to showthat each element of this matrix sum converges for all t asn→∞. Thus, we have a well defined n × n matrix, which will bedenote by eAt
eAt = I +∞∑n=1
Antn
n!
By differentiating the above series term by term, we obtain
d
dt
[eAt]
=∞∑n=1
Antn−1
(n − 1)!= A
[I +
∞∑n=1
Antn
n!
]= AeAt
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
I +∞∑n=1
Antn
n!= I + At +
A2t2
2!+ ...+
Ant2
n!+ ...
Each term in the series is an n × n matrix. It is possible to showthat each element of this matrix sum converges for all t asn→∞. Thus, we have a well defined n × n matrix, which will bedenote by eAt
eAt = I +∞∑n=1
Antn
n!
By differentiating the above series term by term, we obtain
d
dt
[eAt]
=∞∑n=1
Antn−1
(n − 1)!= A
[I +
∞∑n=1
Antn
n!
]= AeAt
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
I +∞∑n=1
Antn
n!= I + At +
A2t2
2!+
...+Ant2
n!+ ...
Each term in the series is an n × n matrix. It is possible to showthat each element of this matrix sum converges for all t asn→∞. Thus, we have a well defined n × n matrix, which will bedenote by eAt
eAt = I +∞∑n=1
Antn
n!
By differentiating the above series term by term, we obtain
d
dt
[eAt]
=∞∑n=1
Antn−1
(n − 1)!= A
[I +
∞∑n=1
Antn
n!
]= AeAt
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
I +∞∑n=1
Antn
n!= I + At +
A2t2
2!+ ...+
Ant2
n!+ ...
Each term in the series is an n × n matrix. It is possible to showthat each element of this matrix sum converges for all t asn→∞. Thus, we have a well defined n × n matrix, which will bedenote by eAt
eAt = I +∞∑n=1
Antn
n!
By differentiating the above series term by term, we obtain
d
dt
[eAt]
=∞∑n=1
Antn−1
(n − 1)!= A
[I +
∞∑n=1
Antn
n!
]= AeAt
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
I +∞∑n=1
Antn
n!= I + At +
A2t2
2!+ ...+
Ant2
n!+
...
Each term in the series is an n × n matrix. It is possible to showthat each element of this matrix sum converges for all t asn→∞. Thus, we have a well defined n × n matrix, which will bedenote by eAt
eAt = I +∞∑n=1
Antn
n!
By differentiating the above series term by term, we obtain
d
dt
[eAt]
=∞∑n=1
Antn−1
(n − 1)!= A
[I +
∞∑n=1
Antn
n!
]= AeAt
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
I +∞∑n=1
Antn
n!= I + At +
A2t2
2!+ ...+
Ant2
n!+ ...
Each term in the series is an n × n matrix. It is possible to showthat each element of this matrix sum converges for all t asn→∞. Thus, we have a well defined n × n matrix, which will bedenote by eAt
eAt = I +∞∑n=1
Antn
n!
By differentiating the above series term by term, we obtain
d
dt
[eAt]
=∞∑n=1
Antn−1
(n − 1)!= A
[I +
∞∑n=1
Antn
n!
]= AeAt
Dr. Marco A Roque Sol Linear Algebra. Session 9
Therefore, e^{At} satisfies the differential equation

\[
\frac{d}{dt}\left[ e^{At} \right] = A e^{At}
\]

Further, by setting t = 0 in the definition of e^{At}, we find that e^{At} satisfies the initial condition

\[
\left. e^{At} \right|_{t=0} = I
\]

In this way, the special fundamental matrix Φ satisfies the same initial value problem as e^{At}, namely,

\[
\Phi' = A\Phi, \qquad \Phi(0) = I
\]
Then, by uniqueness of solutions of an initial value problem (extended to matrix differential equations), we conclude that e^{At} and the special fundamental matrix Φ(t) are the same. Thus, we can write the solution of the initial value problem

\[
x' = Ax, \qquad x(0) = x_0
\]

in the form

\[
x = e^{At} x_0
\]
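A small sketch of this solution formula, with an illustrative diagonalizable matrix A and initial vector x0 chosen for the example: e^{At} is formed from the eigendecomposition A = V Λ V⁻¹, and the result is cross-checked against a direct Runge–Kutta integration of x' = Ax.

```python
import numpy as np

# Illustrative system: A has eigenvalues -1 and -2, so it is diagonalizable.
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
x0 = np.array([1.0, 0.0])
t = 1.0

# e^{At} = V e^{Lambda t} V^{-1} for diagonalizable A.
lam, V = np.linalg.eig(A)
expAt = (V * np.exp(lam * t)) @ np.linalg.inv(V)   # columns of V scaled by e^{lam_i t}
x_exact = (expAt @ x0).real

# Cross-check with a classical RK4 integration of x' = Ax.
steps = 1000
h = t / steps
x = x0.copy()
for _ in range(steps):
    k1 = A @ x
    k2 = A @ (x + 0.5 * h * k1)
    k3 = A @ (x + 0.5 * h * k2)
    k4 = A @ (x + h * k3)
    x = x + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)

assert np.allclose(x, x_exact, atol=1e-6)
```

The eigendecomposition route assumes A is diagonalizable; the Jordan-form discussion that follows covers the case where it is not.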
Jordan Canonical Forms

An n × n matrix A can be diagonalized only if it has a full complement of n linearly independent eigenvectors. If there is a shortage of eigenvectors (because of repeated eigenvalues), then A can always be transformed into a nearly diagonal matrix called its Jordan form.
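A quick numerical illustration of the "full complement of eigenvectors" condition (the matrix below is an illustrative choice, not from the lecture): when A has n independent eigenvectors, the eigenvector matrix V is invertible and A = V D V⁻¹.

```python
import numpy as np

# Illustrative matrix with distinct eigenvalues 5 and 2, hence diagonalizable.
A = np.array([[4.0, 1.0], [2.0, 3.0]])

lam, V = np.linalg.eig(A)

# The eigenvectors (columns of V) are linearly independent...
assert np.linalg.matrix_rank(V) == 2

# ...so A factors as V D V^{-1} with D = diag(lambda_1, lambda_2).
D = np.diag(lam)
assert np.allclose(V @ D @ np.linalg.inv(V), A)
```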
A Jordan form, J, has the eigenvalues of A on the main diagonal, ones in certain positions on the diagonal just above the main diagonal, and zeros elsewhere:

\[
J = \begin{pmatrix}
\lambda_1 & 1 & 0 & & & & & & \\
0 & \lambda_1 & 1 & & & & & & \\
0 & 0 & \lambda_1 & & & & & & \\
& & & \lambda_2 & 1 & & & & \\
& & & 0 & \lambda_2 & & & & \\
& & & & & \lambda_3 & & & \\
& & & & & & \ddots & & \\
& & & & & & & \lambda_n & 1 \\
& & & & & & & 0 & \lambda_n
\end{pmatrix}
\]
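The smallest defective case makes the shortage of eigenvectors concrete. The sketch below (an illustrative example, not from the lecture) uses a single 2 × 2 Jordan block J = λI + N with N nilpotent; since N² = 0, the exponential has the well-known closed form e^{Jt} = e^{λt}(I + Nt), which the truncated series reproduces.

```python
import numpy as np

# A single 2x2 Jordan block: repeated eigenvalue, only one independent eigenvector.
lam_val, t = 2.0, 0.3
J = np.array([[lam_val, 1.0], [0.0, lam_val]])

# rank(J - lambda I) = 1, so the eigenspace is 1-dimensional: J is not diagonalizable.
assert np.linalg.matrix_rank(J - lam_val * np.eye(2)) == 1

# Truncated power series for e^{Jt}...
E = np.eye(2)
term = np.eye(2)
for k in range(1, 30):
    term = term @ J * (t / k)
    E = E + term

# ...agrees with the closed form e^{lambda t} [[1, t], [0, 1]].
closed = np.exp(lam_val * t) * np.array([[1.0, t], [0.0, 1.0]])
assert np.allclose(E, closed)
```

The polynomial factor t in e^{Jt} is exactly the source of the t e^{λt} terms that appear in solutions of x' = Ax when eigenvalues are repeated.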
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
A Jordan form,
J, has the eigenvalues of A on the main diagonal,ones in certain positions on the diagonal above the main diagonal,and zeros elsewhere.
J(t) =
λ1 10 λ1 10 0 λ1
λ2 10 λ2
λ3. . .
λn 10 λn
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
A Jordan form, J,
has the eigenvalues of A on the main diagonal,ones in certain positions on the diagonal above the main diagonal,and zeros elsewhere.
J(t) =
λ1 10 λ1 10 0 λ1
λ2 10 λ2
λ3. . .
λn 10 λn
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
A Jordan form, J, has the eigenvalues
of A on the main diagonal,ones in certain positions on the diagonal above the main diagonal,and zeros elsewhere.
J(t) =
λ1 10 λ1 10 0 λ1
λ2 10 λ2
λ3. . .
λn 10 λn
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
A Jordan form, J, has the eigenvalues of A
on the main diagonal,ones in certain positions on the diagonal above the main diagonal,and zeros elsewhere.
J(t) =
λ1 10 λ1 10 0 λ1
λ2 10 λ2
λ3. . .
λn 10 λn
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
A Jordan form, J, has the eigenvalues of A on the
main diagonal,ones in certain positions on the diagonal above the main diagonal,and zeros elsewhere.
J(t) =
λ1 10 λ1 10 0 λ1
λ2 10 λ2
λ3. . .
λn 10 λn
Dr. Marco A Roque Sol Linear Algebra. Session 9
Consider again the matrix A given by the equation

x' = Ax =
\begin{pmatrix} 1 & -1 \\ 1 & 3 \end{pmatrix} x

To transform A into its Jordan form, we construct the transformation matrix U with the single eigenvector v in its first column and the generalized eigenvector u (taking k = 0) in the second column. Then U and its inverse are given by

U =
\begin{pmatrix} 1 & 0 \\ -1 & -1 \end{pmatrix};
U^{-1} =
\begin{pmatrix} 1 & 0 \\ -1 & -1 \end{pmatrix}

It follows that

J = U^{-1}AU =
\begin{pmatrix} 1 & 0 \\ -1 & -1 \end{pmatrix}
\begin{pmatrix} 1 & -1 \\ 1 & 3 \end{pmatrix}
\begin{pmatrix} 1 & 0 \\ -1 & -1 \end{pmatrix}
=
\begin{pmatrix} 2 & 1 \\ 0 & 2 \end{pmatrix}
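As a quick sanity check (not part of the original slides), the similarity computation J = U⁻¹AU can be verified numerically with NumPy:

```python
# Numerical check of the Jordan reduction J = U^{-1} A U for the
# 2x2 example above; this sketch is an addition, not from the slides.
import numpy as np

A = np.array([[1.0, -1.0],
              [1.0,  3.0]])
U = np.array([[ 1.0,  0.0],
              [-1.0, -1.0]])

# This particular U is its own inverse: U @ U = I.
J = np.linalg.inv(U) @ A @ U
print(J)  # expected: [[2. 1.] [0. 2.]]
```

Note that the result has the repeated eigenvalue 2 on the diagonal with a single 1 on the superdiagonal, i.e. one 2x2 Jordan block.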
Dr. Marco A Roque Sol Linear Algebra. Session 9
Finally, if we start again from

x' = Ax =
\begin{pmatrix} 1 & -1 \\ 1 & 3 \end{pmatrix} x

the transformation x = Uy, where U is given above, produces the system

y' = Jy

that is,

y_1' = 2y_1 + y_2, \qquad y_2' = 2y_2

Solving the second equation and substituting into the first, we obtain

y_2 = c_1 e^{2t}, \qquad y_1 = c_1 t e^{2t} + c_2 e^{2t}
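The claimed solution of the y-system can be checked directly (a small added sketch, not from the slides), by comparing a central-difference derivative of y₁ and y₂ against the right-hand sides 2y₁ + y₂ and 2y₂:

```python
# Check that y1 = c1*t*e^{2t} + c2*e^{2t}, y2 = c1*e^{2t}
# satisfy y1' = 2*y1 + y2 and y2' = 2*y2 at sample points.
# This is an added verification sketch, not part of the lecture.
import math

def check(t, c1, c2, h=1e-6, tol=1e-4):
    y1 = lambda s: c1 * s * math.exp(2 * s) + c2 * math.exp(2 * s)
    y2 = lambda s: c1 * math.exp(2 * s)
    # central-difference approximations to the derivatives
    dy1 = (y1(t + h) - y1(t - h)) / (2 * h)
    dy2 = (y2(t + h) - y2(t - h)) / (2 * h)
    assert abs(dy1 - (2 * y1(t) + y2(t))) < tol
    assert abs(dy2 - 2 * y2(t)) < tol

for tt in (0.0, 0.5, 1.0):
    check(tt, c1=1.3, c2=-0.7)
print("ODE check passed")
```

The triangular structure of J is what makes this system solvable one equation at a time: y₂ decouples, and then feeds into the equation for y₁.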
Dr. Marco A Roque Sol Linear Algebra. Session 9
Thus, two independent solutions of the y-system are

y^{(1)}(t) =
\begin{pmatrix} 1 \\ 0 \end{pmatrix} e^{2t};
\qquad
y^{(2)}(t) =
\begin{pmatrix} t \\ 1 \end{pmatrix} e^{2t}

and the corresponding fundamental matrix is

\hat{\Psi}(t) =
\begin{pmatrix} e^{2t} & t e^{2t} \\ 0 & e^{2t} \end{pmatrix}

Since \hat{\Psi}(0) = I, we can also identify this matrix as e^{Jt}. To obtain a fundamental matrix for the original system, we now form the product

\Psi(t) = U e^{Jt} =
\begin{pmatrix} e^{2t} & t e^{2t} \\ -e^{2t} & -e^{2t} - t e^{2t} \end{pmatrix}

which is the same as the fundamental matrix that we obtained before.
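Since A = UJU⁻¹, every column of Ψ(t) = U e^{Jt} satisfies x' = Ax; this can be confirmed numerically (an added sketch, with e^{Jt} written out by hand rather than computed by a matrix-exponential routine):

```python
# Check numerically that Psi(t) = U e^{Jt} solves Psi' = A Psi for the
# example above. Added verification sketch, not part of the slides.
import numpy as np

A = np.array([[1.0, -1.0], [1.0, 3.0]])
U = np.array([[1.0, 0.0], [-1.0, -1.0]])

def expJt(t):
    # Closed form of e^{Jt} for the 2x2 Jordan block with eigenvalue 2.
    e = np.exp(2 * t)
    return np.array([[e, t * e], [0.0, e]])

def Psi(t):
    return U @ expJt(t)

t, h = 0.7, 1e-6
dPsi = (Psi(t + h) - Psi(t - h)) / (2 * h)  # central difference
assert np.allclose(dPsi, A @ Psi(t), atol=1e-4)
print("fundamental matrix check passed")
```

Note that Ψ(0) = U rather than I, which is why Ψ̂ = e^{Jt}, not Ψ, is the fundamental matrix that equals the identity at t = 0.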
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Thus,
two independent solutions of the y−system are
y(1)(t) =
(10
)e2t ; y(2)(t) =
(t1
)e2t
and the corresponding fundamental matrix is
Ψ̂(t) =
(e2t te2t
0 e2t
)Since Ψ̂(0) = I, we can also identify this matrix as eJt . To obtaina fundamental matrix for the original system, we now form theproduct
Ψ(t) = UeJt =
(e2t te2t
−e2t −e2t − te2t
)which is the same as the fundamental matrix that we obtainedbefore.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Thus, two independent solutions
of the y−system are
y(1)(t) =
(10
)e2t ; y(2)(t) =
(t1
)e2t
and the corresponding fundamental matrix is
Ψ̂(t) =
(e2t te2t
0 e2t
)Since Ψ̂(0) = I, we can also identify this matrix as eJt . To obtaina fundamental matrix for the original system, we now form theproduct
Ψ(t) = UeJt =
(e2t te2t
−e2t −e2t − te2t
)which is the same as the fundamental matrix that we obtainedbefore.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Thus, two independent solutions of the y−system are
y(1)(t) =
(10
)e2t ; y(2)(t) =
(t1
)e2t
and the corresponding fundamental matrix is
Ψ̂(t) =
(e2t te2t
0 e2t
)Since Ψ̂(0) = I, we can also identify this matrix as eJt . To obtaina fundamental matrix for the original system, we now form theproduct
Ψ(t) = UeJt =
(e2t te2t
−e2t −e2t − te2t
)which is the same as the fundamental matrix that we obtainedbefore.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
Complex EigenvaluesRepeated EigenvaluesDiagonalization
Diagonalization.
Thus, two independent solutions of the y−system are
y(1)(t) =
(10
)e2t ; y(2)(t) =
(t1
)e2t
and the corresponding fundamental matrix is
Ψ̂(t) =
(e2t te2t
0 e2t
)Since Ψ̂(0) = I, we can also identify this matrix as eJt . To obtaina fundamental matrix for the original system, we now form theproduct
Ψ(t) = UeJt =
(e2t te2t
−e2t −e2t − te2t
)which is the same as the fundamental matrix that we obtainedbefore.
Dr. Marco A Roque Sol Linear Algebra. Session 9
SVD. Introduction
Singular Value Decomposition

In this section, we assume throughout that A is an m × n matrix with m ≥ n. (This assumption is made for convenience only; all the results also hold if m < n.)

We will present a method for determining how close A is to a matrix of smaller rank. The method involves factoring A into a product UΣV^T, where U is an m × m orthogonal matrix, V is an n × n orthogonal matrix, and Σ is an m × n matrix whose off-diagonal entries are all 0's and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
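A concrete sketch of this factorization, using NumPy and a small arbitrary 3 × 2 matrix as the example: it rebuilds A from U, Σ, V^T, and then illustrates "how close A is to a matrix of smaller rank" via the standard fact that dropping the smallest singular value gives the best rank-1 approximation, at spectral-norm distance σ₂.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 1.0]])             # m = 3, n = 2, so m >= n

U, s, Vt = np.linalg.svd(A)            # s = [sigma_1, sigma_2], sigma_1 >= sigma_2 >= 0

# Rebuild the m x n matrix Sigma with the singular values on its diagonal
Sigma = np.zeros_like(A)
Sigma[:len(s), :len(s)] = np.diag(s)
assert np.allclose(U @ Sigma @ Vt, A)  # A = U Sigma V^T

# Best rank-1 approximation: keep only sigma_1; its spectral-norm
# distance from A equals the discarded singular value sigma_2
A1 = s[0] * np.outer(U[:, 0], Vt[0])
assert np.isclose(np.linalg.norm(A - A1, 2), s[1])
```

Note that `np.linalg.svd` returns V^T (not V), and `norm(..., 2)` of a matrix is its largest singular value, which is why the distance check works: A − A₁ = σ₂ u₂ v₂^T.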
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section,
we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results will also hold if m < n).
We will present a method for determining how close A is to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A
is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results will also hold if m < n).
We will present a method for determining how close A is to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrix
with m ≥ n. (This assumption is made for convenience only; allthe results will also hold if m < n).
We will present a method for determining how close A is to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n.
(This assumption is made for convenience only; allthe results will also hold if m < n).
We will present a method for determining how close A is to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption
is made for convenience only; allthe results will also hold if m < n).
We will present a method for determining how close A is to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for
convenience only; allthe results will also hold if m < n).
We will present a method for determining how close A is to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only;
allthe results will also hold if m < n).
We will present a method for determining how close A is to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results
will also hold if m < n).
We will present a method for determining how close A is to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results will also hold
if m < n).
We will present a method for determining how close A is to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results will also hold if m < n).
We will present a method for determining how close A is to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results will also hold if m < n).
We will present
a method for determining how close A is to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results will also hold if m < n).
We will present a method
for determining how close A is to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results will also hold if m < n).
We will present a method for determining how close
A is to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results will also hold if m < n).
We will present a method for determining how close A is
to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results will also hold if m < n).
We will present a method for determining how close A is to amatrix
of smaller rank. The method involves factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results will also hold if m < n).
We will present a method for determining how close A is to amatrix of smaller rank.
The method involves factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results will also hold if m < n).
We will present a method for determining how close A is to amatrix of smaller rank. The method involves
factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results will also hold if m < n).
We will present a method for determining how close A is to amatrix of smaller rank. The method involves factoring A
into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results will also hold if m < n).
We will present a method for determining how close A is to amatrix of smaller rank. The method involves factoring A into aproduct
UΣV T , where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results will also hold if m < n).
We will present a method for determining how close A is to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T ,
where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results will also hold if m < n).
We will present a method for determining how close A is to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T , where U is
an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results will also hold if m < n).
We will present a method for determining how close A is to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T , where U is an m ×m
orthogonal matrix, V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results will also hold if m < n).
We will present a method for determining how close A is to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix,
V is ann × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results will also hold if m < n).
We will present a method for determining how close A is to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is an
n × n orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results will also hold if m < n).
We will present a method for determining how close A is to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is ann × n
orthogonal matrix, and Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results will also hold if m < n).
We will present a method for determining how close A is to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and
Σ is an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Singular Value DecompositionIn this section, we assume throughout that A is an m × n matrixwith m ≥ n. (This assumption is made for convenience only; allthe results will also hold if m < n).
We will present a method for determining how close A is to amatrix of smaller rank. The method involves factoring A into aproduct UΣV T , where U is an m ×m orthogonal matrix, V is ann × n orthogonal matrix, and Σ is
an m × n matrix whoseoff-diagonal entries are all 0′s and whose diagonal elements satisfy
σ1 ≥ σ2 ≥ · · · ≥ σn ≥ 0
Σ = diag(σ1, σ2, . . . , σn), that is, the m × n matrix with σ1, σ2, . . . , σn on the diagonal and zeros elsewhere.

The σ's determined by this factorization are unique and are called the singular values of A. The factorization UΣV^T is called the singular value decomposition of A, or, for short, the SVD of A.
The SVD Theorem

If A is an m × n matrix, then A has a singular value decomposition.

Sketch of the proof

A^T A is a symmetric n × n matrix.

The eigenvalues of A^T A are all real, and A^T A has an orthogonal diagonalizing matrix V.

Furthermore, its eigenvalues must all be nonnegative. To see this, let λ be an eigenvalue of A^T A and x an eigenvector belonging to λ. It follows that
||Ax||² = x^T A^T A x = x^T (λx) = λ x^T x = λ ||x||² ⇒ λ = ||Ax||² / ||x||² ≥ 0

We may assume that the columns of V have been ordered so that the corresponding eigenvalues satisfy λ1 ≥ λ2 ≥ · · · ≥ λn ≥ 0. The singular values are given by

σj = √λj , j = 1, 2, . . . , n
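The relation σj = √λj can be verified numerically: the singular values of A agree with the square roots of the eigenvalues of A^T A. A minimal sketch (the matrix is illustrative, not from the slides):

```python
import numpy as np

# Illustrative 3 x 2 matrix.
A = np.array([[1.0, 1.0],
              [2.0, 0.0],
              [0.0, 2.0]])

# Singular values of A, in nonincreasing order.
s = np.linalg.svd(A, compute_uv=False)

# Eigenvalues of the symmetric matrix A^T A; eigvalsh returns them in
# ascending order, so reverse to match the ordering of s.
lam = np.linalg.eigvalsh(A.T @ A)[::-1]

# As argued above, the eigenvalues of A^T A are nonnegative...
assert np.all(lam >= -1e-12)
# ...and the singular values are their square roots.
assert np.allclose(s, np.sqrt(lam))
```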
Let r denote the rank of A. The matrix A^T A will also have rank r. Since A^T A is symmetric, its rank equals the number of nonzero eigenvalues. Thus,

σ1 ≥ σ2 ≥ · · · ≥ σr > 0,  σr+1 = σr+2 = · · · = σn = 0

Now let V1 = (v1, v2, ..., vr) and V2 = (vr+1, vr+2, ..., vn).

The column vectors of V1 are eigenvectors of A^T A belonging to λi , i = 1, 2, ..., r.

The column vectors of V2 are eigenvectors of A^T A belonging to λj = 0, j = r + 1, r + 2, ..., n.
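The rank statement and the partition of V can be checked numerically (a sketch; the rank-1 matrix A below is an illustrative choice, not from the slides):

```python
import numpy as np

# Rank-deficient example: the second column is twice the first, so r = 1
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])

r = np.linalg.matrix_rank(A)
print(np.linalg.matrix_rank(A.T @ A) == r)   # A^T A has the same rank r

# Eigen-decomposition of A^T A, columns ordered by decreasing eigenvalue
lam, V = np.linalg.eigh(A.T @ A)
lam, V = lam[::-1], V[:, ::-1]

V1, V2 = V[:, :r], V[:, r:]                  # the split used in the text
# V2 spans the eigenspace for lambda = 0, i.e. the null space of A
print(np.allclose(A @ V2, 0))
```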
Now let Σ1 be the r × r diagonal matrix defined by

Σ1 = diag(σ1, σ2, ..., σr)

The m × n matrix Σ is then given in block form by

Σ = ( Σ1  0 )
    ( 0   0 )
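Building Σ from Σ1 is just zero-padding to the shape of A. A minimal sketch (the dimensions m = 4, n = 3, r = 2 and the values of σ are illustrative assumptions):

```python
import numpy as np

# Illustrative dimensions and singular values (not from the slides)
m, n, r = 4, 3, 2
sigma = np.array([5.0, 3.0])      # sigma_1 >= sigma_2 > 0

Sigma1 = np.diag(sigma)           # the r x r diagonal block
Sigma = np.zeros((m, n))          # m x n matrix of zeros
Sigma[:r, :r] = Sigma1            # place Sigma1 in the top-left corner

print(Sigma)
```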
To complete the proof, we must show how to construct an m × m orthogonal matrix U such that

A = UΣV^T, or equivalently, AV = UΣ

Comparing the first r columns of each side of the last equation, we see that

A vi = σi ui , i = 1, 2, ..., r

Thus, if we define

ui = (1/σi) A vi , i = 1, 2, ..., r
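This definition of the ui can be checked numerically: the vectors A vi / σi come out orthonormal because ⟨A vi , A vj⟩ = vi^T A^T A vj = λj vi^T vj. A sketch with an arbitrary full-rank example matrix (not from the slides):

```python
import numpy as np

# Arbitrary full-rank example, so r = n and every sigma_i > 0
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])

lam, V = np.linalg.eigh(A.T @ A)
lam, V = lam[::-1], V[:, ::-1]     # decreasing eigenvalue order
sigma = np.sqrt(lam)

# u_i = (1/sigma_i) A v_i ; broadcasting divides column i by sigma_i
U1 = A @ V / sigma

# The u_i are orthonormal, and A v_i = sigma_i u_i
print(np.allclose(U1.T @ U1, np.eye(2)))
print(np.allclose(A @ V, U1 * sigma))
```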
and

U1 = (u1, u2, ..., ur)

then it follows that

AV1 = U1Σ1

The column vectors of U1 form an orthonormal set; thus, they form an orthonormal basis for R(A). The vector space R(A)⊥ = N(A^T) has dimension m − r. Let {ur+1, ur+2, · · · , um} be an orthonormal basis for N(A^T) and set

U2 = (ur+1, ur+2, ..., um)
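The whole construction can be assembled and verified end to end: U1 from the vi, U2 from an orthonormal basis of N(A^T) (taken here, as a convenient numerical choice, from the eigenvectors of AA^T with zero eigenvalue), and then A = UΣV^T. A sketch with the same style of illustrative matrix as above:

```python
import numpy as np

# Arbitrary illustrative matrix (not from the slides)
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
m, n = A.shape
r = np.linalg.matrix_rank(A)

lam, V = np.linalg.eigh(A.T @ A)
lam, V = lam[::-1], V[:, ::-1]
sigma = np.sqrt(lam[:r])

U1 = A @ V[:, :r] / sigma            # u_i = A v_i / sigma_i

# U2: orthonormal basis of N(A^T), from eigenvectors of A A^T
# belonging to (numerically) zero eigenvalues
mu, W = np.linalg.eigh(A @ A.T)
U2 = W[:, mu < 1e-10]                # m - r such vectors

U = np.hstack([U1, U2])              # m x m orthogonal matrix
Sigma = np.zeros((m, n))
Sigma[:r, :r] = np.diag(sigma)

print(np.allclose(U @ U.T, np.eye(m)))   # U is orthogonal
print(np.allclose(A, U @ Sigma @ V.T))   # A = U Sigma V^T
```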
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
and
U1 = (u1,u2, ...,ur )
then it follows that
AV1 = U1Σ1
The column vectors of U1 form an orthonormal set. Thus,form an orthonormal basis for R(A). The vector spaceR(A)⊥ = N(AT ) has dimension m − r . Let{ur+1,ur+2, · · · ,un} be an orthonormal basis for N(AT ) andset
U2 = (ur+1,ur+2, ...,un)
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
and
U1 = (u1,u2, ...,ur )
then it follows that
AV1 = U1Σ1
The column vectors of U1 form an orthonormal set. Thus,form an orthonormal basis for R(A). The vector spaceR(A)⊥ = N(AT ) has dimension m − r . Let{ur+1,ur+2, · · · ,un} be an orthonormal basis for N(AT ) andset
U2 = (ur+1,ur+2, ...,un)
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
and
U1 = (u1,u2, ...,ur )
then it follows that
AV1 = U1Σ1
The column vectors of U1 form an orthonormal set. Thus,form an orthonormal basis for R(A). The vector spaceR(A)⊥ = N(AT ) has dimension m − r . Let{ur+1,ur+2, · · · ,un} be an orthonormal basis for N(AT ) andset
U2 = (ur+1,ur+2, ...,un)
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
and
U1 = (u1,u2, ...,ur )
then it follows that
AV1 = U1Σ1
The column vectors of U1 form an orthonormal set. Thus,form an orthonormal basis for R(A). The vector spaceR(A)⊥ = N(AT ) has dimension m − r . Let{ur+1,ur+2, · · · ,un} be an orthonormal basis for N(AT ) andset
U2 = (ur+1,ur+2, ...,un)
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
and
U1 = (u1,u2, ...,ur )
then it follows that
AV1 = U1Σ1
The column vectors
of U1 form an orthonormal set. Thus,form an orthonormal basis for R(A). The vector spaceR(A)⊥ = N(AT ) has dimension m − r . Let{ur+1,ur+2, · · · ,un} be an orthonormal basis for N(AT ) andset
U2 = (ur+1,ur+2, ...,un)
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
and
U1 = (u1,u2, ...,ur )
then it follows that
AV1 = U1Σ1
The column vectors of U1
form an orthonormal set. Thus,form an orthonormal basis for R(A). The vector spaceR(A)⊥ = N(AT ) has dimension m − r . Let{ur+1,ur+2, · · · ,un} be an orthonormal basis for N(AT ) andset
U2 = (ur+1,ur+2, ...,un)
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
and
U1 = (u1,u2, ...,ur )
then it follows that
AV1 = U1Σ1
The column vectors of U1 form an orthonormal set.
Thus,form an orthonormal basis for R(A). The vector spaceR(A)⊥ = N(AT ) has dimension m − r . Let{ur+1,ur+2, · · · ,un} be an orthonormal basis for N(AT ) andset
U2 = (ur+1,ur+2, ...,un)
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Introduction
Set the m × m matrix U by

U = (U1, U2)

Then the matrices U, Σ, and V satisfy

A = UΣV^T  □
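The full factorization can be verified numerically; below is a minimal sketch using NumPy (the library and the test matrix are illustrative choices, not part of the lecture):

```python
import numpy as np

# A small hypothetical 3 x 2 matrix; any real m x n matrix works.
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [1.0, 1.0]])

# np.linalg.svd returns U (m x m), the singular values, and V^T (n x n).
U, s, Vt = np.linalg.svd(A)

# Assemble the m x n matrix Sigma with the singular values on its diagonal.
m, n = A.shape
Sigma = np.zeros((m, n))
Sigma[:n, :n] = np.diag(s)

# U, Sigma, and V satisfy A = U Sigma V^T, with U and V orthogonal.
assert np.allclose(A, U @ Sigma @ Vt)
assert np.allclose(U.T @ U, np.eye(m))
assert np.allclose(Vt @ Vt.T, np.eye(n))
```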
Observations

Let A be an m × n matrix with a singular value decomposition A = UΣV^T.

The singular values σ1, ..., σn of A are unique; however, the matrices U and V are not unique.

Since V diagonalizes A^T A, it follows that the vj's are eigenvectors of A^T A.

Since AA^T = UΣΣ^T U^T, it follows that U diagonalizes AA^T and that the uj's are eigenvectors of AA^T.

The vj's are called the right singular vectors of A, and the uj's are called the left singular vectors of A.
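These observations can be checked directly; a short NumPy sketch (the matrix is a made-up example, not from the lecture):

```python
import numpy as np

# Hypothetical 2 x 3 matrix for illustration.
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])

U, s, Vt = np.linalg.svd(A)   # full SVD: U is 2 x 2, Vt is 3 x 3
V = Vt.T

# V diagonalizes A^T A: the diagonal entries are sigma_j^2 (padded with 0).
d = np.concatenate([s**2, np.zeros(A.shape[1] - s.size)])
assert np.allclose(V.T @ (A.T @ A) @ V, np.diag(d))

# U diagonalizes A A^T = U Sigma Sigma^T U^T.
assert np.allclose(U.T @ (A @ A.T) @ U, np.diag(s**2))
```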
If A has rank r, then

(i) v1, v2, ..., vr form an orthonormal basis for R(A^T).

(ii) vr+1, vr+2, ..., vn form an orthonormal basis for N(A).

(iii) u1, u2, ..., ur form an orthonormal basis for R(A).

(iv) ur+1, ur+2, ..., um form an orthonormal basis for N(A^T).

The rank of the matrix A is equal to the number of its nonzero singular values (where singular values are counted according to multiplicity).
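A sketch of properties (ii), (iv), and the rank statement with NumPy (the rank-1 matrix below is a hypothetical example):

```python
import numpy as np

# Hypothetical rank-1 matrix: every row is a multiple of (1, 2).
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])

U, s, Vt = np.linalg.svd(A)

# Numerical rank = number of singular values above a small tolerance.
tol = max(A.shape) * np.finfo(float).eps * s[0]
r = int(np.sum(s > tol))
assert r == 1

# (ii) v_{r+1}, ..., v_n lie in N(A):  A v = 0.
assert np.allclose(A @ Vt[r:].T, 0.0)

# (iv) u_{r+1}, ..., u_m lie in N(A^T):  A^T u = 0.
assert np.allclose(A.T @ U[:, r:], 0.0)
```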
In the case that A has rank r < n, if we set

V1 = (v1, v2, ..., vr),  U1 = (u1, u2, ..., ur)

and define Σ1 as before, then

A = U1Σ1V1^T

This factorization is called the compact form of the singular value decomposition of A. This form is useful in many applications.
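The compact form amounts to keeping only the first r singular triplets of a "thin" SVD; a minimal NumPy sketch, using an illustrative rank-1 matrix:

```python
import numpy as np

# Hypothetical 3 x 2 rank-1 matrix.
A = np.outer([1.0, 2.0, 3.0], [4.0, 5.0])

# full_matrices=False gives the thin SVD; then truncate to rank r.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
tol = max(A.shape) * np.finfo(float).eps * s[0]
r = int(np.sum(s > tol))

U1, Sigma1, V1t = U[:, :r], np.diag(s[:r]), Vt[:r]

# The compact factorization A = U1 Sigma1 V1^T reproduces A (up to rounding).
assert np.allclose(A, U1 @ Sigma1 @ V1t)
```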
SVD. Examples
Example 12.7. Let

A =
[ 1  1 ]
[ 1  1 ]
[ 0  0 ]

Compute the singular values and the singular value decomposition of A.

Solution

The matrix

A^T A =
[ 2  2 ]
[ 2  2 ]
has eigenvalues λ1 = 4 and λ2 = 0. Consequently, the singular values of A are σ1 = 2 and σ2 = 0. The eigenvalue λ1 has eigenvectors of the form α(1, 1)^T, and λ2 has eigenvectors of the form β(1, −1)^T. Therefore, the orthogonal matrix

V = (1/√2) [ 1   1 ]
           [ 1  −1 ]
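The hand calculation in Example 12.7 can be confirmed numerically; a NumPy check (a verification aid, not part of the original solution):

```python
import numpy as np

# The matrix of Example 12.7.
A = np.array([[1.0, 1.0],
              [1.0, 1.0],
              [0.0, 0.0]])

# Singular values: sigma_1 = 2, sigma_2 = 0.
s = np.linalg.svd(A, compute_uv=False)
assert np.allclose(s, [2.0, 0.0])

# They are the square roots of the eigenvalues of A^T A = [[2, 2], [2, 2]].
evals = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
assert np.allclose(evals, [4.0, 0.0])
```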
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
has eigenvalues
λ1 = 4 and λ2 = 0.Consequently, the singularvalues of A are σ1 = 2 and σ2 = 0 The eigenvalue λ1 haseigenvectors of the form α(1, 1)T , and σ2 has eigenvectors of theform β(1, 1)T . Therefore, the orthogonal matrix
V =1√2
(1 11 −1
)
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
has eigenvalues λ1 = 4 and λ2 = 0.
Consequently, the singularvalues of A are σ1 = 2 and σ2 = 0 The eigenvalue λ1 haseigenvectors of the form α(1, 1)T , and σ2 has eigenvectors of theform β(1, 1)T . Therefore, the orthogonal matrix
V =1√2
(1 11 −1
)
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
has eigenvalues λ1 = 4 and λ2 = 0.Consequently,
the singularvalues of A are σ1 = 2 and σ2 = 0 The eigenvalue λ1 haseigenvectors of the form α(1, 1)T , and σ2 has eigenvectors of theform β(1, 1)T . Therefore, the orthogonal matrix
V =1√2
(1 11 −1
)
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
has eigenvalues λ1 = 4 and λ2 = 0.Consequently, the singularvalues
of A are σ1 = 2 and σ2 = 0 The eigenvalue λ1 haseigenvectors of the form α(1, 1)T , and σ2 has eigenvectors of theform β(1, 1)T . Therefore, the orthogonal matrix
V =1√2
(1 11 −1
)
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
has eigenvalues λ1 = 4 and λ2 = 0.Consequently, the singularvalues of A are
σ1 = 2 and σ2 = 0 The eigenvalue λ1 haseigenvectors of the form α(1, 1)T , and σ2 has eigenvectors of theform β(1, 1)T . Therefore, the orthogonal matrix
V =1√2
(1 11 −1
)
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
has eigenvalues λ1 = 4 and λ2 = 0.Consequently, the singularvalues of A are σ1 = 2 and
σ2 = 0 The eigenvalue λ1 haseigenvectors of the form α(1, 1)T , and σ2 has eigenvectors of theform β(1, 1)T . Therefore, the orthogonal matrix
V =1√2
(1 11 −1
)
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
has eigenvalues λ1 = 4 and λ2 = 0.Consequently, the singularvalues of A are σ1 = 2 and σ2 = 0
The eigenvalue λ1 haseigenvectors of the form α(1, 1)T , and σ2 has eigenvectors of theform β(1, 1)T . Therefore, the orthogonal matrix
V =1√2
(1 11 −1
)
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
has eigenvalues λ1 = 4 and λ2 = 0.Consequently, the singularvalues of A are σ1 = 2 and σ2 = 0 The eigenvalue
λ1 haseigenvectors of the form α(1, 1)T , and σ2 has eigenvectors of theform β(1, 1)T . Therefore, the orthogonal matrix
V =1√2
(1 11 −1
)
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
has eigenvalues λ1 = 4 and λ2 = 0.Consequently, the singularvalues of A are σ1 = 2 and σ2 = 0 The eigenvalue λ1
haseigenvectors of the form α(1, 1)T , and σ2 has eigenvectors of theform β(1, 1)T . Therefore, the orthogonal matrix
V =1√2
(1 11 −1
)
diagonalizes A^T A. From what we discussed before, it follows that

$$u_1 = \frac{1}{\sigma_1}Av_1 = \frac{1}{2}\begin{pmatrix} 1 & 1 \\ 1 & 1 \\ 0 & 0 \end{pmatrix}\begin{pmatrix} 1/\sqrt{2} \\ 1/\sqrt{2} \end{pmatrix} = \begin{pmatrix} 1/\sqrt{2} \\ 1/\sqrt{2} \\ 0 \end{pmatrix}$$

The remaining column vectors of U must form an orthonormal basis for N(A^T). We can compute a basis {x2, x3} for N(A^T) in the usual way:

$$x_2 = (1, -1, 0)^T, \qquad x_3 = (0, 0, 1)^T$$

Since these vectors are already orthogonal, it is not necessary to use the Gram-Schmidt process to obtain an orthonormal basis.
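The quantities σ1, v1, and u1 computed above can be checked numerically. A minimal NumPy sketch, assuming the 3×2 matrix A = (1 1; 1 1; 0 0) of this running example:

```python
import numpy as np

# The 3x2 matrix of the running example
A = np.array([[1.0, 1.0],
              [1.0, 1.0],
              [0.0, 0.0]])

# Right singular vectors come from the eigen-decomposition of A^T A
eigvals, eigvecs = np.linalg.eigh(A.T @ A)   # eigh returns eigenvalues ascending: (0, 4)
order = np.argsort(eigvals)[::-1]            # reorder descending: (4, 0)
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

sigma1 = np.sqrt(eigvals[0])                 # sigma_1 = 2 (up to floating point)
v1 = eigvecs[:, 0]                           # eigenvector for lambda_1 = 4
u1 = (1.0 / sigma1) * (A @ v1)               # u_1 = (1/sigma_1) A v_1

print(sigma1)
print(u1)    # (1/sqrt(2), 1/sqrt(2), 0), possibly with both entries negated
```

Note that `eigh` fixes eigenvector signs arbitrarily, so v1 and u1 may both come out negated; the factorization A = UΣV^T is unaffected.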
We need only set

$$u_2 = \frac{1}{\|x_2\|}x_2 = \left(\frac{1}{\sqrt{2}}, -\frac{1}{\sqrt{2}}, 0\right)^T, \qquad u_3 = \frac{1}{\|x_3\|}x_3 = (0, 0, 1)^T$$

It then follows that

$$A = U\Sigma V^T = \begin{pmatrix} 1/\sqrt{2} & 1/\sqrt{2} & 0 \\ 1/\sqrt{2} & -1/\sqrt{2} & 0 \\ 0 & 0 & 1 \end{pmatrix}\begin{pmatrix} 2 & 0 \\ 0 & 0 \\ 0 & 0 \end{pmatrix}\begin{pmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ 1/\sqrt{2} & -1/\sqrt{2} \end{pmatrix}$$
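As a quick sanity check (a sketch, not part of the original slides), multiplying the three factors back together recovers A:

```python
import numpy as np

s = 1.0 / np.sqrt(2.0)

# Factors assembled in the example above
U = np.array([[s,    s,   0.0],
              [s,   -s,   0.0],
              [0.0,  0.0, 1.0]])
Sigma = np.array([[2.0, 0.0],
                  [0.0, 0.0],
                  [0.0, 0.0]])
Vt = np.array([[s,  s],
               [s, -s]])

# U and V are orthogonal, and U Sigma V^T reproduces A
assert np.allclose(U.T @ U, np.eye(3))
assert np.allclose(Vt @ Vt.T, np.eye(2))
print(U @ Sigma @ Vt)   # [[1. 1.] [1. 1.] [0. 0.]]
```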
Example 12.8

Let

$$A = \begin{pmatrix} 3 & 2 & 2 \\ 2 & 3 & -2 \end{pmatrix}$$

Compute the singular values and the singular value decomposition of A.

Solution

The matrix

$$AA^T = \begin{pmatrix} 17 & 8 \\ 8 & 17 \end{pmatrix}$$
has eigenvalues λ1 = 25 and λ2 = 9. Consequently, the singular values of A are σ1 = 5 and σ2 = 3, and from here we could find the left singular vectors (the columns of U), but we will do it in a different way at the end.

Now we find the right singular vectors (the columns of V) by finding an orthonormal set of eigenvectors of A^T A. It is also possible to proceed by finding the left singular vectors (columns of U) instead. The eigenvalues of A^T A are 25, 9, and 0, and since A^T A is symmetric we know that the eigenvectors will be orthogonal.

For λ = 25, we have

$$A^TA - 25I = \begin{pmatrix} -12 & 12 & 2 \\ 12 & -12 & -2 \\ 2 & -2 & -17 \end{pmatrix}$$
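The eigenvalues 25 and 9 of AA^T, and hence the singular values 5 and 3, can be confirmed with a short NumPy sketch:

```python
import numpy as np

# Matrix of Example 12.8
A = np.array([[3.0, 2.0,  2.0],
              [2.0, 3.0, -2.0]])

# Singular values are the square roots of the eigenvalues of A A^T
eigvals = np.linalg.eigvalsh(A @ A.T)        # ascending order: (9, 25)
print(np.sqrt(eigvals[::-1]))                # [5. 3.]

# Equivalently, ask for the singular values directly
print(np.linalg.svd(A, compute_uv=False))    # [5. 3.]
```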
which row-reduces to

( 1  −1  0
  0   0  1
  0   0  0 )

A unit-length vector in the kernel of that matrix is

v1 = (1/√2) ( 1
              1
              0 )
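This kernel computation can be verified numerically; the sketch below (not from the slides) checks that v1 is a unit eigenvector of A^T A for λ = 25:

```python
import numpy as np

A = np.array([[3.0, 2.0, 2.0],
              [2.0, 3.0, -2.0]])

# A^T A = [[13, 12, 2], [12, 13, -2], [2, -2, 8]].
ATA = A.T @ A

v1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)

# v1 lies in the kernel of A^T A - 25 I, i.e. it is a 25-eigenvector.
print((ATA - 25 * np.eye(3)) @ v1)   # ~ [0. 0. 0.]
print(np.linalg.norm(v1))            # 1.0
```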
For λ = 9, we have

A^T A − 9 I = (  4   12    2
                12    4   −2
                 2   −2   −1 )
which row-reduces to

( 1  0  −1/4
  0  1   1/4
  0  0    0  )

A unit-length vector in the kernel of that matrix is

v2 = (1/√18) (  1
               −1
                4 )

For the last eigenvector, we could compute the kernel of A^T A or find a unit vector perpendicular to v1 and v2. To be perpendicular to v1, the vector v3^T = (a, b, c) must satisfy b = −a; and to be perpendicular to v2, the vector v3^T = (a, −a, c) must satisfy

2a/√18 + 4c/√18 = 0.
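One quick way to realize both perpendicularity conditions at once is the cross product v3 = v1 × v2, which is automatically a unit vector here since v1 and v2 are orthonormal. The sketch below (an illustrative check, not the lecture's derivation) assembles the full decomposition and confirms A = U Σ V^T, using u_i = A v_i / σ_i for the left singular vectors:

```python
import numpy as np

A = np.array([[3.0, 2.0, 2.0],
              [2.0, 3.0, -2.0]])

v1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
v2 = np.array([1.0, -1.0, 4.0]) / np.sqrt(18)
v3 = np.cross(v1, v2)          # (2, -2, -1)/3, orthogonal to v1 and v2

V = np.column_stack([v1, v2, v3])
Sigma = np.array([[5.0, 0.0, 0.0],
                  [0.0, 3.0, 0.0]])

# Left singular vectors from u_i = A v_i / sigma_i.
U = np.column_stack([A @ v1 / 5.0, A @ v2 / 3.0])

print(v3)                                  # [ 0.6667 -0.6667 -0.3333]
print(np.allclose(U @ Sigma @ V.T, A))     # True
```

The check `U @ Sigma @ V.T == A` confirms that the vectors computed by hand really do assemble into a singular value decomposition of A.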
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
which row-reduces to 1 0 −1/40 1 1/40 0 0
A unit-length vector in the kernel of that matrix
v2 =1√18
1−14
For the last eigenvector, we could compute the kernel of ATA orfind a unit vector perpendicular to v1 and v2. To be perpendicularto v1, vT3 = (a, b, c), must satisfy that −a = b and to beperpendicular to v2, vT3 = (a,−a, c), must satisfy that2√18
+ 4c√18
= 0.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
which row-reduces to 1 0 −1/40 1 1/40 0 0
A unit-length vector
in the kernel of that matrix
v2 =1√18
1−14
For the last eigenvector, we could compute the kernel of ATA orfind a unit vector perpendicular to v1 and v2. To be perpendicularto v1, vT3 = (a, b, c), must satisfy that −a = b and to beperpendicular to v2, vT3 = (a,−a, c), must satisfy that2√18
+ 4c√18
= 0.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
which row-reduces to 1 0 −1/40 1 1/40 0 0
A unit-length vector in the kernel of that matrix
v2 =1√18
1−14
For the last eigenvector, we could compute the kernel of ATA orfind a unit vector perpendicular to v1 and v2. To be perpendicularto v1, vT3 = (a, b, c), must satisfy that −a = b and to beperpendicular to v2, vT3 = (a,−a, c), must satisfy that2√18
+ 4c√18
= 0.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
which row-reduces to 1 0 −1/40 1 1/40 0 0
A unit-length vector in the kernel of that matrix
v2 =
1√18
1−14
For the last eigenvector, we could compute the kernel of ATA orfind a unit vector perpendicular to v1 and v2. To be perpendicularto v1, vT3 = (a, b, c), must satisfy that −a = b and to beperpendicular to v2, vT3 = (a,−a, c), must satisfy that2√18
+ 4c√18
= 0.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
which row-reduces to 1 0 −1/40 1 1/40 0 0
A unit-length vector in the kernel of that matrix
v2 =1√18
1−14
For the last eigenvector, we could compute the kernel of ATA orfind a unit vector perpendicular to v1 and v2. To be perpendicularto v1, vT3 = (a, b, c), must satisfy that −a = b and to beperpendicular to v2, vT3 = (a,−a, c), must satisfy that2√18
+ 4c√18
= 0.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
which row-reduces to 1 0 −1/40 1 1/40 0 0
A unit-length vector in the kernel of that matrix
v2 =1√18
1−14
For the last eigenvector, we could compute the kernel of ATA orfind a unit vector perpendicular to v1 and v2. To be perpendicularto v1, vT3 = (a, b, c), must satisfy that −a = b and to beperpendicular to v2, vT3 = (a,−a, c), must satisfy that2√18
+ 4c√18
= 0.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
which row-reduces to 1 0 −1/40 1 1/40 0 0
A unit-length vector in the kernel of that matrix
v2 =1√18
1−14
For the last eigenvector,
we could compute the kernel of ATA orfind a unit vector perpendicular to v1 and v2. To be perpendicularto v1, vT3 = (a, b, c), must satisfy that −a = b and to beperpendicular to v2, vT3 = (a,−a, c), must satisfy that2√18
+ 4c√18
= 0.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
which row-reduces to 1 0 −1/40 1 1/40 0 0
A unit-length vector in the kernel of that matrix
v2 =1√18
1−14
For the last eigenvector, we could compute
the kernel of ATA orfind a unit vector perpendicular to v1 and v2. To be perpendicularto v1, vT3 = (a, b, c), must satisfy that −a = b and to beperpendicular to v2, vT3 = (a,−a, c), must satisfy that2√18
+ 4c√18
= 0.
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
which row-reduces to 1 0 −1/40 1 1/40 0 0
A unit-length vector in the kernel of that matrix
v2 =1√18
1−14
For the last eigenvector, we could compute the kernel of ATA or
find a unit vector perpendicular to v1 and v2. To be perpendicularto v1, vT3 = (a, b, c), must satisfy that −a = b and to beperpendicular to v2, vT3 = (a,−a, c), must satisfy that2√18
+ 4c√18
= 0.
So

\[
v_3 = \begin{pmatrix} a \\ -a \\ -a/2 \end{pmatrix}
\]

and for it to be unit-length we need \(a = 2/3\):

\[
v_3 = \begin{pmatrix} 2/3 \\ -2/3 \\ -1/3 \end{pmatrix}
\]

So at this point we know that

\[
A = U \Sigma V^T = U
\begin{pmatrix} 5 & 0 & 0 \\ 0 & 3 & 0 \end{pmatrix}
\begin{pmatrix} 1/\sqrt{2} & 1/\sqrt{2} & 0 \\ 1/\sqrt{18} & -1/\sqrt{18} & 4/\sqrt{18} \\ 2/3 & -2/3 & -1/3 \end{pmatrix}
\]
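A quick numerical cross-check of this step: since \(v_1\) and \(v_2\) are orthonormal, their cross product is already the unit vector perpendicular to both, so no separate normalization is needed. A minimal sketch (NumPy assumed):

```python
import numpy as np

# v1 and v2 are the orthonormal right singular vectors found above.
v1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
v2 = np.array([1.0, -1.0, 4.0]) / np.sqrt(18)

# The cross product of two orthonormal vectors is a unit vector
# perpendicular to both.
v3 = np.cross(v1, v2)
print(v3)  # approximately [2/3, -2/3, -1/3]
```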
Finally, we can compute \(U\) by the formula \(\sigma_i u_i = A v_i\), or \(u_i = \frac{1}{\sigma_i} A v_i\). This gives

\[
U = \begin{pmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ 1/\sqrt{2} & -1/\sqrt{2} \end{pmatrix}
\]

So the full SVD is:

\[
A = U \Sigma V^T =
\begin{pmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ 1/\sqrt{2} & -1/\sqrt{2} \end{pmatrix}
\begin{pmatrix} 5 & 0 & 0 \\ 0 & 3 & 0 \end{pmatrix}
\begin{pmatrix} 1/\sqrt{2} & 1/\sqrt{2} & 0 \\ 1/\sqrt{18} & -1/\sqrt{18} & 4/\sqrt{18} \\ 2/3 & -2/3 & -1/3 \end{pmatrix}
\]
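The formula \(u_i = \frac{1}{\sigma_i} A v_i\) is easy to verify numerically. The original matrix is not restated on this slide; the sketch below assumes \(A = \begin{pmatrix} 3 & 2 & 2 \\ 2 & 3 & -2 \end{pmatrix}\), which is the matrix consistent with the singular values (5, 3) and the right singular vectors computed above:

```python
import numpy as np

# Assumption: A is the 2x3 matrix whose SVD this example computes
# (singular values 5 and 3, right singular vectors v1 and v2 above).
A = np.array([[3.0, 2.0, 2.0],
              [2.0, 3.0, -2.0]])
sigma = np.array([5.0, 3.0])
V = np.column_stack([
    [1/np.sqrt(2), 1/np.sqrt(2), 0.0],               # v1
    [1/np.sqrt(18), -1/np.sqrt(18), 4/np.sqrt(18)],  # v2
])

# Column i of U is A v_i / sigma_i.
U = (A @ V) / sigma
print(U)  # approximately [[1/sqrt(2), 1/sqrt(2)], [1/sqrt(2), -1/sqrt(2)]]
```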
OBS

If \(A\) has singular value decomposition \(U \Sigma V^T\), then \(A\) can be represented by the outer product expansion

\[
A = \sigma_1 u_1 v_1^T + \sigma_2 u_2 v_2^T + \cdots + \sigma_n u_n v_n^T
\]

The closest matrix of rank \(k\) is obtained by truncating this sum after the first \(k\) terms:

\[
A' = \sigma_1 u_1 v_1^T + \sigma_2 u_2 v_2^T + \cdots + \sigma_k u_k v_k^T, \quad k < n
\]
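This truncation is the standard recipe for low-rank approximation (it is closest in both the spectral and Frobenius norms, by the Eckart-Young theorem). A minimal sketch with NumPy; the matrix \(A\) here is an assumption, chosen to match the singular values 5 and 3 of the worked example:

```python
import numpy as np

# Illustrative matrix (assumed); any real matrix works the same way.
A = np.array([[3.0, 2.0, 2.0],
              [2.0, 3.0, -2.0]])

U, s, Vt = np.linalg.svd(A)  # s holds singular values in decreasing order

k = 1  # keep only the first k terms of the outer product expansion
A_k = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(k))

# Eckart-Young: the spectral-norm error of the best rank-k approximation
# equals the (k+1)-th singular value.
err = np.linalg.norm(A - A_k, 2)
print(np.allclose(err, s[k]))  # True
```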
Example 12.9

Let

\[
A = \begin{pmatrix} 0 & 1 & 1 \\ \sqrt{2} & 2 & 0 \\ 0 & 1 & 1 \end{pmatrix}
\]

Compute the singular values and the singular value decomposition of \(A\).

Solution

The matrix

\[
A A^T = \begin{pmatrix} 2 & 2 & 2 \\ 2 & 6 & 2 \\ 2 & 2 & 2 \end{pmatrix}
\]
The characteristic polynomial is

\[
-\lambda^3 + 10\lambda^2 - 16\lambda = -\lambda(\lambda^2 - 10\lambda + 16) = -\lambda(\lambda - 8)(\lambda - 2) = 0
\]

and \(A A^T\) has eigenvalues \(\lambda_1 = 8\), \(\lambda_2 = 2\), and \(\lambda_3 = 0\). Consequently, the singular values of \(A\) are \(\sigma_1 = 2\sqrt{2}\), \(\sigma_2 = \sqrt{2}\), and \(\sigma_3 = 0\).

To give the decomposition, we consider the diagonal matrix of singular values

\[
\Sigma = \begin{pmatrix} 2\sqrt{2} & 0 & 0 \\ 0 & \sqrt{2} & 0 \\ 0 & 0 & 0 \end{pmatrix}
\]
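These singular values can be cross-checked numerically: the eigenvalues of \(A A^T\) are the squared singular values. A minimal sketch (NumPy assumed):

```python
import numpy as np

# The matrix from Example 12.9.
A = np.array([[0.0, 1.0, 1.0],
              [np.sqrt(2.0), 2.0, 0.0],
              [0.0, 1.0, 1.0]])

# A A^T is symmetric, so eigvalsh applies; it returns eigenvalues in
# ascending order, so reverse to get sigma_1 >= sigma_2 >= sigma_3.
evals = np.linalg.eigvalsh(A @ A.T)[::-1]
sigmas = np.sqrt(np.clip(evals, 0.0, None))  # clip guards tiny negative round-off
print(sigmas)  # approximately [2*sqrt(2), sqrt(2), 0]
```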
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
The characteristic polynomial is
−λ3 + 10λ2− 16λ = −λ(λ2− 10λ+ 16) = −λ(λ− 8)(λ− 2) = 0
and AAT has eigenvalues λ1 = 8, λ2 = 2 and λ3 = 0.Consequently, the singular values of A are σ1 = 2
√2, σ2 =
√2 and
σ3 = 0
To give the decomposition, we consider the diagonal matrix ofsingular values
Σ =
2√
2 0 0
0√
2 00 0 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
The characteristic polynomial is
−λ3 + 10λ2− 16λ =
−λ(λ2− 10λ+ 16) = −λ(λ− 8)(λ− 2) = 0
and AAT has eigenvalues λ1 = 8, λ2 = 2 and λ3 = 0.Consequently, the singular values of A are σ1 = 2
√2, σ2 =
√2 and
σ3 = 0
To give the decomposition, we consider the diagonal matrix ofsingular values
Σ =
2√
2 0 0
0√
2 00 0 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
The characteristic polynomial is
−λ3 + 10λ2− 16λ = −λ(λ2− 10λ+ 16) =
−λ(λ− 8)(λ− 2) = 0
and AAT has eigenvalues λ1 = 8, λ2 = 2 and λ3 = 0.Consequently, the singular values of A are σ1 = 2
√2, σ2 =
√2 and
σ3 = 0
To give the decomposition, we consider the diagonal matrix ofsingular values
Σ =
2√
2 0 0
0√
2 00 0 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
The characteristic polynomial is
−λ3 + 10λ2− 16λ = −λ(λ2− 10λ+ 16) = −λ(λ− 8)(λ− 2) = 0
and AAT has eigenvalues λ1 = 8, λ2 = 2 and λ3 = 0.Consequently, the singular values of A are σ1 = 2
√2, σ2 =
√2 and
σ3 = 0
To give the decomposition, we consider the diagonal matrix ofsingular values
Σ =
2√
2 0 0
0√
2 00 0 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
The characteristic polynomial is
−λ3 + 10λ2− 16λ = −λ(λ2− 10λ+ 16) = −λ(λ− 8)(λ− 2) = 0
and AAT has eigenvalues
λ1 = 8, λ2 = 2 and λ3 = 0.Consequently, the singular values of A are σ1 = 2
√2, σ2 =
√2 and
σ3 = 0
To give the decomposition, we consider the diagonal matrix ofsingular values
Σ =
2√
2 0 0
0√
2 00 0 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
The characteristic polynomial is
−λ3 + 10λ2− 16λ = −λ(λ2− 10λ+ 16) = −λ(λ− 8)(λ− 2) = 0
and AAT has eigenvalues λ1 = 8,
λ2 = 2 and λ3 = 0.Consequently, the singular values of A are σ1 = 2
√2, σ2 =
√2 and
σ3 = 0
To give the decomposition, we consider the diagonal matrix ofsingular values
Σ =
2√
2 0 0
0√
2 00 0 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
The characteristic polynomial is
−λ3 + 10λ2− 16λ = −λ(λ2− 10λ+ 16) = −λ(λ− 8)(λ− 2) = 0
and AAT has eigenvalues λ1 = 8, λ2 = 2 and
λ3 = 0.Consequently, the singular values of A are σ1 = 2
√2, σ2 =
√2 and
σ3 = 0
To give the decomposition, we consider the diagonal matrix ofsingular values
Σ =
2√
2 0 0
0√
2 00 0 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
The characteristic polynomial is
−λ3 + 10λ2− 16λ = −λ(λ2− 10λ+ 16) = −λ(λ− 8)(λ− 2) = 0
and AAT has eigenvalues λ1 = 8, λ2 = 2 and λ3 = 0.
Consequently, the singular values of A are σ1 = 2√
2, σ2 =√
2 andσ3 = 0
To give the decomposition, we consider the diagonal matrix ofsingular values
Σ =
2√
2 0 0
0√
2 00 0 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
The characteristic polynomial is
−λ3 + 10λ2− 16λ = −λ(λ2− 10λ+ 16) = −λ(λ− 8)(λ− 2) = 0
and AAT has eigenvalues λ1 = 8, λ2 = 2 and λ3 = 0.Consequently,
the singular values of A are σ1 = 2√
2, σ2 =√
2 andσ3 = 0
To give the decomposition, we consider the diagonal matrix ofsingular values
Σ =
2√
2 0 0
0√
2 00 0 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
The characteristic polynomial is
−λ3 + 10λ2− 16λ = −λ(λ2− 10λ+ 16) = −λ(λ− 8)(λ− 2) = 0
and AAT has eigenvalues λ1 = 8, λ2 = 2 and λ3 = 0.Consequently, the singular values
of A are σ1 = 2√
2, σ2 =√
2 andσ3 = 0
To give the decomposition, we consider the diagonal matrix ofsingular values
Σ =
2√
2 0 0
0√
2 00 0 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra ISingular Value Decomposition (SVD)
SVD. IntroductionSVD. Examples
SVD. Examples
The characteristic polynomial is
−λ3 + 10λ2− 16λ = −λ(λ2− 10λ+ 16) = −λ(λ− 8)(λ− 2) = 0
and AAT has eigenvalues λ1 = 8, λ2 = 2 and λ3 = 0.Consequently, the singular values of A are
σ1 = 2√
2, σ2 =√
2 andσ3 = 0
To give the decomposition, we consider the diagonal matrix ofsingular values
Σ =
2√
2 0 0
0√
2 00 0 0
Dr. Marco A Roque Sol Linear Algebra. Session 9
Abstract Linear Algebra I: Singular Value Decomposition (SVD)

SVD. Examples

The characteristic polynomial is

$$-\lambda^3 + 10\lambda^2 - 16\lambda = -\lambda(\lambda^2 - 10\lambda + 16) = -\lambda(\lambda - 8)(\lambda - 2) = 0,$$

and $AA^T$ has eigenvalues $\lambda_1 = 8$, $\lambda_2 = 2$, and $\lambda_3 = 0$. Consequently, the singular values of $A$ are $\sigma_1 = \sqrt{8} = 2\sqrt{2}$, $\sigma_2 = \sqrt{2}$, and $\sigma_3 = 0$.

To give the decomposition, we consider the diagonal matrix of singular values

$$\Sigma = \begin{pmatrix} 2\sqrt{2} & 0 & 0 \\ 0 & \sqrt{2} & 0 \\ 0 & 0 & 0 \end{pmatrix}$$

Dr. Marco A Roque Sol, Linear Algebra. Session 9
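The step from characteristic polynomial to singular values can be cross-checked numerically. The sketch below (NumPy, an added illustration, not part of the original session) finds the roots of the polynomial above and takes their square roots:

```python
import numpy as np

# Characteristic polynomial -l^3 + 10*l^2 - 16*l, coefficients highest degree first
coeffs = [-1, 10, -16, 0]

# Its roots are the eigenvalues of A A^T: 8, 2, and 0
eigvals = np.sort(np.roots(coeffs).real)[::-1]

# The singular values of A are the square roots of these eigenvalues
sigmas = np.sqrt(eigvals)

print(eigvals)  # [8. 2. 0.]
print(sigmas)   # 2*sqrt(2), sqrt(2), 0
```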
Next, we find an orthonormal set of eigenvectors for $AA^T$. The eigenvalues of $AA^T$ are 8, 2, and 0, and since $AA^T$ is symmetric, eigenvectors belonging to distinct eigenvalues are automatically orthogonal. This gives the eigenvectors

$$u_1 = \begin{pmatrix} \frac{1}{\sqrt{6}} \\ \frac{2}{\sqrt{6}} \\ \frac{1}{\sqrt{6}} \end{pmatrix}; \quad u_2 = \begin{pmatrix} -\frac{1}{\sqrt{3}} \\ \frac{1}{\sqrt{3}} \\ -\frac{1}{\sqrt{3}} \end{pmatrix}; \quad u_3 = \begin{pmatrix} \frac{1}{\sqrt{2}} \\ 0 \\ -\frac{1}{\sqrt{2}} \end{pmatrix}$$
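That $u_1$, $u_2$, $u_3$ form an orthonormal set is quick to confirm; a short NumPy check (an added illustration, not part of the original session):

```python
import numpy as np

s2, s3, s6 = np.sqrt(2), np.sqrt(3), np.sqrt(6)

# The three eigenvectors of A A^T listed above, stacked as columns of U
u1 = np.array([1/s6, 2/s6, 1/s6])
u2 = np.array([-1/s3, 1/s3, -1/s3])
u3 = np.array([1/s2, 0, -1/s2])
U = np.column_stack([u1, u2, u3])

# Orthonormal columns mean U^T U = I
print(np.allclose(U.T @ U, np.eye(3)))  # True
```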
Put these together to get

$$U = \begin{pmatrix} \frac{1}{\sqrt{6}} & -\frac{1}{\sqrt{3}} & \frac{1}{\sqrt{2}} \\ \frac{2}{\sqrt{6}} & \frac{1}{\sqrt{3}} & 0 \\ \frac{1}{\sqrt{6}} & -\frac{1}{\sqrt{3}} & -\frac{1}{\sqrt{2}} \end{pmatrix}$$

Now we find the right singular vectors (the columns of $V$) by finding an orthonormal set of eigenvectors of $A^TA$, where

$$A^TA = \begin{pmatrix} 2 & 2\sqrt{2} & 0 \\ 2\sqrt{2} & 6 & 2 \\ 0 & 2 & 2 \end{pmatrix}$$

The eigenvalues of $A^TA$ are again 8, 2, and 0, and since $A^TA$ is symmetric, its eigenvectors are likewise orthogonal. This gives the eigenvectors
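As a sanity check, the eigenvalues of $A^TA$ should indeed come out to 8, 2, and 0; a brief NumPy verification (an added illustration, not part of the original session):

```python
import numpy as np

s2 = np.sqrt(2)
AtA = np.array([[2,    2*s2, 0],
                [2*s2, 6,    2],
                [0,    2,    2]])

# eigvalsh is the appropriate routine for a symmetric matrix;
# it returns the eigenvalues in ascending order
eigvals = np.linalg.eigvalsh(AtA)
print(np.allclose(eigvals, [0, 2, 8]))  # True
```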
$$v_1 = \begin{pmatrix} \frac{1}{\sqrt{6}} \\ \frac{3}{\sqrt{12}} \\ \frac{1}{\sqrt{12}} \end{pmatrix}; \quad v_2 = \begin{pmatrix} \frac{1}{\sqrt{3}} \\ 0 \\ -\frac{2}{\sqrt{6}} \end{pmatrix}; \quad v_3 = \begin{pmatrix} \frac{1}{\sqrt{2}} \\ -\frac{1}{2} \\ \frac{1}{2} \end{pmatrix}$$
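Each $v_i$ should satisfy the eigenvector equation $A^TA\,v_i = \lambda_i v_i$ with $\lambda_1 = 8$, $\lambda_2 = 2$, $\lambda_3 = 0$. A small NumPy check (an added illustration, not part of the original session):

```python
import numpy as np

s2, s3, s6, s12 = (np.sqrt(k) for k in (2, 3, 6, 12))
AtA = np.array([[2,    2*s2, 0],
                [2*s2, 6,    2],
                [0,    2,    2]])

# The right singular vectors listed above
v1 = np.array([1/s6, 3/s12, 1/s12])
v2 = np.array([1/s3, 0, -2/s6])
v3 = np.array([1/s2, -1/2, 1/2])

# Verify A^T A v = lambda v for each (eigenvalue, eigenvector) pair
for lam, v in [(8, v1), (2, v2), (0, v3)]:
    print(np.allclose(AtA @ v, lam * v))  # True, three times
```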
Put these together to get

$$V = \begin{pmatrix} \frac{1}{\sqrt{6}} & \frac{1}{\sqrt{3}} & \frac{1}{\sqrt{2}} \\ \frac{3}{\sqrt{12}} & 0 & -\frac{1}{2} \\ \frac{1}{\sqrt{12}} & -\frac{2}{\sqrt{6}} & \frac{1}{2} \end{pmatrix}$$

Finally, we can now verify that $A = U\Sigma V^T$:

$$A = U\Sigma V^T = \begin{pmatrix} \frac{1}{\sqrt{6}} & -\frac{1}{\sqrt{3}} & \frac{1}{\sqrt{2}} \\ \frac{2}{\sqrt{6}} & \frac{1}{\sqrt{3}} & 0 \\ \frac{1}{\sqrt{6}} & -\frac{1}{\sqrt{3}} & -\frac{1}{\sqrt{2}} \end{pmatrix} \begin{pmatrix} 2\sqrt{2} & 0 & 0 \\ 0 & \sqrt{2} & 0 \\ 0 & 0 & 0 \end{pmatrix} \begin{pmatrix} \frac{1}{\sqrt{6}} & \frac{3}{\sqrt{12}} & \frac{1}{\sqrt{12}} \\ \frac{1}{\sqrt{3}} & 0 & -\frac{2}{\sqrt{6}} \\ \frac{1}{\sqrt{2}} & -\frac{1}{2} & \frac{1}{2} \end{pmatrix}$$
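The whole factorization can be confirmed in a few lines. The sketch below (NumPy, an added illustration, not part of the original session) assembles $U$, $\Sigma$, and $V$ from the entries above, checks that $U$ and $V$ are orthogonal, and verifies that $U\Sigma V^T$ has exactly the singular values $2\sqrt{2}$, $\sqrt{2}$, and $0$:

```python
import numpy as np

s2, s3, s6, s12 = (np.sqrt(k) for k in (2, 3, 6, 12))

U = np.array([[1/s6, -1/s3,  1/s2],
              [2/s6,  1/s3,  0],
              [1/s6, -1/s3, -1/s2]])
Sigma = np.diag([2*s2, s2, 0])
V = np.array([[1/s6,   1/s3,  1/s2],
              [3/s12,  0,    -1/2],
              [1/s12, -2/s6,  1/2]])

# U and V must have orthonormal columns
print(np.allclose(U.T @ U, np.eye(3)), np.allclose(V.T @ V, np.eye(3)))  # True True

# Reassemble A and recover its singular values: 2*sqrt(2), sqrt(2), 0
A = U @ Sigma @ V.T
print(np.linalg.svd(A, compute_uv=False))
```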