
Page 1

Chapter 4: Eigenvalues and Eigenvectors

Recall that a linear transformation is a rule that makes one vector into another vector.

Throughout this discussion, we will be dealing with transformations from a space into itself, so all matrices will be square.

(Figure: the transformation maps the vector x to the vector Ax.)

Page 2

In special situations, when A operates on x, we get a scaled version of x back again:

(Figure: Ax lies along x, scaled by λ.)

So we have Ax = λx.

This only happens for special λ's and special x's.

When Ax = λx:

λ is the "eigenvalue"
x is the "eigenvector"

Page 3

For an n x n matrix A, there are:

- n eigenvalues, some of which may be complex and/or repeated.

- At least one eigenvector corresponding to each distinct eigenvalue. These will be complex if the eigenvalues are complex.

- Sometimes repeated eigenvalues have generalized eigenvectors associated with them.

Page 4

How to find eigenvalues and eigenvectors:

Ax = λx

(A − λI)x = 0

We know from the previous chapter that this system of linear equations will have a solution (other than zero) whenever

x ∈ N(A − λI)

For this to happen, the degeneracy of A − λI must be at least one. Equivalently, the rank of the matrix A − λI must be less than full (n).

Page 5

When the rank of a square matrix is deficient, then the determinant of that matrix is zero.

So |A − λI| = 0 ( = |λI − A| )

If we do this computation and solve for λ, then we get the eigenvalues. We will always get an nth-order polynomial in λ, so its n roots will be the eigenvalues λi, i = 1, …, n.

We then solve the linear equation to get the corresponding eigenvectors.

Example: Find the eigenvalues and eigenvectors of

A = [  3   0  −2
       0   3   2
      −2   2   1 ]

(A − λI)x = 0
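A quick numerical sketch of this procedure (not part of the original notes): form the characteristic polynomial of the example matrix above and take its roots.

import numpy as np

A = np.array([[ 3.0, 0.0, -2.0],
              [ 0.0, 3.0,  2.0],
              [-2.0, 2.0,  1.0]])

coeffs = np.poly(A)       # coefficients of det(lambda*I - A), highest power first
print(coeffs)             # approximately [ 1. -7.  7. 15.], i.e. lambda^3 - 7 lambda^2 + 7 lambda + 15
print(np.roots(coeffs))   # the eigenvalues 5, 3, -1 (in some order)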

Page 6

|A − λI| = | 3−λ   0   −2  |
           |  0   3−λ   2  |
           | −2    2   1−λ |

         = (3−λ)[(3−λ)(1−λ) − 4] − 2[2(3−λ)]
         = −λ³ + 7λ² − 7λ − 15
         = −(λ − 5)(λ − 3)(λ + 1) = 0

This happens to be the characteristic equation of the system; the eigenvalues are the poles!

So the three eigenvalues are: λ1 = 5, λ2 = 3, λ3 = −1

Now find the eigenvector associated with λ1:

(A − λI)x1 |λ=λ1=5 = (A − 5I)x1 = [ −2   0  −2
                                      0  −2   2
                                     −2   2  −4 ] x1 = 0

How would you solve this?
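One way to answer "How would you solve this?" by machine (an illustrative sketch, not part of the original notes): the eigenvector for λ = 5 spans the null space of (A − 5I).

import numpy as np
from scipy.linalg import null_space

A = np.array([[ 3.0, 0.0, -2.0],
              [ 0.0, 3.0,  2.0],
              [-2.0, 2.0,  1.0]])

N = null_space(A - 5 * np.eye(3))    # orthonormal basis for N(A - 5I); here a single column
x1 = N[:, 0]
print(x1)                            # proportional to [-1, 1, 1] / sqrt(3)
print(np.allclose(A @ x1, 5 * x1))   # True: it is an eigenvector for lambda = 5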

Page 7

Using your favorite method; e.g., Gaussian elimination, echelon forms, etc., we can get:

x1 = [ −1
        1
        1 ]

Because the equation Ax = λx can be multiplied on both sides by an arbitrary constant and still be true, any scalar multiple of an eigenvector is still an eigenvector. We often express eigenvectors with such a constant:

x1 = c [ −1
          1
          1 ]

Page 8

Better yet, we normalize all eigenvectors so that they all have length 1:

x1 = [ −1/√3
        1/√3
        1/√3 ]      (corres. to λ1 = 5)

Identical procedures give us the other two eigenvectors:

x2 = [ 1/√2
       1/√2
        0   ]      (corres. to λ2 = 3)

x3 = [  1/√6
       −1/√6
        2/√6 ]      (corres. to λ3 = −1)
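As a side note (not part of the original notes), the standard eigensolvers already return unit-length eigenvectors, so the normalization step comes for free; signs may differ, which is fine since any scalar multiple of an eigenvector works.

import numpy as np

A = np.array([[ 3.0, 0.0, -2.0],
              [ 0.0, 3.0,  2.0],
              [-2.0, 2.0,  1.0]])

w, V = np.linalg.eigh(A)            # A is symmetric, so eigh gives orthonormal eigenvectors
print(w)                            # approximately [-1.  3.  5.]
print(np.linalg.norm(V, axis=0))    # each column has length 1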

Page 9

What will happen if we use the eigenvectors as the basis of the vector space that x belongs to?

Pause here to explore an application for the eigenvalues/vectors: Consider the previous example.

Note that the eigenvectors are linearly independent. They therefore form a basis for a 3-D vector space.

Suppose we had a (homogeneous) system:

ẋ = Ax     (no input)

Page 10

Let x̄ denote the state vector in the new basis (x denotes the old state vector). The relationship between x and x̄ is:

x = Mx̄

where M is a matrix whose columns are the new basis vectors.

When M is formed by columns that are eigenvectors, it is called a modal matrix.

Proceeding, x = Mx̄, so ẋ = M dx̄/dt.

ẋ = Ax   ⇒   M dx̄/dt = AMx̄     (substitute)

Page 11

dx̄/dt = M⁻¹AMx̄

Recall that M⁻¹AM is called a similarity transform on A.

With M = [x1  x2  x3] = [ −1/√3   1/√2   1/√6
                            1/√3   1/√2  −1/√6
                            1/√3    0     2/√6 ]     (happens to be orthonormal, so M⁻¹ = Mᵀ)

M⁻¹AM = [ 5   0   0
          0   3   0
          0   0  −1 ]  = Λ

These are the eigenvalues of A!

The modal matrix has diagonalized the system.
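A short numerical sketch of this diagonalization (not part of the original notes), building the modal matrix from computed eigenvectors of the example matrix above:

import numpy as np

A = np.array([[ 3.0, 0.0, -2.0],
              [ 0.0, 3.0,  2.0],
              [-2.0, 2.0,  1.0]])

w, M = np.linalg.eig(A)              # columns of M are eigenvectors (the modal matrix)
Lam = np.linalg.inv(M) @ A @ M       # similarity transform on A

print(np.round(Lam, 10))             # diagonal, with the eigenvalues of A on the diagonal
print(np.allclose(Lam, np.diag(w)))  # True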

Page 12

"New" system is

−=

−=

=

3

2

1

3

2

1

3

2

1

3

2

1

53

500030001

xxx

xxx

xxx

Axxx

&

&&

These are three "decoupled" first order linear differential equations.

A more general system would transform as:

DuCxyBuAxx

+=+=&

xMxxMx&& =

=DuxCMy

BuMxAMMx+=

+= −− 11&A B

C D
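The bookkeeping for a full state-space model can be sketched as below (not part of the original notes); the B, C, D values here are made up purely to illustrate Ā = M⁻¹AM, B̄ = M⁻¹B, C̄ = CM, D̄ = D.

import numpy as np

A = np.array([[ 3.0, 0.0, -2.0],
              [ 0.0, 3.0,  2.0],
              [-2.0, 2.0,  1.0]])
B = np.array([[1.0], [0.0], [1.0]])   # assumed input matrix (not from the notes)
C = np.array([[1.0, 1.0, 0.0]])       # assumed output matrix (not from the notes)
D = np.array([[0.0]])

_, M = np.linalg.eig(A)               # modal matrix
Minv = np.linalg.inv(M)

A_bar = Minv @ A @ M                  # decoupled (diagonal) state matrix
B_bar = Minv @ B
C_bar = C @ M
D_bar = D

print(np.round(A_bar, 10))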

Page 13

Something different happens when we have one or more eigenvalues that are repeated (multiple roots of the characteristic equation).

Example: Find the eigenvalues and eigenvectors of

A = [  1  2  0
       0  1  0
      −3  3  5 ]

0 = |A − λI| = −(λ − 5)(λ − 1)²     So λ1 = 5, λ2 = λ3 = 1

(1 is an eigenvalue of algebraic multiplicity 2)

Find the eigenvectors:

Page 14

Eigenvector corresponding to λ1 = 5:

(A − 5I)x = [ −4   2   0
               0  −4   0
              −3   3   0 ] x = 0     so x1 = c [ 0
                                                 0
                                                 1 ]

Now for the eigenvector(s?) corresponding to λ = 1:

(A − I)x = [  0   2   0
              0   0   0
             −3   3   4 ] x = 0     rank = 2, so there will be only 1 nontrivial solution:

x = a [  1
         0
        3/4 ]

Three eigenvalues, but only two eigenvectors??

Page 15

Without three linearly independent eigenvectors, we cannot diagonalize. We can do the next best thing by using "generalized eigenvectors."

There are three common ways to compute them:

I. "Bottom-up": For repeated eigenvalue λi, find all solutions xi to:

(A − λiI)xi = 0

(these will be the regular eigenvectors)

Then for each of these xi's, solve the equation

(A − λiI)xi+1 = xi

If you can find xi+1's that are linearly independent of all previous vectors, then the new vectors are generalized eigenvectors.

Page 16

If the xi+1's are not linearly independent of previously found vectors, continue on by solving:

(A − λiI)xi+2 = xi+1

and checking for linear independence. Continue this process until a complete set of n vectors is available.

Returning to the example:

(A − I)x = [  0   2   0
              0   0   0
             −3   3   4 ] x = 0     gives the regular eigenvector x = a [  1
                                                                           0
                                                                          3/4 ]

Scaling it to x1 = [4  0  3]ᵀ, solve:

[  0   2   0               [ 4
   0   0   0 ]  x2    =      0
  −3   3   4 ]               3 ]

⇒  x2 = [ 5
          2
          3 ]

Page 17

This vector is linearly independent of the previous eigenvectors, so it is a generalized eigenvector.

Now form the modal matrix using the two regular eigenvectors and the generalized eigenvector:

Compute the similarity transformation. Put the columns of M in the order you found them (regular, regular, generalized):

M = [ 0  4  5
      0  0  2
      1  3  3 ]

Â = M⁻¹AM = [ −3/4   3/8   1      [  1  2  0      [ 0  4  5          [ 5  0  0
               1/4  −5/8   0   ×     0  1  0   ×    0  0  2     =      0  1  1
                0    1/2   0 ]      −3  3  5 ]      1  3  3 ]          0  0  1 ]
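A numerical sketch of this "bottom-up" construction (not part of the original notes): solve (A − I)x2 = x1 for a generalized eigenvector and check that M⁻¹AM has the form above.

import numpy as np

A = np.array([[ 1.0, 2.0, 0.0],
              [ 0.0, 1.0, 0.0],
              [-3.0, 3.0, 5.0]])

x_5 = np.array([0.0, 0.0, 1.0])          # regular eigenvector for lambda = 5
x1  = np.array([4.0, 0.0, 3.0])          # regular eigenvector for lambda = 1

# (A - I) is singular, so use least squares; the system is consistent, so the
# residual is zero and any particular solution is a valid generalized eigenvector.
x2, *_ = np.linalg.lstsq(A - np.eye(3), x1, rcond=None)

M = np.column_stack([x_5, x1, x2])       # columns: regular, regular, generalized
print(np.round(np.linalg.inv(M) @ A @ M, 10))   # [[5,0,0],[0,1,1],[0,0,1]] up to round-off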

Page 18

Â = [ 5  0  0
      0  1  1
      0  0  1 ]

This is called a Jordan (canonical) form. There are two "Jordan blocks." In "block form," this is a "block-diagonal" matrix. A Jordan block has the general form:

[ λ  1  0  ⋯  0
  0  λ  1  ⋱  ⋮
  ⋮  ⋱  ⋱  ⋱  0
  ⋮      ⋱  λ  1
  0  ⋯  ⋯  0  λ ]

for the repeated eigenvalue λ.
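A symbolic cross-check of the Jordan form (not part of the original notes; exact arithmetic via SymPy), using the repeated-eigenvalue example matrix above:

from sympy import Matrix

A = Matrix([[ 1, 2, 0],
            [ 0, 1, 0],
            [-3, 3, 5]])

P, J = A.jordan_form()        # A = P * J * P**(-1)
print(J)                      # one 1x1 block for eigenvalue 5 and one 2x2 block for
                              # eigenvalue 1 (block order may differ from the slides)
print(A == P * J * P.inv())   # True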

Page 19

NOTE that just because an eigenvalue is repeated doesn't mean we will need generalized eigenvectors.

Example:

A = [ 1  0 −1
      0  1  0
      0  0  2 ]

|A − λI| = | 1−λ   0   −1  |
           |  0   1−λ   0  |  = (1 − λ)²(2 − λ)
           |  0    0   2−λ |

(For triangular matrices, the eigenvalues lie on the diagonal.)

Find eigenvectors: For λ = 2:

(A − 2I)x = [ −1   0  −1
               0  −1   0
               0   0   0 ] x = 0     ⇒  x1 = [ −1
                                                0
                                                1 ]

Page 20

For λ = 1:

(A − I)x = [ 0  0 −1
             0  0  0
             0  0  1 ] x = 0     ⇒  x2 = [ 1        x3 = [ 0
                                            0   ,          1
                                            0 ]            0 ]

The rank deficiency of this matrix is TWO, so the dimension of its null space is TWO, so there are TWO linearly independent vectors such that the equality holds.

Altogether, we have three vectors, giving a modal matrix of:

M = [ 1  0 −1
      0  1  0
      0  0  1 ]

And a diagonalized form:

Â = M⁻¹AM = [ 1  0  0
              0  1  0
              0  0  2 ]
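A quick numerical confirmation (not part of the original notes) that this repeated eigenvalue still yields a full set of independent eigenvectors, so no generalized eigenvectors are needed:

import numpy as np

A = np.array([[1.0, 0.0, -1.0],
              [0.0, 1.0,  0.0],
              [0.0, 0.0,  2.0]])

w, V = np.linalg.eig(A)
print(w)                             # eigenvalue 1 appears twice, plus the eigenvalue 2
print(np.linalg.matrix_rank(V))      # 3 -> the eigenvectors span the whole space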

Page 21

Another way to compute the generalized eigenvectors is somewhat more “algorithmic”:

II. "Top down" method: First we will need the definition of aspecial integer known as the index of the eigenvalue .iη λi

iii mnIArank −=λ−η=η η)( that such smallest

is also the size of the largest Jordan blockiη

ty)multiplici algebraic-sizematrix (=

Page 22

Now for the algorithm itself: search for all linearly independent solutions x of the equations:

(A − λiI)^ηi x = 0
(A − λiI)^(ηi−1) x ≠ 0

Denote these solutions v1, …, vmi. There will be no more than mi of them. (Why? Because rank[(A − λiI)^ηi] = n − mi, so the number of independent solutions is n − (n − mi) = mi.)

Now compute a different "chain" of generalized eigenvectors for each j = 1, …, mi:

Page 23

Each vj found above starts a "chain" of generalized eigenvectors, with a REGULAR eigenvector ending each chain. Writing vj^ηi = vj:

vj^(ηi−1) = (A − λiI) vj^ηi
     ⋮
vj^2      = (A − λiI) vj^3
vj^1      = (A − λiI) vj^2
0         = (A − λiI) vj^1      ← vj^1 is the regular eigenvector ending the chain

These will be chains of "length" ηi. If chains of shorter length are needed, start with

(A − λiI)^(ηi−1) x = 0
(A − λiI)^(ηi−2) x ≠ 0

etc.

Page 24

EXAMPLE:

A = [  1  2  0
       0  1  0
      −3  3  5 ]          λ1 = λ2 = 1,  λ3 = 5

n = 3,  m1 = 2,  n − m1 = 1

Consider λ = 1:

rank(A − I) = rank [  0  2  0
                      0  0  0
                     −3  3  4 ]  = 2

rank[(A − I)²] = rank [   0  0   0
                          0  0   0
                        −12  6  16 ]  = 1

∴ Index of λ = 1 is 2.
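The same index calculation as a short numerical sketch (not part of the original notes): η is the smallest power for which the rank of (A − λI)^η drops to n − m.

import numpy as np

A = np.array([[ 1.0, 2.0, 0.0],
              [ 0.0, 1.0, 0.0],
              [-3.0, 3.0, 5.0]])
n, m, lam = 3, 2, 1.0

B = A - lam * np.eye(3)
for eta in range(1, n + 1):
    r = np.linalg.matrix_rank(np.linalg.matrix_power(B, eta))
    print(eta, r)             # prints 1 2, then 2 1
    if r == n - m:            # first eta where rank equals n - m is the index
        break                 # -> the index of lambda = 1 is 2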

Page 25

So solve

(A − I)² x = 0
(A − I) x ≠ 0

to get

x1 = [ 1
       2
       0 ]     (generalized)

Now generating the "chain":

x2 = (A − I)x1 = [ 4
                   0
                   3 ]

The chain stops here, and x2 is a regular eigenvector, as can be verified by (A − I)x2 = 0.

Page 26

Note that if there are other linearly independent solutions to (A − I)²x = 0, we can initiate different chains.

Finally, for this example, we note that x = [0  0  1]ᵀ is the regular eigenvector corresponding to λ3 = 5, so we get:

M = [ 0  4  1
      0  0  2
      1  3  0 ]          (each chain entered in reverse order: regular eigenvector first, then generalized)

Â = M⁻¹AM = [ 5  0  0
              0  1  1
              0  0  1 ]

Page 27

ANOTHER EXAMPLE:

A = [ 0  0  1  0
      0  0  0  1
      0  0  0  0
      0  0  0  0 ]          λ = 0,  m = 4,  n − m = 0

rank(A − 0·I)  = rank [ 0  0  1  0
                        0  0  0  1
                        0  0  0  0
                        0  0  0  0 ]  = 2

rank[(A − 0·I)²] = rank [ 0  0  0  0
                          0  0  0  0
                          0  0  0  0
                          0  0  0  0 ]  = 0

So η = 2 = length of the longest chain of eigenvectors.

Page 28

Find 2 linearly independent solutions to

(A − λI)²x = A²x = 0
(A − λI)x  = Ax ≠ 0

Try x1ᵀ = [1  0  0  0]:  (A − 0·I)x1 = 0.   No.
Same for xᵀ = [0  1  0  0].   No.

Try x1 = [0  0  1  0]ᵀ:

(A − 0·I) [ 0        [ 1
            0          0
            1    =     0
            0 ]        0 ]
       (generalized)     (regular)

Page 29

and also:

(A − 0·I) [ 0        [ 0
            0          1
            0    =     0
            1 ]        0 ]
       (generalized)     (regular)

so now:

M = [ 1  0  0  0
      0  0  1  0
      0  1  0  0
      0  0  0  1 ]          (columns: regular, generalized, regular, generalized)

M⁻¹AM = [ 0  1  0  0
          0  0  0  0
          0  0  0  1
          0  0  0  0 ]

Page 30

III. "Adjoint" method. This method requires computation of the adjoint of A and must be done "by hand." It is relatively tedious to do, but there is an example in Brogan (page 257).

More discussion of eigenvalues, eigenvectors, generalized eigenvectors, and Jordan forms:

Some facts:

The eigenvectors of a matrix that correspond to distinct eigenvalues are linearly independent.

Page 31

When an eigenvalue is repeated (algebraic multiplicity > 1), we don't always require generalized eigenvectors. For example, an eigenvalue with algebraic multiplicity m may have p (≤ m) linearly independent regular eigenvectors. We then would have to find only m − p generalized eigenvectors, for that eigenvalue, to get the modal matrix.

The geometric multiplicity of an eigenvalue λ is defined to be equal to the rank deficiency (degeneracy) of the matrix A − λI. It is the number of linearly independent (regular) eigenvectors we can find associated with the eigenvalue.

Page 32

Recall from the example:

(A − λI)x |λ=1 = [  0  2  0
                    0  0  0
                   −3  3  4 ] x = 0

λ = 1 has algebraic multiplicity 2.
n = 3, rank = 2, rank deficiency = 1, so λ = 1 has geometric multiplicity 1:

n − rank(A − λI) = gm = 1, so we can find one regular eigenvector.

When we use the modal matrix to find the Jordan form of a matrix, we will get one Jordan block for each regular eigenvector we can find. Similarly, the number of Jordan blocks associated with one repeated eigenvalue will be equal to the geometric multiplicity of that eigenvalue.

Page 33

The algebraic multiplicity of λ will therefore be the sum of the sizes of all the Jordan blocks associated with λ.

Example: A certain 6 × 6 matrix A has

|A − λI| = λ(λ − 2)⁵

(The algebraic multiplicity of each eigenvalue is the order of the corresponding root.)

Page 34

So it has eigenvalues:

λ1 = 2, algebraic multiplicity 5
λ2 = 0, algebraic multiplicity 1

NOTE: The A matrix itself has rank deficiency equal to the geometric multiplicity of any zero eigenvalue. (Why? Because A − 0·I = A, so the eigenvectors for λ = 0 are exactly the null-space vectors of A.)

If we compute the rank of A − 2I, we get 4, for a rank deficiency of 6 − 4 = 2. Therefore, in the Jordan canonical form for this matrix, we will have 2 Jordan blocks for the eigenvalue λ1 = 2 (and of course one trivial block corresponding to λ = 0). The geometric multiplicity of λ1 = 2 is gm = 2: we can calculate 2 regular eigenvectors and will need 3 generalized eigenvectors.

Page 35

How could you tell there is a 3×3 block and a 2×2 block, rather than a 4×4 and a 1×1?

Recall that the index ηi of the eigenvalue λi is the smallest integer such that rank[(A − λiI)^ηi] = n − mi.

For this matrix and eigenvalue λi = 2, ηi = 3, and this will be the size of the largest Jordan block associated with eigenvalue λi.

When we go through the exercise of finding the Jordan form, we get:

[ 2  1  0  0  0  0
  0  2  1  0  0  0
  0  0  2  0  0  0          (the number of Jordan blocks equals the
  0  0  0  2  1  0           number of regular eigenvectors)
  0  0  0  0  2  0
  0  0  0  0  0  0 ]

Page 36

A Geometric Interpretation of All This Stuff:

Definition: Let X1 be a subspace of a linear vector space X. This subspace is said to be A-invariant if for every vector x ∈ X1, Ax ∈ X1.

Definition: The set of all (regular) eigenvectors corresponding to an eigenvalue λi forms a basis of a subspace Xλi of X, called the eigenspace of λi. This also happens to be the null space of the transformation A − λiI.

Theorem: The eigenspace of λi is A-invariant and has dimension equal to the degeneracy of A − λiI.

Page 37

Proof: Denote the eigenspace of λi as Ni. We have already seen that the number of eigenvectors we will find is equal to qi, the rank deficiency of A − λiI. So we will have a basis of Ni consisting of qi vectors {e1, …, eqi}, and the dimension of Ni is qi.

Now if we take a vector x ∈ Ni, we can expand it in the basis of eigenvectors as

x = Σ (j = 1 to qi) aj ej

where the aj's are coefficients. Then applying the operator A:

Ax = A Σ aj ej = Σ aj (A ej) = Σ aj (λi ej) = λi Σ aj ej ∈ Ni

So Ax is in Ni by virtue of it being a linear combination of the basis vectors ej.
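A small numerical illustration of this A-invariance (not part of the original notes), using the earlier example whose eigenvalue λ = 1 has a 2-D eigenspace spanned by e1 and e2:

import numpy as np

A = np.array([[1.0, 0.0, -1.0],
              [0.0, 1.0,  0.0],
              [0.0, 0.0,  2.0]])

e1 = np.array([1.0, 0.0, 0.0])      # regular eigenvectors for lambda = 1
e2 = np.array([0.0, 1.0, 0.0])

x = 2.0 * e1 - 3.0 * e2             # an arbitrary vector in the eigenspace
Ax = A @ x
print(Ax)                           # [ 2. -3.  0.] -- still a combination of e1 and e2
print(np.allclose(Ax, x))           # here Ax = 1 * x, since lambda = 1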

Page 38

Picture: a plane (subspace) formed by two eigenvectors e1 and e2 of A, together with vectors x, Ax, y, Ay.

If a vector x starts out in the subspace, it stays in the subspace when A acts on it.

If a vector y is NOT in the subspace, the action of A doesn't necessarily put it there.

Page 39

Chapter 5: Functions of Vectors and Matrices

⟨y, Ax⟩ = yᵀAx :  "Bilinear Form"
⟨x, Ax⟩ = xᵀAx :  "Quadratic Form"

Note that because

xᵀAx = (xᵀAx)ᵀ = xᵀAᵀx,   so   xᵀAx = ½ xᵀAx + ½ xᵀAᵀx = xᵀ [½(A + Aᵀ)] x,

any quadratic form can be written as a quadratic form with a symmetric A-matrix. We therefore treat all quadratic forms as if they contained symmetric matrices.
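A one-line numerical check of this fact (not part of the original notes; the matrix is just an illustration):

import numpy as np

A = np.array([[1.0, 4.0],
              [0.0, 3.0]])           # not symmetric
A_sym = 0.5 * (A + A.T)              # symmetric part

rng = np.random.default_rng(0)
x = rng.standard_normal(2)

print(x @ A @ x, x @ A_sym @ x)      # identical values: x^T A x = x^T [ (A + A^T)/2 ] x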

Page 40

DEFINITIONS: Let Q = xᵀAx.

1. Q (or A) is positive definite iff ⟨x, Ax⟩ > 0 for all x ≠ 0.

2. Q (or A) is positive semidefinite if ⟨x, Ax⟩ ≥ 0 for all x ≠ 0.

3. Q (or A) is negative definite iff ⟨x, Ax⟩ < 0 for all x ≠ 0.

4. Q (or A) is negative semidefinite if ⟨x, Ax⟩ ≤ 0 for all x ≠ 0.

5. Q (or A) is indefinite if ⟨x, Ax⟩ > 0 for some x ≠ 0 and ⟨x, Ax⟩ < 0 for other x ≠ 0.

Tests for definiteness of matrix A in terms of its eigenvalues λi:

Page 41

Matrix A is . . .            If the real parts of the eigenvalues λi of A are . . .

1. Positive definite         All Re(λi) > 0
2. Positive semidefinite     All Re(λi) ≥ 0
3. Negative definite         All Re(λi) < 0
4. Negative semidefinite     All Re(λi) ≤ 0
5. Indefinite                Some Re(λi) > 0, some Re(λi) < 0

See book for tests involving leading principal minors.
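A minimal sketch of this eigenvalue test for symmetric matrices (not part of the original notes; the test matrices are just illustrations):

import numpy as np

def classify(A, tol=1e-12):
    lam = np.linalg.eigvalsh(A)          # real eigenvalues of a symmetric matrix
    if np.all(lam > tol):   return "positive definite"
    if np.all(lam >= -tol): return "positive semidefinite"
    if np.all(lam < -tol):  return "negative definite"
    if np.all(lam <= tol):  return "negative semidefinite"
    return "indefinite"

print(classify(np.array([[2.0, 0.0], [0.0, 1.0]])))    # positive definite
print(classify(np.array([[1.0, 0.0], [0.0, -1.0]])))   # indefinite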