
Page 1: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

A system of n linear equations in n variables,

can be expressed as a matrix equation Ax = b:

If b = 0, then the system is homogeneous; otherwise it is nonhomogeneous.

$$
\begin{aligned}
a_{11}x_1 + a_{12}x_2 + \cdots + a_{1n}x_n &= b_1 \\
a_{21}x_1 + a_{22}x_2 + \cdots + a_{2n}x_n &= b_2 \\
&\;\;\vdots \\
a_{n1}x_1 + a_{n2}x_2 + \cdots + a_{nn}x_n &= b_n
\end{aligned}
\qquad\Longleftrightarrow\qquad
\begin{pmatrix}
a_{11} & a_{12} & \cdots & a_{1n} \\
a_{21} & a_{22} & \cdots & a_{2n} \\
\vdots &        &        & \vdots \\
a_{n1} & a_{n2} & \cdots & a_{nn}
\end{pmatrix}
\begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}
=
\begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{pmatrix}
$$
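
As a quick supplement (not part of the original slides), here is a minimal numpy sketch of writing a 3 x 3 system in the form Ax = b; the coefficient values are placeholders chosen for illustration.

```python
import numpy as np

# Arbitrary 3x3 coefficient matrix and right-hand side (placeholder values).
A = np.array([[2.0, 1.0, -1.0],
              [1.0, 3.0,  2.0],
              [0.0, 1.0,  2.0]])
b = np.array([1.0, 4.0, 2.0])

# The system is homogeneous exactly when b is the zero vector.
is_homogeneous = np.all(b == 0)
print("homogeneous" if is_homogeneous else "nonhomogeneous")

# A candidate x solves the system when A @ x reproduces b.
x = np.linalg.solve(A, b)          # assumes A is nonsingular
print(np.allclose(A @ x, b))       # True
```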

Page 2: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Nonsingular Case

If the coefficient matrix A is nonsingular, then it is invertible and we can solve Ax = b as follows:

$$
Ax = b \;\Rightarrow\; A^{-1}Ax = A^{-1}b \;\Rightarrow\; Ix = A^{-1}b \;\Rightarrow\; x = A^{-1}b
$$

This solution is therefore unique. Also, if b = 0, it follows that the unique solution to Ax = 0 is x = A^{-1}0 = 0. Thus if A is nonsingular, then the only solution to Ax = 0 is the trivial solution x = 0.
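
The following sketch (added for illustration) mirrors the x = A^{-1}b computation in numpy with a placeholder nonsingular matrix; note that in practice one normally calls np.linalg.solve rather than forming the inverse explicitly.

```python
import numpy as np

# Placeholder nonsingular matrix and right-hand side.
A = np.array([[4.0, 7.0],
              [2.0, 6.0]])
b = np.array([1.0, 3.0])

# x = A^{-1} b, computed two ways.
x_via_inverse = np.linalg.inv(A) @ b
x_via_solve = np.linalg.solve(A, b)    # numerically preferred: no explicit inverse

print(np.allclose(x_via_inverse, x_via_solve))  # True
# When A is nonsingular, the only solution of Ax = 0 is the trivial one.
print(np.linalg.solve(A, np.zeros(2)))          # [0. 0.]
```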

Page 3: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Example 1: Nonsingular Case (1 of 3)

From a previous example, we know that the matrix A below is nonsingular with inverse as given.

Using the definition of matrix multiplication, it follows that the only solution of Ax = 0 is x = 0:

$$
A = \begin{pmatrix} 0 & 1 & 2 \\ 1 & 0 & 3 \\ 4 & -3 & 8 \end{pmatrix},
\qquad
A^{-1} = \begin{pmatrix} -9/2 & 7 & -3/2 \\ -2 & 4 & -1 \\ 3/2 & -2 & 1/2 \end{pmatrix}
$$

$$
x = A^{-1}\,0 =
\begin{pmatrix} -9/2 & 7 & -3/2 \\ -2 & 4 & -1 \\ 3/2 & -2 & 1/2 \end{pmatrix}
\begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}
=
\begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}
$$
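
A small numpy check of the matrix and inverse as reconstructed above (an added illustration, not from the slides):

```python
import numpy as np

# Matrix and inverse as reconstructed above.
A = np.array([[0.0,  1.0, 2.0],
              [1.0,  0.0, 3.0],
              [4.0, -3.0, 8.0]])
A_inv = np.array([[-4.5,  7.0, -1.5],
                  [-2.0,  4.0, -1.0],
                  [ 1.5, -2.0,  0.5]])

print(np.allclose(A @ A_inv, np.eye(3)))   # True: A_inv really is A^{-1}
print(A_inv @ np.zeros(3))                 # [0. 0. 0.]: only solution of Ax = 0
```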

Page 4: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Example 1: Nonsingular Case (2 of 3)

Now let's solve the nonhomogeneous linear system Ax = b below using A^{-1}:

$$
\begin{aligned}
x_2 + 2x_3 &= 2 \\
x_1 + 3x_3 &= -2 \\
4x_1 - 3x_2 + 8x_3 &= 0
\end{aligned}
$$

This system of equations can be written as Ax = b, where

$$
A = \begin{pmatrix} 0 & 1 & 2 \\ 1 & 0 & 3 \\ 4 & -3 & 8 \end{pmatrix},
\qquad
x = \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix},
\qquad
b = \begin{pmatrix} 2 \\ -2 \\ 0 \end{pmatrix}
$$

Then

$$
x = A^{-1}b =
\begin{pmatrix} -9/2 & 7 & -3/2 \\ -2 & 4 & -1 \\ 3/2 & -2 & 1/2 \end{pmatrix}
\begin{pmatrix} 2 \\ -2 \\ 0 \end{pmatrix}
=
\begin{pmatrix} -23 \\ -12 \\ 7 \end{pmatrix}
$$
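
The same computation in numpy, using the Example 1 data as reconstructed above (added illustration):

```python
import numpy as np

A = np.array([[0.0,  1.0, 2.0],
              [1.0,  0.0, 3.0],
              [4.0, -3.0, 8.0]])
b = np.array([2.0, -2.0, 0.0])

x = np.linalg.solve(A, b)     # equivalent to A^{-1} b for nonsingular A
print(x)                      # expected: [-23. -12.   7.]
print(np.allclose(A @ x, b))  # True
```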

Page 5: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Example 1: Nonsingular Case (3 of 3)

Alternatively, we could solve the nonhomogeneous linear system Ax = b below using row reduction.

To do so, form the augmented matrix (A|b) and reduce, using elementary row operations.

$$
\left(\begin{array}{ccc|c} 0 & 1 & 2 & 2 \\ 1 & 0 & 3 & -2 \\ 4 & -3 & 8 & 0 \end{array}\right)
\rightarrow
\left(\begin{array}{ccc|c} 1 & 0 & 3 & -2 \\ 0 & 1 & 2 & 2 \\ 4 & -3 & 8 & 0 \end{array}\right)
\rightarrow
\left(\begin{array}{ccc|c} 1 & 0 & 3 & -2 \\ 0 & 1 & 2 & 2 \\ 0 & -3 & -4 & 8 \end{array}\right)
$$

$$
\rightarrow
\left(\begin{array}{ccc|c} 1 & 0 & 3 & -2 \\ 0 & 1 & 2 & 2 \\ 0 & 0 & 2 & 14 \end{array}\right)
\rightarrow
\left(\begin{array}{ccc|c} 1 & 0 & 3 & -2 \\ 0 & 1 & 2 & 2 \\ 0 & 0 & 1 & 7 \end{array}\right)
$$

Back-substituting,

$$
x_1 + 3x_3 = -2, \qquad x_2 + 2x_3 = 2, \qquad x_3 = 7
\quad\Longrightarrow\quad
x = \begin{pmatrix} -23 \\ -12 \\ 7 \end{pmatrix}
$$

as before.
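
For comparison, here is a minimal sketch of the row-reduction procedure itself, Gaussian elimination with partial pivoting on the augmented matrix; this is an added illustration, not code from the slides.

```python
import numpy as np

def solve_by_row_reduction(A, b):
    """Gaussian elimination with partial pivoting on the augmented matrix (A|b)."""
    aug = np.hstack([A.astype(float), b.reshape(-1, 1).astype(float)])
    n = len(b)
    for k in range(n):
        # Swap in the row with the largest pivot (avoids dividing by zero).
        p = k + np.argmax(np.abs(aug[k:, k]))
        aug[[k, p]] = aug[[p, k]]
        # Eliminate the entries below the pivot.
        for i in range(k + 1, n):
            aug[i] -= (aug[i, k] / aug[k, k]) * aug[k]
    # Back-substitution on the resulting triangular system.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (aug[i, -1] - aug[i, i + 1:n] @ x[i + 1:]) / aug[i, i]
    return x

A = np.array([[0, 1, 2], [1, 0, 3], [4, -3, 8]])
b = np.array([2, -2, 0])
print(solve_by_row_reduction(A, b))   # expected: [-23. -12.   7.]
```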

Page 6: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Singular Case

If the coefficient matrix A is singular, then A^{-1} does not exist, and either a solution to Ax = b does not exist or there is more than one solution (it is not unique). Further, the homogeneous system Ax = 0 has more than one solution: in addition to the trivial solution x = 0, there are infinitely many nontrivial solutions. The nonhomogeneous case Ax = b has no solution unless (b, y) = 0 for all vectors y satisfying A*y = 0, where A* is the adjoint of A. In this case, Ax = b has infinitely many solutions, each of the form x = x(0) + ξ, where x(0) is a particular solution of Ax = b and ξ is any solution of Ax = 0.
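
To make the solvability criterion concrete, here is an added numpy sketch with a deliberately singular placeholder matrix: it builds a basis Y for the null space of A* (which equals A^T for a real matrix) and tests whether (b, y) = 0 for every such y.

```python
import numpy as np

# A deliberately singular matrix (third row = first row + second row); placeholder data.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 3.0, 1.0]])

# Basis for the null space of A^T, via the SVD.
u, s, vt = np.linalg.svd(A.T)
Y = vt[s < 1e-12]                 # rows span {y : A^T y = 0}

b_good = np.array([1.0, 2.0, 3.0])   # satisfies (b, y) = 0 here, since b3 = b1 + b2
b_bad = np.array([1.0, 2.0, 4.0])    # violates the condition

for b in (b_good, b_bad):
    solvable = np.allclose(Y @ b, 0)
    print("solvable" if solvable else "no solution")
```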

Page 7: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Example 2: Singular Case (1 of 3)

Solve the nonhomogeneous linear system Ax = b below using row reduction.

To do so, form the augmented matrix (A|b) and reduce, using elementary row operations.

$$
\begin{aligned}
x_1 + 2x_2 + x_3 &= 1 \\
x_1 + 5x_2 + 6x_3 &= 0 \\
5x_1 + 4x_2 - 5x_3 &= -1
\end{aligned}
$$

$$
\left(\begin{array}{ccc|c} 1 & 2 & 1 & 1 \\ 1 & 5 & 6 & 0 \\ 5 & 4 & -5 & -1 \end{array}\right)
\rightarrow
\left(\begin{array}{ccc|c} 1 & 2 & 1 & 1 \\ 0 & 3 & 5 & -1 \\ 0 & -6 & -10 & -6 \end{array}\right)
\rightarrow
\left(\begin{array}{ccc|c} 1 & 2 & 1 & 1 \\ 0 & 3 & 5 & -1 \\ 0 & 3 & 5 & 3 \end{array}\right)
\rightarrow
\left(\begin{array}{ccc|c} 1 & 2 & 1 & 1 \\ 0 & 3 & 5 & -1 \\ 0 & 0 & 0 & 4 \end{array}\right)
$$

The reduced system reads x1 + 2x2 + x3 = 1, 3x2 + 5x3 = -1, 0 = 4. The last equation cannot be satisfied, so the system has no solution.

Page 8: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Example 2: Singular Case (2 of 3)

Solve the nonhomogeneous linear system Ax = b below using row reduction.

Reduce the augmented matrix (A|b) as follows:

For a general right-hand side b = (b1, b2, b3)^T the system is

$$
\begin{aligned}
x_1 + 2x_2 + x_3 &= b_1 \\
x_1 + 5x_2 + 6x_3 &= b_2 \\
5x_1 + 4x_2 - 5x_3 &= b_3
\end{aligned}
$$

and the reduction proceeds as before:

$$
\left(\begin{array}{ccc|c} 1 & 2 & 1 & b_1 \\ 1 & 5 & 6 & b_2 \\ 5 & 4 & -5 & b_3 \end{array}\right)
\rightarrow
\left(\begin{array}{ccc|c} 1 & 2 & 1 & b_1 \\ 0 & 3 & 5 & b_2 - b_1 \\ 0 & -6 & -10 & b_3 - 5b_1 \end{array}\right)
\rightarrow
\left(\begin{array}{ccc|c} 1 & 2 & 1 & b_1 \\ 0 & 3 & 5 & b_2 - b_1 \\ 0 & 3 & 5 & \tfrac{5}{2}b_1 - \tfrac{1}{2}b_3 \end{array}\right)
\rightarrow
\left(\begin{array}{ccc|c} 1 & 2 & 1 & b_1 \\ 0 & 3 & 5 & b_2 - b_1 \\ 0 & 0 & 0 & \tfrac{7}{2}b_1 - b_2 - \tfrac{1}{2}b_3 \end{array}\right)
$$

Page 9: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Example 2: Singular Case (3 of 3)

From the previous slide, we require

$$
\tfrac{7}{2}b_1 - b_2 - \tfrac{1}{2}b_3 = 0,
\qquad\text{i.e.}\qquad
7b_1 - 2b_2 - b_3 = 0.
$$

Suppose

$$
b_1 = 1, \qquad b_2 = 1, \qquad b_3 = 5.
$$

Then the reduced augmented matrix (A|b) becomes:

$$
\left(\begin{array}{ccc|c} 1 & 2 & 1 & 1 \\ 0 & 3 & 5 & 0 \\ 0 & 0 & 0 & 0 \end{array}\right)
\quad\Longrightarrow\quad
\begin{aligned}
x_1 + 2x_2 + x_3 &= 1 \\
3x_2 + 5x_3 &= 0
\end{aligned}
$$

Thus x3 is arbitrary, x2 = -(5/3)x3 and x1 = 1 + (7/3)x3, so that

$$
x = \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}
+ \frac{x_3}{3}\begin{pmatrix} 7 \\ -5 \\ 3 \end{pmatrix}
= x^{(0)} + \xi,
$$

where x^{(0)} is a particular solution of Ax = b and ξ is an arbitrary solution of Ax = 0.
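
An added numpy sketch of the singular case, using the Example 2 data as reconstructed above (the entry signs follow that reconstruction): a particular solution from np.linalg.lstsq plus any multiple of a null-space vector solves Ax = b.

```python
import numpy as np

# Example 2 data as reconstructed above.
A = np.array([[1.0, 2.0,  1.0],
              [1.0, 5.0,  6.0],
              [5.0, 4.0, -5.0]])
b = np.array([1.0, 1.0, 5.0])          # satisfies 7*b1 - 2*b2 - b3 = 0

x0, *_ = np.linalg.lstsq(A, b, rcond=None)   # one particular solution
u, s, vt = np.linalg.svd(A)
xi = vt[-1]                                  # null-space direction (smallest singular value)

print(np.allclose(A @ x0, b))                # True: b is compatible
print(np.allclose(A @ xi, 0))                # True: xi solves Ax = 0
print(np.allclose(A @ (x0 + 2.5 * xi), b))   # True: x0 + c*xi is also a solution
```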

Page 10: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Linear Dependence and Independence

A set of vectors x(1), x(2), …, x(n) is linearly dependent if there exist scalars c1, c2, …, cn, not all zero, such that

$$
c_1 x^{(1)} + c_2 x^{(2)} + \cdots + c_n x^{(n)} = 0.
$$

If the only solution of this equation is c1 = c2 = … = cn = 0, then x(1), x(2), …, x(n) is linearly independent.
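
In numpy, linear independence of a finite set of vectors can be tested by a rank computation; the following helper and the sample vectors are added illustrations.

```python
import numpy as np

def linearly_independent(*vectors):
    """Vectors are independent iff the matrix with them as columns has full column rank."""
    X = np.column_stack(vectors)
    return np.linalg.matrix_rank(X) == X.shape[1]

# Two quick placeholder checks.
print(linearly_independent([1, 0, 0], [0, 1, 0], [0, 0, 1]))   # True
print(linearly_independent([1, 2, 3], [2, 4, 6]))              # False: second = 2 * first
```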

Page 11: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Example 3: Linear Independence (1 of 2)

Determine whether the following vectors are linearly dependent or linearly independent.

$$
x^{(1)} = \begin{pmatrix} 0 \\ 1 \\ 4 \end{pmatrix}, \qquad
x^{(2)} = \begin{pmatrix} 1 \\ 0 \\ -3 \end{pmatrix}, \qquad
x^{(3)} = \begin{pmatrix} 2 \\ 3 \\ 8 \end{pmatrix}
$$

We need to solve

$$
c_1 x^{(1)} + c_2 x^{(2)} + c_3 x^{(3)} = 0
$$

or

$$
\begin{pmatrix} 0 & 1 & 2 \\ 1 & 0 & 3 \\ 4 & -3 & 8 \end{pmatrix}
\begin{pmatrix} c_1 \\ c_2 \\ c_3 \end{pmatrix}
=
\begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}
$$

Page 12: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Example 3: Linear Independence (2 of 2)

We thus reduce the augmented matrix (A|0), as before:

$$
\left(\begin{array}{ccc|c} 0 & 1 & 2 & 0 \\ 1 & 0 & 3 & 0 \\ 4 & -3 & 8 & 0 \end{array}\right)
\rightarrow \cdots \rightarrow
\left(\begin{array}{ccc|c} 1 & 0 & 3 & 0 \\ 0 & 1 & 2 & 0 \\ 0 & 0 & 1 & 0 \end{array}\right)
\quad\Longrightarrow\quad
\begin{aligned}
c_1 + 3c_3 &= 0 \\
c_2 + 2c_3 &= 0 \\
c_3 &= 0
\end{aligned}
$$

Thus the only solution is c1 = c2 = c3 = 0, and therefore the original vectors are linearly independent.
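
The same conclusion via numpy, using the Example 3 vectors as reconstructed above (added illustration):

```python
import numpy as np

# Columns are x(1), x(2), x(3) from Example 3.
X = np.array([[0,  1, 2],
              [1,  0, 3],
              [4, -3, 8]], dtype=float)

print(np.linalg.matrix_rank(X))          # 3: full rank, so the vectors are independent
print(np.linalg.solve(X, np.zeros(3)))   # [0. 0. 0.]: only the trivial solution
```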

Page 13: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Example 4: Linear Dependence (1 of 2)

Determine whether the following vectors are linearly dependent or linearly independent.

$$
x^{(1)} = \begin{pmatrix} 1 \\ 1 \\ 5 \end{pmatrix}, \qquad
x^{(2)} = \begin{pmatrix} 2 \\ 5 \\ 4 \end{pmatrix}, \qquad
x^{(3)} = \begin{pmatrix} 1 \\ 6 \\ -5 \end{pmatrix}
$$

We need to solve

$$
c_1 x^{(1)} + c_2 x^{(2)} + c_3 x^{(3)} = 0
$$

or

$$
\begin{pmatrix} 1 & 2 & 1 \\ 1 & 5 & 6 \\ 5 & 4 & -5 \end{pmatrix}
\begin{pmatrix} c_1 \\ c_2 \\ c_3 \end{pmatrix}
=
\begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}
$$

Page 14: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Example 4: Linear Dependence (2 of 2)

We thus reduce the augmented matrix (A|0), as before:

$$
\left(\begin{array}{ccc|c} 1 & 2 & 1 & 0 \\ 1 & 5 & 6 & 0 \\ 5 & 4 & -5 & 0 \end{array}\right)
\rightarrow
\left(\begin{array}{ccc|c} 1 & 2 & 1 & 0 \\ 0 & 3 & 5 & 0 \\ 0 & 0 & 0 & 0 \end{array}\right)
\quad\Longrightarrow\quad
\begin{aligned}
c_1 + 2c_2 + c_3 &= 0 \\
3c_2 + 5c_3 &= 0
\end{aligned}
$$

Hence c3 is arbitrary, c2 = -(5/3)c3 and c1 = (7/3)c3, i.e. c = k(7, -5, 3)^T. Thus the original vectors are linearly dependent, with

$$
7 x^{(1)} - 5 x^{(2)} + 3 x^{(3)}
= 7\begin{pmatrix} 1 \\ 1 \\ 5 \end{pmatrix}
- 5\begin{pmatrix} 2 \\ 5 \\ 4 \end{pmatrix}
+ 3\begin{pmatrix} 1 \\ 6 \\ -5 \end{pmatrix}
= \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}
$$
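
The same conclusion via numpy, using the Example 4 vectors as reconstructed above (signs follow that reconstruction); a null-space vector of the matrix of columns recovers the dependence coefficients up to scale.

```python
import numpy as np

# Columns are x(1), x(2), x(3) from Example 4.
X = np.array([[1, 2,  1],
              [1, 5,  6],
              [5, 4, -5]], dtype=float)

print(np.linalg.matrix_rank(X))     # 2: rank deficient, so the vectors are dependent

# A null-space vector gives the coefficients of the dependence relation.
_, _, vt = np.linalg.svd(X)
c = vt[-1]
c = c / c[0] * 7                    # rescale for readability
print(np.round(c, 6))               # approximately [ 7. -5.  3.]
print(np.allclose(X @ c, 0))        # True: 7*x(1) - 5*x(2) + 3*x(3) = 0
```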

Page 15: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Linear Independence and Invertibility

Consider the previous two examples:

The first matrix was known to be nonsingular, and its column vectors were linearly independent. The second matrix was known to be singular, and its column vectors were linearly dependent.

This is true in general: the columns (or rows) of A are linearly independent iff A is nonsingular iff A^{-1} exists. Also, A is nonsingular iff det A ≠ 0, hence the columns (or rows) of A are linearly independent iff det A ≠ 0. Further, if C = AB, then det(C) = det(A)det(B). Thus if the columns (or rows) of A and B are linearly independent, then the columns (or rows) of C are also.
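
A quick numerical illustration (added, with random matrices) of the determinant product rule behind the last statement:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
C = A @ B

# det(C) = det(A) * det(B), so C is nonsingular whenever A and B are.
print(np.isclose(np.linalg.det(C), np.linalg.det(A) * np.linalg.det(B)))   # True
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B), np.linalg.matrix_rank(C))
```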

Page 16: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Linear Dependence & Vector Functions

Now consider vector functions x(1)(t), x(2)(t), …, x(n)(t), where

$$
x^{(k)}(t) =
\begin{pmatrix} x_{1k}(t) \\ x_{2k}(t) \\ \vdots \\ x_{mk}(t) \end{pmatrix},
\qquad k = 1, 2, \ldots, n, \quad t \in I.
$$

As before, x(1)(t), x(2)(t), …, x(n)(t) is linearly dependent on I if there exist scalars c1, c2, …, cn, not all zero, such that

$$
c_1 x^{(1)}(t) + c_2 x^{(2)}(t) + \cdots + c_n x^{(n)}(t) = 0 \quad \text{for all } t \in I.
$$

Otherwise x(1)(t), x(2)(t), …, x(n)(t) is linearly independent on I. See the text for more discussion of this.

Page 17: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Eigenvalues and Eigenvectors

The equation Ax = y can be viewed as a linear transformation that maps (or transforms) x into a new vector y. Nonzero vectors x that transform into multiples of themselves are important in many applications. Thus we solve Ax = λx, or equivalently, (A − λI)x = 0. This equation has a nonzero solution if we choose λ such that det(A − λI) = 0. Such values of λ are called eigenvalues of A, and the nonzero solutions x are called eigenvectors.
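
In numpy, eigenvalues and eigenvectors are returned by np.linalg.eig; the matrix below is a placeholder chosen for illustration.

```python
import numpy as np

# Placeholder matrix: each eigenpair satisfies A v = lambda * v.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)   # columns of `eigenvectors` are the x's
for lam, v in zip(eigenvalues, eigenvectors.T):
    print(lam, np.allclose(A @ v, lam * v))    # True for each eigenpair
```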

Page 18: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Example 5: Eigenvalues (1 of 3)

Find the eigenvalues and eigenvectors of the matrix A.

$$
A = \begin{pmatrix} 2 & 3 \\ 3 & -6 \end{pmatrix}
$$

Solution: Choose λ such that det(A − λI) = 0, as follows:

$$
\det(A - \lambda I)
= \det\begin{pmatrix} 2 - \lambda & 3 \\ 3 & -6 - \lambda \end{pmatrix}
= (2 - \lambda)(-6 - \lambda) - 9
= \lambda^2 + 4\lambda - 21
= (\lambda - 3)(\lambda + 7),
$$

so the eigenvalues are λ1 = 3 and λ2 = −7.

Page 19: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Example 5: First Eigenvector (2 of 3)

To find the eigenvectors of the matrix A, we need to solve (A − λI)x = 0 for λ = 3 and λ = −7.

Eigenvector for λ = 3: Solve

$$
(A - 3I)x =
\begin{pmatrix} -1 & 3 \\ 3 & -9 \end{pmatrix}
\begin{pmatrix} x_1 \\ x_2 \end{pmatrix}
=
\begin{pmatrix} 0 \\ 0 \end{pmatrix}
$$

by row reducing the augmented matrix:

$$
\left(\begin{array}{cc|c} -1 & 3 & 0 \\ 3 & -9 & 0 \end{array}\right)
\rightarrow
\left(\begin{array}{cc|c} 1 & -3 & 0 \\ 0 & 0 & 0 \end{array}\right)
\quad\Longrightarrow\quad
x_1 = 3x_2,\ x_2 \text{ arbitrary}
\quad\Longrightarrow\quad
x^{(1)} = c\begin{pmatrix} 3 \\ 1 \end{pmatrix};\ \text{choose } c = 1.
$$

Page 20: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Example 5: Second Eigenvector (3 of 3)

Eigenvector for λ = −7: Solve

$$
(A + 7I)x =
\begin{pmatrix} 9 & 3 \\ 3 & 1 \end{pmatrix}
\begin{pmatrix} x_1 \\ x_2 \end{pmatrix}
=
\begin{pmatrix} 0 \\ 0 \end{pmatrix}
$$

by row reducing the augmented matrix:

$$
\left(\begin{array}{cc|c} 9 & 3 & 0 \\ 3 & 1 & 0 \end{array}\right)
\rightarrow
\left(\begin{array}{cc|c} 1 & 1/3 & 0 \\ 0 & 0 & 0 \end{array}\right)
\quad\Longrightarrow\quad
x_1 = -\tfrac{1}{3}x_2,\ x_2 \text{ arbitrary}
\quad\Longrightarrow\quad
x^{(2)} = c\begin{pmatrix} 1 \\ -3 \end{pmatrix};\ \text{choose } c = 1.
$$
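
An added numpy check of Example 5, using the matrix as reconstructed above and the eigenvectors found by hand:

```python
import numpy as np

A = np.array([[2.0,  3.0],
              [3.0, -6.0]])

lam, V = np.linalg.eig(A)
print(np.sort(lam))                   # approximately [-7.  3.]

# Check the hand-computed eigenvectors: (3, 1) for lambda = 3 and (1, -3) for lambda = -7.
x1 = np.array([3.0, 1.0])
x2 = np.array([1.0, -3.0])
print(np.allclose(A @ x1, 3 * x1))    # True
print(np.allclose(A @ x2, -7 * x2))   # True
```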

Page 21: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Normalized Eigenvectors

From the previous example, we see that eigenvectors are determined only up to a nonzero multiplicative constant. If this constant is specified in some particular way, then the eigenvector is said to be normalized. For example, eigenvectors are sometimes normalized by choosing the constant so that ||x|| = (x, x)^{1/2} = 1.
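
A one-line illustration (added) of normalizing an eigenvector to unit length:

```python
import numpy as np

x = np.array([3.0, 1.0])                 # eigenvector from the previous example
x_normalized = x / np.linalg.norm(x)     # choose the constant so that ||x|| = 1

print(np.linalg.norm(x_normalized))      # 1.0
print(x_normalized)                      # still an eigenvector, now of unit length
```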

Page 22: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Algebraic and Geometric Multiplicity

In finding the eigenvalues of an n x n matrix A, we solve det(A − λI) = 0. Since this involves the determinant of an n x n matrix, the problem reduces to finding the roots of an nth-degree polynomial. Denote these roots, or eigenvalues, by λ1, λ2, …, λn.

If an eigenvalue is repeated m times, then its algebraic multiplicity is m. Each eigenvalue has at least one eigenvector, and an eigenvalue of algebraic multiplicity m may have q linearly independent eigenvectors, 1 ≤ q ≤ m, where q is called the geometric multiplicity of the eigenvalue.
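
The two multiplicities can be estimated numerically; the sketch below (an added illustration with simple matrices chosen for the purpose) takes the algebraic multiplicity from the eigenvalue list and the geometric multiplicity from the dimension of the null space of A − λI.

```python
import numpy as np

def multiplicities(A, lam, tol=1e-9):
    """Algebraic multiplicity from the eigenvalue list; geometric from the null space of A - lam*I."""
    eigenvalues = np.linalg.eigvals(A)
    algebraic = int(np.sum(np.abs(eigenvalues - lam) < 1e-6))
    s = np.linalg.svd(A - lam * np.eye(A.shape[0]), compute_uv=False)
    geometric = int(np.sum(s < tol))
    return algebraic, geometric

# lambda = 1 is repeated twice but has only one independent eigenvector (q < m).
A_defective = np.array([[1.0, 1.0],
                        [0.0, 1.0]])
print(multiplicities(A_defective, 1.0))   # (2, 1)

# For the identity matrix the repeated eigenvalue has a full set of eigenvectors (q = m).
print(multiplicities(np.eye(2), 1.0))     # (2, 2)
```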

Page 23: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Eigenvectors and Linear Independence

If an eigenvalue has algebraic multiplicity 1, then it is said to be simple, and its geometric multiplicity is also 1. If each eigenvalue of an n x n matrix A is simple, then A has n distinct eigenvalues, and it can be shown that the n eigenvectors corresponding to these eigenvalues are linearly independent. If A has one or more repeated eigenvalues, then there may be fewer than n linearly independent eigenvectors, since for a repeated eigenvalue we may have q < m. This may lead to complications in solving systems of differential equations.

Page 24: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Example 6: Eigenvalues (1 of 5)

Find the eigenvalues and eigenvectors of the matrix A.

$$
A = \begin{pmatrix} 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 0 \end{pmatrix}
$$

Solution: Choose λ such that det(A − λI) = 0, as follows:

$$
\det(A - \lambda I)
= \det\begin{pmatrix} -\lambda & 1 & 1 \\ 1 & -\lambda & 1 \\ 1 & 1 & -\lambda \end{pmatrix}
= -\lambda^3 + 3\lambda + 2
= -(\lambda - 2)(\lambda + 1)^2,
$$

so the eigenvalues are λ1 = 2 and λ2 = λ3 = −1.

Page 25: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Example 6: First Eigenvector (2 of 5)

Eigenvector for λ = 2: Solve (A − 2I)x = 0, as follows:

$$
\left(\begin{array}{ccc|c} -2 & 1 & 1 & 0 \\ 1 & -2 & 1 & 0 \\ 1 & 1 & -2 & 0 \end{array}\right)
\rightarrow
\left(\begin{array}{ccc|c} 1 & 1 & -2 & 0 \\ 0 & -3 & 3 & 0 \\ 0 & 3 & -3 & 0 \end{array}\right)
\rightarrow
\left(\begin{array}{ccc|c} 1 & 0 & -1 & 0 \\ 0 & 1 & -1 & 0 \\ 0 & 0 & 0 & 0 \end{array}\right)
\quad\Longrightarrow\quad
x_1 = x_3,\ x_2 = x_3,\ x_3 \text{ arbitrary}
$$

$$
x = c\begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix};
\qquad\text{choosing } c = 1,\quad
x^{(1)} = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}.
$$

Page 26: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Example 6: 2nd and 3rd Eigenvectors (3 of 5)

Eigenvector for λ = −1: Solve (A + I)x = 0, as follows:

$$
\left(\begin{array}{ccc|c} 1 & 1 & 1 & 0 \\ 1 & 1 & 1 & 0 \\ 1 & 1 & 1 & 0 \end{array}\right)
\rightarrow
\left(\begin{array}{ccc|c} 1 & 1 & 1 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{array}\right)
\quad\Longrightarrow\quad
x_1 = -x_2 - x_3,\ \ x_2, x_3 \text{ arbitrary}
$$

$$
x = x_2\begin{pmatrix} -1 \\ 1 \\ 0 \end{pmatrix}
+ x_3\begin{pmatrix} -1 \\ 0 \\ 1 \end{pmatrix};
\qquad\text{choose}\quad
x^{(2)} = \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix}, \quad
x^{(3)} = \begin{pmatrix} 0 \\ 1 \\ -1 \end{pmatrix}.
$$

Page 27: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Example 6: Eigenvectors of A (4 of 5)

Thus three eigenvectors of A are

$$
x^{(1)} = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}, \qquad
x^{(2)} = \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix}, \qquad
x^{(3)} = \begin{pmatrix} 0 \\ 1 \\ -1 \end{pmatrix},
$$

where x(2), x(3) correspond to the double eigenvalue λ = −1. It can be shown that x(1), x(2), x(3) are linearly independent. Thus A is a 3 x 3 symmetric matrix (A = A^T) with 3 real eigenvalues and 3 linearly independent eigenvectors:

$$
A = \begin{pmatrix} 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 0 \end{pmatrix}
$$

Page 28: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Example 6: Eigenvectors of A (5 of 5)

Note that we could instead have chosen

$$
x^{(1)} = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}, \qquad
x^{(2)} = \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix}, \qquad
x^{(3)} = \begin{pmatrix} 1 \\ -2 \\ 1 \end{pmatrix}.
$$

Then the eigenvectors are orthogonal, since

$$
(x^{(1)}, x^{(2)}) = 0, \qquad (x^{(1)}, x^{(3)}) = 0, \qquad (x^{(2)}, x^{(3)}) = 0.
$$

Thus A is a 3 x 3 symmetric matrix with 3 real eigenvalues and 3 linearly independent orthogonal eigenvectors.

Page 29: Ch 7.3: Systems of Linear Equations, Linear Independence, Eigenvalues

Hermitian Matrices

A self-adjoint, or Hermitian, matrix satisfies A = A*, where we recall that A* is the conjugate transpose of A. Thus for a Hermitian matrix, a_ij equals the complex conjugate of a_ji.

Note that if A has real entries and is symmetric (see the last example), then A is Hermitian. An n x n Hermitian matrix A has the following properties:

- All eigenvalues of A are real.
- There exists a full set of n linearly independent eigenvectors of A.
- If x(1) and x(2) are eigenvectors that correspond to different eigenvalues of A, then x(1) and x(2) are orthogonal.
- Corresponding to an eigenvalue of algebraic multiplicity m, it is possible to choose m mutually orthogonal eigenvectors, and hence A has a full set of n linearly independent, mutually orthogonal eigenvectors.
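
As an added illustration of these properties, here is a small complex Hermitian example using np.linalg.eigh, the routine specialized to Hermitian matrices:

```python
import numpy as np

# A 2 x 2 Hermitian matrix: A equals its conjugate transpose.
A = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])
print(np.allclose(A, A.conj().T))               # True: A is Hermitian

eigenvalues, V = np.linalg.eigh(A)
print(np.allclose(eigenvalues.imag, 0))         # True: all eigenvalues are real
print(np.allclose(V.conj().T @ V, np.eye(2)))   # True: a full orthonormal set of eigenvectors
```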