3.III. Matrix Operations
3.III.1. Sums and Scalar Products
3.III.2. Matrix Multiplication
3.III.3. Mechanics of Matrix Multiplication
3.III.4. Inverses

In Section 3.II, matrices were defined in terms of linear maps. Operations on matrices will be defined to represent operations on linear maps. Matrix algebra is a linear algebra (M_{m×n}, +, ·; ℝ).

Post on 19-Dec-2015



3.III.1. Sums and Scalar Products

Definition: Sum of Maps with the Same Domain & Codomain

The sum of maps f, g: V → W is the map h = f + g: V → W given by v ↦ f(v) + g(v).

Definition 1.3: Matrix Addition & Scalar Multiplication

Let A = (a_{ij}) and B = (b_{ij}) be m×n matrices.

Then C = A + B is the m×n matrix with entries c_{ij} = a_{ij} + b_{ij}.

And C = αA, with α ∈ ℝ, is the m×n matrix with entries c_{ij} = α a_{ij}.
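Definition 1.3 can be sketched directly in code. A minimal illustration using plain Python lists of rows; the helper names `mat_add` and `scalar_mul` are our own, not from the text.

```python
def mat_add(A, B):
    """Entrywise sum: c_ij = a_ij + b_ij (A and B must have the same shape)."""
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def scalar_mul(alpha, A):
    """Scalar product: c_ij = alpha * a_ij."""
    return [[alpha * a for a in row] for row in A]

A = [[1, 3], [2, 0]]
B = [[0, 0], [1, 2]]
print(mat_add(A, B))     # [[1, 3], [3, 2]]
print(scalar_mul(2, A))  # [[2, 6], [4, 0]]
```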

Theorem 1.5: Matrix Representations

Let f, g: V → W be linear maps represented with respect to bases B, D by F = Rep_{B,D}(f) and G = Rep_{B,D}(g).

Then Rep_{B,D}(f + g) = F + G and Rep_{B,D}(α f) = α F, where α ∈ ℝ.

Proof: Do it yourself.


Example 1.1: Let f, g: ℝ² → ℝ³ be represented with respect to bases B, D by

F = Rep_{B,D}(f) = \begin{pmatrix} 1 & 3 \\ 2 & 0 \\ 1 & 0 \end{pmatrix},   G = Rep_{B,D}(g) = \begin{pmatrix} 0 & 0 \\ -1 & -2 \\ 2 & 4 \end{pmatrix}

Then for Rep_B(v) = \begin{pmatrix} v_1 \\ v_2 \end{pmatrix}:

Rep_D(f(v)) = F \, Rep_B(v) = \begin{pmatrix} v_1 + 3v_2 \\ 2v_1 \\ v_1 \end{pmatrix},   Rep_D(g(v)) = G \, Rep_B(v) = \begin{pmatrix} 0 \\ -v_1 - 2v_2 \\ 2v_1 + 4v_2 \end{pmatrix}

so that

Rep_D((f+g)(v)) = Rep_D(f(v)) + Rep_D(g(v)) = \begin{pmatrix} v_1 + 3v_2 \\ v_1 - 2v_2 \\ 3v_1 + 4v_2 \end{pmatrix} = (F + G) \, Rep_B(v)

where

F + G = \begin{pmatrix} 1 & 3 \\ 1 & -2 \\ 3 & 4 \end{pmatrix}

That is, H = F + G represents h = f + g with respect to B, D: Rep_{B,D}(h) = H.


Exercise 3.III.1.

1. The trace of a square matrix is the sum of the entries on the main diagonal (the 1,1 entry plus the 2,2 entry, etc.; we will see the significance of the trace in Chapter Five). Show that trace(H + G) = trace(H) + trace(G). Is there a similar result for scalar multiplication?
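A numeric sanity check of the exercise's claim, together with the scalar analogue trace(αH) = α·trace(H); a sketch with matrices as lists of rows.

```python
def trace(M):
    # sum of the main-diagonal entries
    return sum(M[i][i] for i in range(len(M)))

H = [[1, 2], [3, 4]]
G = [[5, 6], [7, 8]]
H_plus_G = [[h + g for h, g in zip(rh, rg)] for rh, rg in zip(H, G)]
assert trace(H_plus_G) == trace(H) + trace(G)

alpha_H = [[3 * h for h in row] for row in H]
assert trace(alpha_H) == 3 * trace(H)  # the scalar-multiplication analogue
print("trace checks pass")
```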


3.III.2. Matrix Multiplication

Lemma 2.1: A composition of linear maps is linear. That is, if h: V → W and g: W → U are linear maps, then g ∘ h: V → U is a linear map.

Proof: Composition of linear maps preserves linear combinations. (see Hefferon, p.215)

Definition 2.3: Matrix-Multiplicative Product

The matrix-multiplicative product of the m×r matrix G and the r×n matrix H is the m×n matrix P, where

p_{ij} = \sum_{k=1}^{r} g_{ik} h_{kj}

i.e., entry i, j of P = G H is the product of row i of G with column j of H:

p_{ij} = \begin{pmatrix} g_{i1} & g_{i2} & \cdots & g_{ir} \end{pmatrix} \begin{pmatrix} h_{1j} \\ h_{2j} \\ \vdots \\ h_{rj} \end{pmatrix}

and column j of P is G times column j of H.
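Definition 2.3 translated literally into code (a sketch; `mat_mul` is our name): p_ij is the sum over k of g_ik·h_kj. The example values reuse the matrices of Example 2.2.

```python
def mat_mul(G, H):
    """Product of an m x r matrix G and an r x n matrix H."""
    m, r, n = len(G), len(H), len(H[0])
    assert len(G[0]) == r, "columns of G must equal rows of H"
    return [[sum(G[i][k] * H[k][j] for k in range(r)) for j in range(n)]
            for i in range(m)]

G = [[1, 1], [0, 1], [1, 0]]      # 3x2
H = [[4, 6, 8, 2], [5, 7, 9, 3]]  # 2x4
print(mat_mul(G, H))  # [[9, 13, 17, 5], [5, 7, 9, 3], [4, 6, 8, 2]]
```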


Theorem 2.6: A composition of linear maps is represented by the matrix product of the representatives.

Proof:

Let h: V^n → W^r and g: W^r → U^m be linear maps with representations

H = Rep_{B,C}(h),   G = Rep_{C,D}(g)

where B, C, and D are bases of V, W, and U, respectively.

Let v ∈ V with Rep_B(v) = (v_1, …, v_n)^T.

If w = h(v) ∈ W, then Rep_C(w) = Rep_{B,C}(h) \, Rep_B(v) = H \, Rep_B(v).

If x = g(w) = (g ∘ h)(v) ∈ U, then Rep_D(x) = Rep_{C,D}(g) \, Rep_C(w) = G \, Rep_C(w) = G H \, Rep_B(v).

Componentwise, with w_k = ( H \, Rep_B(v) )_k:

( G H \, Rep_B(v) )_i = \sum_{k=1}^{r} g_{ik} w_k = \sum_{k=1}^{r} g_{ik} \sum_{j=1}^{n} h_{kj} v_j = \sum_{j=1}^{n} \Big( \sum_{k=1}^{r} g_{ik} h_{kj} \Big) v_j = \sum_{j=1}^{n} (GH)_{ij} \, v_j

→ Rep_D((g ∘ h)(v)) = (GH) \, Rep_B(v) for every v ∈ V

∴ Rep_{B,D}(g ∘ h) = G H = Rep_{C,D}(g) \, Rep_{B,C}(h)    QED

Example 2.2, 2.4:

Let h: ℝ⁴ → ℝ² and g: ℝ² → ℝ³ with bases B ⊂ ℝ⁴, C ⊂ ℝ², D ⊂ ℝ³ so that

H = Rep_{B,C}(h) = \begin{pmatrix} 4 & 6 & 8 & 2 \\ 5 & 7 & 9 & 3 \end{pmatrix},   G = Rep_{C,D}(g) = \begin{pmatrix} 1 & 1 \\ 0 & 1 \\ 1 & 0 \end{pmatrix}

→ F = Rep_{B,D}(g ∘ h) = G H = \begin{pmatrix} 1 & 1 \\ 0 & 1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} 4 & 6 & 8 & 2 \\ 5 & 7 & 9 & 3 \end{pmatrix} = \begin{pmatrix} 9 & 13 & 17 & 5 \\ 5 & 7 & 9 & 3 \\ 4 & 6 & 8 & 2 \end{pmatrix}


Let Rep_B(v) = (v_1, v_2, v_3, v_4)^T. Then

Rep_C(w) = H \, Rep_B(v) = \begin{pmatrix} 4 & 6 & 8 & 2 \\ 5 & 7 & 9 & 3 \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \\ v_3 \\ v_4 \end{pmatrix} = \begin{pmatrix} 4v_1 + 6v_2 + 8v_3 + 2v_4 \\ 5v_1 + 7v_2 + 9v_3 + 3v_4 \end{pmatrix}

Rep_D(x) = G \, Rep_C(w) = \begin{pmatrix} 1 & 1 \\ 0 & 1 \\ 1 & 0 \end{pmatrix} Rep_C(w) = \begin{pmatrix} 9v_1 + 13v_2 + 17v_3 + 5v_4 \\ 5v_1 + 7v_2 + 9v_3 + 3v_4 \\ 4v_1 + 6v_2 + 8v_3 + 2v_4 \end{pmatrix}

and directly

Rep_D(x) = (G H) \, Rep_B(v) = \begin{pmatrix} 9 & 13 & 17 & 5 \\ 5 & 7 & 9 & 3 \\ 4 & 6 & 8 & 2 \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \\ v_3 \\ v_4 \end{pmatrix} = \begin{pmatrix} 9v_1 + 13v_2 + 17v_3 + 5v_4 \\ 5v_1 + 7v_2 + 9v_3 + 3v_4 \\ 4v_1 + 6v_2 + 8v_3 + 2v_4 \end{pmatrix}

That (G H) \, Rep_B(v) = G \, ( H \, Rep_B(v) ) suggests that matrix products are associative. To be proved in Theorem 2.12 below.

Rep_{B,D}(g ∘ h) = G H = Rep_{C,D}(g) \, Rep_{B,C}(h)

Matrix dimensions: m×r times r×n equals m×n.

Example 2.7: Dimension mismatch. This product is not defined, since a 2×3 matrix cannot be multiplied by a 2×2 matrix:

\begin{pmatrix} -1 & 2 & 0 \\ 0 & 10 & 1.1 \end{pmatrix} \begin{pmatrix} 0 & 0 \\ 0 & 2 \end{pmatrix}

The order matches the composition: in (g ∘ h)(v) = g(h(v)), h is applied first, so G H acts on v as G (H v).


Example 2.9: Matrix multiplication need not commute.

\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} = \begin{pmatrix} 19 & 22 \\ 43 & 50 \end{pmatrix}   but   \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = \begin{pmatrix} 23 & 34 \\ 31 & 46 \end{pmatrix}

Example 2.10: The product may even fail to be defined in the other order:

\begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} \begin{pmatrix} 1 & 2 & 0 \\ 3 & 4 & 0 \end{pmatrix} = \begin{pmatrix} 23 & 34 & 0 \\ 31 & 46 & 0 \end{pmatrix}

but \begin{pmatrix} 1 & 2 & 0 \\ 3 & 4 & 0 \end{pmatrix} \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} is not defined.

Theorem 2.12: Matrix products, where defined, are associative,

(FG)H = F(GH)

and distribute over matrix addition,

F(G + H) = FG + FH and (G + H)F = GF + HF.

Proof: Hefferon p. 219, via the component-summation formulas or the corresponding composite maps.
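Examples 2.9-2.10 and Theorem 2.12 can be checked on small concrete matrices (a sketch; the 2×2 values are from Example 2.9, the third matrix is our own choice):

```python
def mul(A, B):
    # matrix product via row-times-column sums
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C = [[2, 0], [1, 1]]

assert mul(A, B) == [[19, 22], [43, 50]]
assert mul(B, A) == [[23, 34], [31, 46]]
assert mul(A, B) != mul(B, A)                          # need not commute
assert mul(mul(A, B), C) == mul(A, mul(B, C))          # associative
assert mul(A, add(B, C)) == add(mul(A, B), mul(A, C))  # distributive
print("all identities check out")
```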


Exercises 3.III.2.

1. Represent the derivative map d/dx on P_n with respect to B, B where B is the natural basis ⟨1, x, …, xⁿ⟩. Show that the product of this matrix with itself is defined. What map does it represent?

2. (This will be used in the Matrix Inverses exercises.) Here is another property of matrix multiplication that might be puzzling at first sight.

(a) Prove that the composition of the projections π_x, π_y: ℝ³ → ℝ³ onto the x and y axes is the zero map despite that neither one is itself the zero map.

(b) Prove that the composition of the derivatives d²/dx², d³/dx³: P₄ → P₄ is the zero map despite that neither is the zero map.

(c) Give a matrix equation representing the first fact.

(d) Give a matrix equation representing the second.

When two things multiply to give zero despite that neither is zero, each is said to be a zero divisor.


3. The infinite-dimensional space P of all finite-degree polynomials gives a memorable example of the non-commutativity of linear maps. Let d/dx: P → P be the usual derivative and let s: P → P be the shift map

s( a_0 + a_1 x + \cdots + a_n x^n ) = a_0 x + a_1 x^2 + \cdots + a_n x^{n+1}

Show that the two maps don't commute: d/dx ∘ s ≠ s ∘ d/dx. In fact, not only is d/dx ∘ s − s ∘ d/dx not the zero map, it is the identity map.
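Exercise 3 can be verified on coefficient lists [a0, a1, …, an] (a sketch; the representation choice is ours): the derivative map drops a0 and scales, the shift map prepends a zero.

```python
def deriv(p):
    # d/dx of a0 + a1 x + ... + an x^n, as a coefficient list
    return [i * p[i] for i in range(1, len(p))]

def shift(p):
    # s: a0 + a1 x + ... -> a0 x + a1 x^2 + ...
    return [0] + p

p = [7, 1, 4]              # 7 + x + 4x^2
ds = deriv(shift(p))       # d/dx(7x + x^2 + 4x^3) = 7 + 2x + 12x^2
sd = shift(deriv(p))       # s(1 + 8x) = x + 8x^2
assert ds != sd            # the maps do not commute
assert [a - b for a, b in zip(ds, sd)] == p  # d/dx . s - s . d/dx = identity
print(ds, sd)
```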


3.III.3. Mechanics of Matrix Multiplication

(AB)_{ij} = ( Row i of A ) · ( Column j of B )

Lemma 3.7: ( Row i of A ) B = ( Row i of AB )

e.g., with G = \begin{pmatrix} 1 & 1 \\ 0 & 1 \\ 1 & 0 \end{pmatrix} and H = \begin{pmatrix} 4 & 6 & 8 & 2 \\ 5 & 7 & 9 & 3 \end{pmatrix} from Example 2.2:

( Row 2 of G ) H = \begin{pmatrix} 0 & 1 \end{pmatrix} \begin{pmatrix} 4 & 6 & 8 & 2 \\ 5 & 7 & 9 & 3 \end{pmatrix} = \begin{pmatrix} 5 & 7 & 9 & 3 \end{pmatrix} = Row 2 of GH

G ( Column 2 of H ) = \begin{pmatrix} 1 & 1 \\ 0 & 1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} 6 \\ 7 \end{pmatrix} = \begin{pmatrix} 13 \\ 7 \\ 6 \end{pmatrix} = Column 2 of GH

A ( Column j of B ) = ( Column j of AB )
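Lemma 3.7 checked numerically on the matrices of Example 2.2 (a sketch; `mul` is our helper):

```python
def mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

G = [[1, 1], [0, 1], [1, 0]]
H = [[4, 6, 8, 2], [5, 7, 9, 3]]
GH = mul(G, H)

i = 1  # second row (zero-indexed)
assert mul([G[i]], H)[0] == GH[i]                           # (row i of G) H = row i of GH
j = 2  # third column (zero-indexed)
assert mul(G, [[r[j]] for r in H]) == [[r[j]] for r in GH]  # G (col j of H) = col j of GH
print("Lemma 3.7 holds on this example")
```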


Definition 3.2: Unit Matrix

A matrix that is all zeroes except for a one in the i, j entry is an i, j unit matrix.

Let U be an i, j unit matrix. Then

• U A → copies row j of A into row i of a zero matrix of the appropriate dimensions.

• A U → copies column i of A into column j of a zero matrix of the appropriate dimensions.

e.g., with the 1,2 unit matrix U:

U A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} 5 & 6 & 7 \\ 8 & 9 & 4 \end{pmatrix} = \begin{pmatrix} 8 & 9 & 4 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}   (row 2 of A copied to row 1)

A U = \begin{pmatrix} 5 & 6 & 7 \\ 8 & 9 & 4 \end{pmatrix} \begin{pmatrix} 0 & 1 \\ 0 & 0 \\ 0 & 0 \end{pmatrix} = \begin{pmatrix} 0 & 5 \\ 0 & 8 \end{pmatrix}   (column 1 of A copied to column 2)


Proof: Let U be the i, j unit matrix, so that u_{kl} = δ_{ki} δ_{lj}. Then

( U A )_{kl} = \sum_m u_{km} a_{ml} = \sum_m δ_{ki} δ_{mj} a_{ml} = δ_{ki} a_{jl}

so ( U A )_{kl} = a_{jl} if k = i and 0 otherwise: row i of UA is row j of A, and all other rows are zero.

( A U )_{kl} = \sum_m a_{km} u_{ml} = \sum_m a_{km} δ_{mi} δ_{lj} = a_{ki} δ_{lj}

so ( A U )_{kl} = a_{ki} if l = j and 0 otherwise: column j of AU is column i of A, and all other columns are zero.
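The copying behaviour of unit matrices, sketched in code (zero-indexed; `unit` and `mul` are our helpers):

```python
def mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def unit(m, n, i, j):
    """m x n matrix with a one in (zero-indexed) entry i, j and zeroes elsewhere."""
    return [[1 if (r, c) == (i, j) else 0 for c in range(n)] for r in range(m)]

A = [[5, 6, 7], [8, 9, 4]]
U = unit(3, 2, 0, 1)  # the "1,2" unit matrix in the text's one-indexed notation

print(mul(U, A))  # [[8, 9, 4], [0, 0, 0], [0, 0, 0]] -- row 2 of A copied to row 1
print(mul(A, U))  # [[0, 5], [0, 8]]                  -- column 1 of A copied to column 2
```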


Definition 3.8: Main Diagonal

The main diagonal (or principal diagonal, or simply diagonal) of a square matrix goes from the upper left to the lower right.

Definition 3.9: Identity Matrix

I_{n×n} = \begin{pmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{pmatrix},   ( I_{n×n} )_{ij} = δ_{ij}

A_{m×n} I_{n×n} = A_{m×n},   I_{m×m} A_{m×n} = A_{m×n}

Definition 3.12: Diagonal Matrix

A diagonal matrix is square and has zeroes off the main diagonal:

A_{n×n} = \begin{pmatrix} a_{11} & 0 & \cdots & 0 \\ 0 & a_{22} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & a_{nn} \end{pmatrix} = diag( a_{11}, a_{22}, …, a_{nn} ),   ( A_{n×n} )_{ij} = a_{ii} δ_{ij}


Definition 3.14: Permutation Matrix

A permutation matrix is square and is all zeroes except for a single one in each row and column. From the left (right) these matrices permute rows (columns).

\begin{pmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix} \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{pmatrix} = \begin{pmatrix} 7 & 8 & 9 \\ 1 & 2 & 3 \\ 4 & 5 & 6 \end{pmatrix}   (3rd row to 1st, 1st row to 2nd, 2nd row to 3rd)

\begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{pmatrix} \begin{pmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix} = \begin{pmatrix} 2 & 3 & 1 \\ 5 & 6 & 4 \\ 8 & 9 & 7 \end{pmatrix}   (1st column to 3rd, 2nd column to 1st, 3rd column to 2nd)
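A quick numeric check of the permutation-matrix rule (left multiplication permutes rows, right multiplication permutes columns):

```python
def mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

P = [[0, 0, 1],
     [1, 0, 0],
     [0, 1, 0]]
A = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]

print(mul(P, A))  # [[7, 8, 9], [1, 2, 3], [4, 5, 6]] -- rows permuted
print(mul(A, P))  # [[2, 3, 1], [5, 6, 4], [8, 9, 7]] -- columns permuted
```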


Definition 3.18: Elementary Reduction Matrices

The elementary reduction matrices are obtained from identity matrices by one Gaussian row operation. We denote them:

1. I →^{k ρ_i} M_i(k)   (multiply row i by k ≠ 0)

2. I →^{ρ_i ↔ ρ_j} P_{i,j}   (swap rows i and j)

3. I →^{k ρ_i + ρ_j} C_{i,j}(k)   (add k times row i to row j)

e.g., for 3×3 matrices:

M_2(3) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 3 & 0 \\ 0 & 0 & 1 \end{pmatrix},   P_{2,3} = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{pmatrix},   C_{2,3}(3) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 3 & 1 \end{pmatrix}

C_{2,3}(3) A = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 3 & 1 \end{pmatrix} \begin{pmatrix} a & b \\ c & d \\ e & f \end{pmatrix} = \begin{pmatrix} a & b \\ c & d \\ 3c + e & 3d + f \end{pmatrix}


Gauss's method and Gauss-Jordan reduction can each be accomplished by a single matrix, the product of elementary reduction matrices.

Corollary 3.22: For any matrix H there are elementary reduction matrices R_1, …, R_r such that R_r R_{r−1} ⋯ R_1 H is in reduced echelon form.
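Corollary 3.22 on a concrete 2×2 example (a sketch; the particular matrix and reduction steps are our own choice): the product of the elementary reduction matrices for each Gauss-Jordan step reduces A to the identity.

```python
def mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

P12 = [[0, 1], [1, 0]]    # P_{1,2}: swap rows 1 and 2
M2  = [[1, 0], [0, 0.5]]  # M_2(1/2): multiply row 2 by 1/2
C21 = [[1, -3], [0, 1]]   # C_{2,1}(-3): add -3 times row 2 to row 1

A = [[0, 2], [1, 3]]
R = mul(C21, mul(M2, P12))  # R3 R2 R1
print(mul(R, A))            # the identity matrix: reduced echelon form
```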


Exercises 3.III.3

1. The need to take linear combinations of rows and columns in tables of numbers arises often in practice. For instance, this is a map of part of Vermont and New York.

In part because of Lake Champlain, there are no roads directly connecting some pairs of towns. For instance, there is no way to go from Winooski to Grand Isle without going through Colchester. (Of course, many other roads and towns have been left off to simplify the graph. From top to bottom of this map is about forty miles.)


(a) The incidence matrix of a map is the square matrix whose i, j entry is the number of roads from city i to city j. Produce the incidence matrix of this map (take the cities in alphabetical order).

(b) A matrix is symmetric if it equals its transpose. Show that an incidence matrix is symmetric. (These are all two-way streets. Vermont doesn't have many one-way streets.)

(c) What is the significance of the square of the incidence matrix? The cube?

2. The trace of a square matrix is the sum of the entries on its diagonal (its significance appears in Chapter Five). Show that trace(GH) = trace(HG).

3. A square matrix is a Markov matrix if each entry is between zero and one and the sum along each row is one. Prove that a product of Markov matrices is Markov.
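Exercise 3 checked numerically on two small Markov matrices (the particular values are our own choice):

```python
def mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

G = [[0.2, 0.8], [0.5, 0.5]]
H = [[0.9, 0.1], [0.3, 0.7]]
GH = mul(G, H)

for row in GH:
    assert all(0 <= entry <= 1 for entry in row)
    assert abs(sum(row) - 1) < 1e-12  # each row of the product again sums to one
print(GH)
```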


3.III.4. Inverses

Example 4.1:

Let π: ℝ³ → ℝ² be the projection map

\begin{pmatrix} x \\ y \\ z \end{pmatrix} ↦ \begin{pmatrix} x \\ y \end{pmatrix}

and η: ℝ² → ℝ³ be the embedding

\begin{pmatrix} x \\ y \end{pmatrix} ↦ \begin{pmatrix} x \\ y \\ 0 \end{pmatrix}

The composition π ∘ η: ℝ² → ℝ² is the identity map on ℝ², π ∘ η = id:

\begin{pmatrix} x \\ y \end{pmatrix} ↦ \begin{pmatrix} x \\ y \\ 0 \end{pmatrix} ↦ \begin{pmatrix} x \\ y \end{pmatrix}

π is a left inverse map of η; η is a right inverse map of π.

The composition η ∘ π: ℝ³ → ℝ³ is not the identity map on ℝ³:

\begin{pmatrix} x \\ y \\ z \end{pmatrix} ↦ \begin{pmatrix} x \\ y \end{pmatrix} ↦ \begin{pmatrix} x \\ y \\ 0 \end{pmatrix}

π has no left inverse; η has no right inverse.
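Example 4.1 in matrix form, with respect to the standard bases (P and E are the representatives of π and η):

```python
def mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

P = [[1, 0, 0],
     [0, 1, 0]]   # pi: R^3 -> R^2, drop the z coordinate
E = [[1, 0],
     [0, 1],
     [0, 0]]      # eta: R^2 -> R^3, append a zero

print(mul(P, E))  # [[1, 0], [0, 1]] -- the identity on R^2
print(mul(E, P))  # [[1, 0, 0], [0, 1, 0], [0, 0, 0]] -- not the identity on R^3
```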


The zero map has neither left nor right inverse.

Definition 4.2: Left / Right Inverse & Invertible Matrices

A matrix G is a left inverse matrix of the matrix H if GH = I. It is a right inverse matrix if HG = I. A matrix H with a two-sided inverse is an invertible matrix. That two-sided inverse is called the inverse matrix and is denoted H⁻¹.

Lemma 4.3:If a matrix has both a left inverse and a right inverse then the two are equal.

Theorem 4.4: A matrix is invertible if and only if it is nonsingular (hence square).

Proof: The statement holds for the corresponding linear maps: a map has a two-sided inverse if and only if it is one-to-one and onto.

Lemma 4.5: A product of invertible matrices is invertible: ( GH )⁻¹ = H⁻¹ G⁻¹.

Proof: The statement holds for the corresponding linear maps.


Lemma 4.8: A matrix is invertible if and only if it can be written as the product of elementary reduction matrices. The inverse can be computed by applying to I the same row steps, in the same order, as are used to Gauss-Jordan reduce the invertible matrix.

Proof: An invertible matrix is row equivalent to I. Let R be the product of the required elementary reduction matrices, so that R A = I. Then

A X = I → R A X = R I → X = R I = R

so the inverse A⁻¹ = R is obtained by applying the same row steps to I.
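Lemma 4.8 as an algorithm sketch (our own implementation): Gauss-Jordan reduce the augmented matrix [A | I]; the row steps that turn A into I simultaneously turn I into A⁻¹. No pivoting refinements; it assumes A is invertible.

```python
def inverse(A):
    n = len(A)
    # build the augmented matrix [A | I]
    M = [row[:] + [1 if i == j else 0 for j in range(n)]
         for i, row in enumerate(A)]
    for col in range(n):
        # swap a nonzero pivot into place (a P_{i,j} step)
        piv = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        # scale the pivot row (an M_i(k) step)
        p = M[col][col]
        M[col] = [x / p for x in M[col]]
        # clear the rest of the column (C_{i,j}(k) steps)
        for r in range(n):
            if r != col:
                f = M[r][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [row[n:] for row in M]

A = [[2, 1], [7, 4]]  # determinant 1, so the inverse is integer-valued
print(inverse(A))     # [[4.0, -1.0], [-7.0, 2.0]]
```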


Example 4.10:

Example 4.11: A Non-Invertible Matrix

Corollary 4.12: The inverse of a 2×2 matrix \begin{pmatrix} a & b \\ c & d \end{pmatrix} exists and equals

\frac{1}{ad - bc} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}

if and only if ad − bc ≠ 0.
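Corollary 4.12 as a function (a sketch; `inverse_2x2` is our name):

```python
def inverse_2x2(a, b, c, d):
    det = a * d - b * c
    if det == 0:
        return None  # ad - bc = 0: no inverse exists
    return [[d / det, -b / det],
            [-c / det, a / det]]

print(inverse_2x2(2, 1, 7, 4))  # [[4.0, -1.0], [-7.0, 2.0]]
print(inverse_2x2(1, 2, 2, 4))  # None
```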

A non-square matrix may have a one-sided inverse. For example, a 2×3 matrix H and a 3×2 matrix G can satisfy

H G = I_{2×2} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}

but never

G H = I_{3×3} = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}

since the rank of GH is at most 2. A right inverse of H (equivalently, a left inverse of G) exists; a left inverse of H (a right inverse of G) does not exist.


Exercises 3.III.4

1. Show that the matrix

\begin{pmatrix} 1 & 0 & 1 \\ 0 & 1 & 0 \end{pmatrix}

has infinitely many right inverses. Show also that it has no left inverse.

2. Show that if T is square and if T⁴ is the zero matrix then

( I − T )⁻¹ = I + T + T² + T³

Generalize.

3. Prove: if the sum of the elements in each row of a square matrix is k (with k ≠ 0), then the sum of the elements in each row of the inverse matrix is 1/k.
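Exercise 2 above can be checked on a concrete nilpotent matrix (the choice of T is ours; it is strictly upper triangular, so T³ = 0 and hence T⁴ = 0):

```python
def mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
T = [[0, 1, 0], [0, 0, 1], [0, 0, 0]]  # T^3 = 0, so T^4 = 0 as well

I_minus_T = [[i - t for i, t in zip(ri, rt)] for ri, rt in zip(I, T)]
S = add(add(I, T), add(mul(T, T), mul(mul(T, T), T)))  # I + T + T^2 + T^3

assert mul(I_minus_T, S) == I and mul(S, I_minus_T) == I
print("(I - T)^-1 equals I + T + T^2 + T^3 on this example")
```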