Introduction to Linear Algebra
Mark Goldman, Emily Mackevicius

TRANSCRIPT
Outline
1. Matrix arithmetic
2. Matrix properties
3. Eigenvectors & eigenvalues
-BREAK-
4. Examples (on blackboard)
5. Recap, additional matrix properties, SVD
Part 1: Matrix Arithmetic (with applications to neural networks)
Matrix addition
Scalar times vector
Product of 2 Vectors
Three ways to multiply:
• Element-by-element
• Inner product
• Outer product
Element-by-element product (Hadamard product)
• Element-wise multiplication (.* in MATLAB)
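A quick sketch of the element-wise product (the slides use MATLAB's `.*`; here is the NumPy analogue, with illustrative numbers):

```python
import numpy as np

# Element-by-element (Hadamard) product: multiply matching entries.
# MATLAB's .* operator corresponds to NumPy's * on arrays.
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

hadamard = a * b   # [1*4, 2*5, 3*6]
print(hadamard)    # [ 4. 10. 18.]
```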
Dot product (inner product)
(1 × N) times (N × 1) gives (1 × 1)
• MATLAB: ‘inner matrix dimensions must agree’
• The outer dimensions give the size of the resulting matrix
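A NumPy sketch of the dimension rule (illustrative numbers, not from the slides):

```python
import numpy as np

# Dot (inner) product: (1 x N) times (N x 1) gives (1 x 1).
# The inner dimensions must agree, as MATLAB's error message says.
row = np.array([[1.0, 2.0, 3.0]])      # 1 x 3
col = np.array([[4.0], [5.0], [6.0]])  # 3 x 1

result = row @ col                     # 1 x 1: 1*4 + 2*5 + 3*6
print(result)                          # [[32.]]
```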
Dot product geometric intuition: the "overlap" of 2 vectors (a · b = |a||b| cos θ)
Example: linear feed-forward network
• Insight: for a given input magnitude (L2 norm), the response is maximized when the input is parallel to the weight vector
• Receptive fields can also be thought of this way
[Figure: input neurons with firing rates r1, r2, …, rn connect through synaptic weights onto an output neuron; the output firing rate is the dot product of the weight vector with the input rate vector.]
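The feed-forward computation above can be sketched in NumPy (the weights and rates here are illustrative, not from the slides):

```python
import numpy as np

# Linear feed-forward network: the output rate is the dot product of
# the synaptic weight vector w with the input firing-rate vector r.
w = np.array([0.5, 1.0, 2.0])   # synaptic weights (illustrative)
r = np.array([10.0, 5.0, 1.0])  # input firing rates (illustrative)

output_rate = np.dot(w, r)
print(output_rate)  # 12.0

# For a fixed input magnitude, the response is largest when r is
# parallel to w:
r_parallel = np.linalg.norm(r) * w / np.linalg.norm(w)
print(np.dot(w, r_parallel) >= output_rate)  # True
```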
Multiplication: Outer product
(N × 1) times (1 × M) gives (N × M)
• Note: each column is a multiple of the other columns, and each row a multiple of the other rows (so an outer product has rank 1)
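A small NumPy sketch of the outer product and its rank-1 structure (illustrative vectors):

```python
import numpy as np

# Outer product: (N x 1) times (1 x M) gives an N x M matrix.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0])

outer = np.outer(u, v)
print(outer)
# [[ 4.  5.]
#  [ 8. 10.]
#  [12. 15.]]

# Every column is a multiple of u, so the matrix has rank 1:
print(np.linalg.matrix_rank(outer))  # 1
```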
Matrix times vector
(M × N) times (N × 1) gives (M × 1)
Matrix times vector: inner product interpretation
• Rule: the ith element of y is the dot product of the ith row of W with x
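The row-by-row rule can be checked directly in NumPy (W and x are illustrative):

```python
import numpy as np

# y = W x: the ith element of y is the dot product of the ith row of W with x.
W = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])   # 3 x 2
x = np.array([10.0, 1.0])    # length 2

y = W @ x
print(y)  # [12. 34. 56.]

# Same thing, row by row:
for i in range(W.shape[0]):
    assert y[i] == np.dot(W[i, :], x)
```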
Matrix times vector: outer product interpretation
• The product is a weighted sum of the columns of W, weighted by the entries of x
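The column-weighting view gives the same answer (same illustrative W and x as the inner-product view):

```python
import numpy as np

# y = W x as a weighted sum of the columns of W, weighted by the entries of x.
W = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
x = np.array([10.0, 1.0])

y_columns = x[0] * W[:, 0] + x[1] * W[:, 1]
print(y_columns)                      # [12. 34. 56.]
print(np.allclose(y_columns, W @ x))  # True
```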
Example of the outer product method
With columns (3,1) and (0,2): the first column contributes (3,1), scaling the second column by 2 gives (0,4), and the sum is (3,1) + (0,4) = (3,5).

• Note: different combinations of the columns of M can give you any vector in the plane (we say the columns of M "span" the plane)
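A NumPy sketch of this example (M built from the slides' columns (3,1) and (0,2); the weights (1,2) reproduce the (3,5) result, and the target vector is an illustrative assumption):

```python
import numpy as np

# Column view of M x: weights (1, 2) give 1*(3,1) + 2*(0,2) = (3,5).
M = np.array([[3.0, 0.0],
              [1.0, 2.0]])   # columns are (3,1) and (0,2)
x = np.array([1.0, 2.0])

print(M @ x)  # [3. 5.]

# The columns span the plane, so any target vector is reachable:
target = np.array([7.0, -1.0])
weights = np.linalg.solve(M, target)
print(np.allclose(M @ weights, target))  # True
```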
Rank of a Matrix
• Are there special matrices whose columns don’t span the full plane?
With columns (1,2) and (-2,-4), the second column is just -2 times the first.

• You can only get vectors along the (1,2) direction (i.e. outputs live in 1 dimension, so we call the matrix rank 1)
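Checking the rank-1 example in NumPy (the test vector x is an illustrative assumption):

```python
import numpy as np

# A matrix whose columns don't span the plane: column 2 = -2 * column 1.
M = np.array([[1.0, -2.0],
              [2.0, -4.0]])

print(np.linalg.matrix_rank(M))  # 1

# Every output M @ x lies along the (1, 2) direction:
x = np.array([5.0, 3.0])
y = M @ x
print(y)  # [-1. -2.], which is -1 * (1, 2)
```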
Example: 2-layer linear network
• Wij is the connection strength (weight) onto neuron yi from neuron xj.
Example: 2-layer linear network: inner product point of view
• What is the response of cell yi of the second layer?
• The response is the dot product of the ith row of W with the vector x
Example: 2-layer linear network: outer product point of view
• How does cell xj contribute to the pattern of firing of layer 2?
The contribution of xj to the network output is xj times the jth column of W (in the figure, x1 scales the 1st column of W).
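Both points of view on the 2-layer network, in a NumPy sketch (weights and rates are illustrative assumptions):

```python
import numpy as np

# Two-layer linear network y = W x.
# W[i, j] is the weight onto neuron y_i from neuron x_j.
W = np.array([[0.2, 0.8],
              [0.5, 0.1],
              [1.0, 0.0]])
x = np.array([2.0, 10.0])

# Inner-product view: the response of cell y_i is (row i of W) . x
print(W @ x)  # [8.4 2.  2. ]

# Outer-product view: cell x_j contributes x_j * (column j of W)
contribution_x0 = x[0] * W[:, 0]
contribution_x1 = x[1] * W[:, 1]
print(np.allclose(contribution_x0 + contribution_x1, W @ x))  # True
```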
Product of 2 Matrices
• MATLAB: ‘inner matrix dimensions must agree’
• Note: matrix multiplication doesn't (generally) commute: AB ≠ BA
(N × P) times (P × M) gives (N × M)
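A NumPy sketch of the matrix product and its non-commutativity (illustrative matrices):

```python
import numpy as np

# Product of two matrices: (N x P) times (P x M) gives N x M,
# and in general AB != BA.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

print(A @ B)
# [[2. 1.]
#  [4. 3.]]
print(np.array_equal(A @ B, B @ A))  # False
```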
Matrix times matrix: by inner products

• Cij is the inner product of the ith row of A with the jth column of B
Matrix times matrix: by outer products
• C is a sum of outer products of the columns of A with the rows of B
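Both decompositions of the matrix product can be checked in NumPy (illustrative matrices):

```python
import numpy as np

# C = A B two ways: entries as inner products, and C as a sum of
# outer products of the columns of A with the rows of B.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

C = A @ B

# Inner-product view: C[i, j] = (row i of A) . (column j of B)
assert C[0, 1] == np.dot(A[0, :], B[:, 1])

# Outer-product view: sum over k of outer(column k of A, row k of B)
C_outer = sum(np.outer(A[:, k], B[k, :]) for k in range(A.shape[1]))
print(np.allclose(C, C_outer))  # True
```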
Part 2: Matrix Properties
• (A few) special matrices
• Matrix transformations & the determinant
• Matrices & systems of algebraic equations
Special matrices: diagonal matrix
• Multiplying by a diagonal matrix acts like scalar multiplication along each coordinate axis
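A minimal NumPy sketch (illustrative diagonal entries):

```python
import numpy as np

# A diagonal matrix scales each coordinate independently,
# like per-axis scalar multiplication.
D = np.diag([2.0, 3.0])
x = np.array([1.0, 1.0])

print(D @ x)  # [2. 3.]
```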
Special matrices: identity matrix
• I x = x for all x (ones on the diagonal, zeros elsewhere)
Special matrices: inverse matrix

• The inverse satisfies M⁻¹M = M M⁻¹ = I
• Does the inverse always exist?
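A NumPy sketch of the inverse, and of a matrix that has none (illustrative matrices; the singular one reuses the rank-1 example):

```python
import numpy as np

# The inverse undoes the matrix: inv(M) @ M = I, when det(M) != 0.
M = np.array([[3.0, 0.0],
              [1.0, 2.0]])

M_inv = np.linalg.inv(M)
print(np.allclose(M_inv @ M, np.eye(2)))  # True

# A singular matrix (det = 0) has no inverse:
S = np.array([[1.0, -2.0],
              [2.0, -4.0]])
print(np.linalg.det(S))  # 0 (up to round-off), so inv(S) would fail
```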
How does a matrix transform a square?

The unit square with sides (1,0) and (0,1) is mapped to the parallelogram whose sides are the columns of the matrix: (1,0) → (3,1) and (0,1) → (0,2).
Geometric definition of the determinant: How does a matrix transform a square?
• The determinant is the (signed) area of the parallelogram that the unit square is mapped to
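A NumPy check of the geometric definition, using the matrix with columns (3,1) and (0,2) from the square example:

```python
import numpy as np

# The determinant is the (signed) area scale factor of the transformation.
# M maps the unit square to a parallelogram with sides (3,1) and (0,2).
M = np.array([[3.0, 0.0],
              [1.0, 2.0]])

print(np.linalg.det(M))  # 6 (up to round-off): the parallelogram's area
```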
Example: solve the algebraic equation M x = b

• If det(M) ≠ 0, multiply both sides by the inverse: x = M⁻¹ b
Example of an underdetermined system
• Some non-zero vectors are sent to 0
• Some non-zero x are sent to 0 (the set of all x with Mx = 0 is called the "nullspace" of M)
• This is because det(M) = 0, so M is not invertible. (If det(M) isn't 0, the only solution of Mx = 0 is x = 0)
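A NumPy sketch of a nullspace vector, reusing the rank-1 matrix from before (the nullspace vector is computed, not from the slides):

```python
import numpy as np

# An underdetermined (singular) system: det(M) = 0, so some
# non-zero vectors are sent to 0 (they form the nullspace of M).
M = np.array([[1.0, -2.0],
              [2.0, -4.0]])

x = np.array([2.0, 1.0])   # in the nullspace: 1*2 - 2*1 = 0
print(M @ x)               # [0. 0.]
print(np.linalg.det(M))    # 0 (up to round-off), so M is not invertible
```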
Part 3: Eigenvectors & eigenvalues
What do matrices do to vectors?

Recall the earlier example: M, with columns (3,1) and (0,2), maps the vector (1,2) to (3,5).

• The new vector is: 1) rotated, 2) scaled
Are there any special vectors that only get scaled?
Try (1,1): M(1,1) = (3,3) = 3 · (1,1)
• For this special vector, multiplying by M is like multiplying by a scalar: M(1,1) = (3,3) = 3 · (1,1)
• (1,1) is called an eigenvector of M
• 3 (the scaling factor) is called the eigenvalue associated with this eigenvector
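A NumPy check of the eigenvector property. The M below is an assumption: it is reconstructed to be consistent with the deck's eigenpairs ((1,1) with eigenvalue 3, and (-1.5,1) with eigenvalue -2), since the matrix itself was in a figure:

```python
import numpy as np

# Assumed matrix consistent with the slides' eigenpairs.
M = np.array([[0.0, 3.0],
              [2.0, 1.0]])

v = np.array([1.0, 1.0])
print(M @ v)                      # [3. 3.] = 3 * (1, 1)
print(np.allclose(M @ v, 3 * v))  # True: v is an eigenvector, eigenvalue 3
```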
Are there any other eigenvectors?
• Yes! The easiest way to find them is with MATLAB's eig command.
• Exercise: verify that (-1.5, 1) is also an eigenvector of M, with eigenvalue λ2 = -2
• Note: eigenvectors are only defined up to a scale factor.
  – Conventions are to either make them unit vectors, or to make one of the elements 1
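The slides use MATLAB's eig; here is the NumPy equivalent, with the same assumed reconstructed M as above:

```python
import numpy as np

# np.linalg.eig plays the role of MATLAB's eig(M).
# M is an assumed matrix reconstructed from the slides' eigenpairs.
M = np.array([[0.0, 3.0],
              [2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(M)
print(np.sort(eigvals))  # [-2.  3.]

# Verify the exercise: (-1.5, 1) is an eigenvector with eigenvalue -2.
v = np.array([-1.5, 1.0])
print(np.allclose(M @ v, -2 * v))  # True

# Note: eig returns unit-length eigenvectors, one convention for
# fixing the arbitrary scale factor.
```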
Step back: eigenvectors obey the equation M e = λ e
• Non-zero solutions of (M − λI) e = 0 require det(M − λI) = 0. This is called the characteristic equation for λ.
• In general, for an N x N matrix, there are N eigenvectors
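For a 2 x 2 matrix the characteristic equation is λ² − trace(M)·λ + det(M) = 0; a NumPy sketch using the same assumed M:

```python
import numpy as np

# Characteristic equation det(M - lambda*I) = 0 for an assumed 2x2 M:
# lambda^2 - trace(M)*lambda + det(M) = 0, here lambda^2 - lambda - 6 = 0.
M = np.array([[0.0, 3.0],
              [2.0, 1.0]])

coeffs = [1.0, -np.trace(M), np.linalg.det(M)]
lams = np.roots(coeffs)
print(np.sort(lams))  # the eigenvalues: -2 and 3
```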
BREAK
Part 4: Examples (on blackboard)
• Principal Components Analysis (PCA)
• Single, linear differential equation
• Coupled differential equations
Part 5: Recap & Additional useful stuff
• Matrix diagonalization recap: transforming between original & eigenvector coordinates
• More special matrices & matrix properties
• Singular Value Decomposition (SVD)
Coupled differential equations
• Calculate the eigenvectors and eigenvalues.
  – Eigenvalues have the typical form λ (which may be complex)
• The corresponding eigenvector component has dynamics proportional to e^(λt)
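A NumPy sketch of this decoupling for dx/dt = M x (the matrix, initial condition, and time are illustrative assumptions; the result is checked against a Taylor-series evaluation of the exact solution):

```python
import numpy as np

# Coupled linear ODEs dx/dt = M x decouple in eigencoordinates: each
# eigenvector component evolves as c_i(t) = c_i(0) * exp(lambda_i * t).
M = np.array([[0.0, 3.0],
              [2.0, 1.0]])   # assumed example matrix
x0 = np.array([1.0, 0.0])
t = 0.5

eigvals, E = np.linalg.eig(M)
c0 = np.linalg.solve(E, x0)            # decompose x0 into eigencomponents
x_t = E @ (np.exp(eigvals * t) * c0)   # scale each component, transform back

# Check against the Taylor series of the exact solution x(t) = e^(Mt) x0:
x_exact = x0.copy()
term = x0.copy()
for k in range(1, 30):
    term = (M @ term) * (t / k)
    x_exact = x_exact + term
print(np.allclose(x_t, x_exact))  # True
```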
Practical program for approaching equations coupled through a term Mx

• Step 1: Find the eigenvalues and eigenvectors of M (eig(M) in MATLAB).
• Step 2: Decompose x into its eigenvector components.
• Step 3: Stretch/scale each eigenvector component by its eigenvalue.
• Step 4: (Solve for c and) transform back to original coordinates.
Putting it all together…

Where (step 1): the columns of E are the eigenvectors of M, and Λ is the diagonal matrix of the eigenvalues.
MATLAB: [E, L] = eig(M)
Putting it all together…

• Step 2: Transform into eigencoordinates: c = E⁻¹ x
• Step 3: Scale by λi along the ith eigencoordinate
• Step 4: Transform back to the original coordinate system
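These steps amount to the diagonalization M = E Λ E⁻¹, which can be sketched in NumPy (same assumed example matrix; the test vector is illustrative):

```python
import numpy as np

# Diagonalization: M = E @ Lambda @ inv(E), where the columns of E are
# eigenvectors and Lambda is diagonal with the eigenvalues.
M = np.array([[0.0, 3.0],
              [2.0, 1.0]])   # assumed example matrix

eigvals, E = np.linalg.eig(M)
Lam = np.diag(eigvals)
print(np.allclose(E @ Lam @ np.linalg.inv(E), M))  # True

# Applying M = transform to eigencoordinates, scale, transform back:
x = np.array([4.0, -1.0])
c = np.linalg.solve(E, x)     # Step 2: eigencoordinates
y = E @ (eigvals * c)         # Steps 3-4: scale, then transform back
print(np.allclose(y, M @ x))  # True
```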
Left eigenvectors
• The rows of E⁻¹ are called the left eigenvectors, because they satisfy E⁻¹M = ΛE⁻¹.
• Together with the eigenvalues, they determine how x is decomposed into each of its eigenvector components.
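A NumPy check of the left-eigenvector relation (same assumed example matrix; x is illustrative):

```python
import numpy as np

# Left eigenvectors: the rows of inv(E) satisfy inv(E) @ M = Lambda @ inv(E).
M = np.array([[0.0, 3.0],
              [2.0, 1.0]])   # assumed example matrix

eigvals, E = np.linalg.eig(M)
E_inv = np.linalg.inv(E)
print(np.allclose(E_inv @ M, np.diag(eigvals) @ E_inv))  # True

# Row i of inv(E) picks out the ith eigencomponent of any x:
x = np.array([2.0, 5.0])
print(np.allclose(E_inv @ x, np.linalg.solve(E, x)))  # True
```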
Putting it all together…

• Original matrix: M
• Matrix in the eigencoordinate system: Λ = E⁻¹ M E
Trace and Determinant

• Note: M (the original matrix) and Λ (the matrix in the eigencoordinate system) look very different. Q: Are there any properties that are preserved between them? A: Yes, 2 very important ones:
1. The trace: trace(M) = trace(Λ) = the sum of the eigenvalues
2. The determinant: det(M) = det(Λ) = the product of the eigenvalues
Special matrices: normal matrix

• Normal matrix: all eigenvectors are orthogonal
• Can transform to eigencoordinates ("change basis") with a simple rotation of the coordinate axes
• E is a rotation (unitary or orthogonal) matrix, defined by: EᵀE = E Eᵀ = I, so E⁻¹ = Eᵀ
Special matrices: normal matrix

• Eigenvector decomposition in this case: M = E Λ Eᵀ
• Left and right eigenvectors are identical!
Special Matrices

• Symmetric matrix: M = Mᵀ
• e.g. covariance matrices, Hopfield network
• Properties:
  – Eigenvalues are real
  – Eigenvectors are orthogonal (i.e. it's a normal matrix)
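A NumPy check of these properties on an illustrative symmetric matrix:

```python
import numpy as np

# Symmetric matrices (M = M.T) have real eigenvalues and orthogonal
# eigenvectors, so E is a rotation with inv(E) = E.T.
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # illustrative symmetric matrix

eigvals, E = np.linalg.eigh(M)   # eigh exploits symmetry
print(eigvals)                                     # [1. 3.] (real)
print(np.allclose(E.T @ E, np.eye(2)))             # True: E is orthogonal
print(np.allclose(E @ np.diag(eigvals) @ E.T, M))  # True: M = E Lambda E^T
```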
SVD: Decomposes matrix into outer products (e.g. of a neural/spatial mode and a temporal mode)
[Data matrix: rows are neurons n = 1 … N, columns are time points t = 1 … T]
• SVD: M = U S Vᵀ, where S is diagonal and its entries are the singular values
• Rows of Vᵀ are eigenvectors of MᵀM
• Columns of U are eigenvectors of MMᵀ
• Note: the (nonzero) eigenvalues are the same for MᵀM and MMᵀ
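A NumPy sketch of these SVD facts on a random "neurons x time" matrix (shapes and data are illustrative):

```python
import numpy as np

# SVD: M = U @ diag(s) @ Vt. Columns of U are eigenvectors of M @ M.T,
# rows of Vt are eigenvectors of M.T @ M, and the nonzero eigenvalues of
# both are the squared singular values.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 6))   # e.g. 4 neurons x 6 time points

U, s, Vt = np.linalg.svd(M, full_matrices=False)
print(np.allclose(U @ np.diag(s) @ Vt, M))  # True

# Eigenvalues of M M^T match the squared singular values:
eig_MMt = np.sort(np.linalg.eigvalsh(M @ M.T))[::-1]
print(np.allclose(eig_MMt, s**2))  # True

# M as a sum of outer products of spatial modes (columns of U)
# with temporal modes (rows of Vt):
M_sum = sum(s[k] * np.outer(U[:, k], Vt[k, :]) for k in range(len(s)))
print(np.allclose(M_sum, M))  # True
```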
• Thus, SVD pairs “spatial” patterns with associated “temporal” profiles through the outer product
The End