
EE 650 - Basics of Modern Control Systems

Vector Spaces

Definition, Linear independence, Linear combination, Span, Basis, Dimension, Subspace, Linear transformations, Invariance, Change of basis, Similarity transformation, Examples.

1 Definition

A vector space is a set V of elements called vectors, defined over a field F of scalars, which satisfies the following axioms:

• ∀x, y ∈ V, x + y ∈ V. (CLOSURE)

• ∀x, y ∈ V, x + y = y + x. (COMMUTATIVE)

• ∀x, y, z ∈ V, x + (y + z) = (x + y) + z. (ASSOCIATIVE)

• ∃ in V a unique vector 0 such that x + 0 = x, ∀x ∈ V. (ADDITIVE IDENTITY)

• To every vector x ∈ V there corresponds a unique vector −x such that x + (−x) = 0. (ADDITIVE INVERSE)

• ∀α ∈ F and x ∈ V, αx ∈ V. (CLOSURE UNDER SCALAR MULTIPLICATION)

• ∀α, β ∈ F and x ∈ V, α(βx) = (αβ)x.

• 1x = x for every vector x ∈ V.

• α(x + y) = αx + αy ∀α ∈ F and x, y ∈ V. (DISTRIBUTIVE w.r.t. vectors)

• (α + β)x = αx + βx ∀α, β ∈ F and x ∈ V. (DISTRIBUTIVE w.r.t. scalars)

1.1 Examples

Q - the set of rationals (over the field Q), R - the real line, Rⁿ, P - the set of polynomials, and many more. (Z, the set of integers, is not a vector space over Q or R, since it is not closed under scalar multiplication.) Is the set of 3 × 3 matrices a vector space? Is the set of 3 × 3 invertible matrices a vector space?
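Both questions can be checked numerically. The following minimal NumPy sketch (the matrices P and Q are arbitrary illustrative choices) shows that 3 × 3 matrices are closed under addition, while the sum of two invertible 3 × 3 matrices can be singular, so the invertible matrices fail the closure axiom (they also lack the zero matrix).

    import numpy as np

    # Closure of 3x3 matrices under addition: the sum is again a 3x3 matrix.
    A = np.arange(9.0).reshape(3, 3)
    B = np.eye(3)
    print((A + B).shape)          # (3, 3) -- still an element of the set

    # Two invertible matrices whose sum is singular: closure fails for the
    # set of invertible matrices, so it cannot be a vector space.
    P = np.eye(3)
    Q = -np.eye(3)                # both P and Q are invertible
    S = P + Q                     # the zero matrix
    print(np.linalg.matrix_rank(P), np.linalg.matrix_rank(Q), np.linalg.matrix_rank(S))
    # 3 3 0 -- the sum has rank 0, hence is not invertible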

2 Linear Independence

A finite set {v_i} of vectors is linearly dependent if there exists a corresponding set {α_i} of scalars, not all zero, such that

∑_i α_i v_i = 0.

If, for nonzero v_i ∈ V,

∑_i α_i v_i = 0 ⇒ α_i = 0 ∀i = 1, 2, . . .,

then the set {v_i} of vectors is linearly independent.
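A practical test for vectors in Rⁿ: stack them as columns of a matrix and compare the rank with the number of vectors. A minimal NumPy sketch (the helper name is_linearly_independent is just for illustration):

    import numpy as np

    def is_linearly_independent(vectors):
        """Return True if the given vectors (all of equal length) are linearly independent."""
        M = np.column_stack(vectors)              # vectors become the columns of M
        return np.linalg.matrix_rank(M) == len(vectors)

    print(is_linearly_independent([(1, 0, 0), (0, 1, 0), (0, 0, 1)]))   # True
    print(is_linearly_independent([(1, 0, 0), (1, 2, 1), (2, 2, 1)]))   # False: third = first + second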


3 Linear combinations

Every vector in V can be expressed as a linear combination of other vectors in V:

v = ∑_i α_i w_i,

where v, w_i ∈ V and α_i ∈ F for all i = 1, 2, . . .. A set {x_i} of vectors in V spans V if every vector in V can be expressed as a linear combination of the vectors x_i.

A basis in a vector space V is a set X of linearly independent vectors such that every vector in V is a linear combination of elements of X. In other words, a basis in V is a set of linearly independent vectors that spans the entire vector space V. If a basis of V has a finite number of elements, then the vector space V is finite-dimensional.

3.1 Examples

{(1, 0, 0), (1, 2, 1), (1, 3, 4), (0, 1, 0)} spans R³. Note that the vectors in the set are not linearly independent. {(1, 0, 0), (0, 1, 0), (0, 0, 1)} are linearly independent and span R³, so the set is a basis of R³. Similarly, {(1, 0, 0), (0, 1, 0), (1, 2, 1)} is a basis of R³. The basis of a vector space is not unique. The dimension of R³ is three. The set of polynomials P is infinite-dimensional.
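These statements are easy to confirm numerically: a set of vectors spans R³ exactly when the matrix built from them has rank 3, and it is linearly independent when the rank equals the number of vectors. A short NumPy check of the sets above (illustrative only):

    import numpy as np

    spanning_set = [(1, 0, 0), (1, 2, 1), (1, 3, 4), (0, 1, 0)]   # spans R^3, not independent
    basis_1      = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
    basis_2      = [(1, 0, 0), (0, 1, 0), (1, 2, 1)]

    for name, vecs in [("spanning_set", spanning_set), ("basis_1", basis_1), ("basis_2", basis_2)]:
        M = np.column_stack(vecs)
        rank = np.linalg.matrix_rank(M)
        spans = rank == 3                       # rank 3  <=>  the vectors span R^3
        independent = rank == len(vecs)         # rank = number of vectors  <=>  linear independence
        print(f"{name}: spans R^3 = {spans}, linearly independent = {independent}")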

The definition of a vector space is complete only with its underlying field (from which the scalars are taken). To illustrate this, consider the complex vector space C² (scalars in C) and the real vector space C² (scalars in R). A vector in C² is (α + iβ, γ + iδ). A basis for the former vector space is {(1, 0), (0, 1)} and for the latter is {(1, 0), (i, 0), (0, 1), (0, i)}. While the former vector space is of dimension two, the latter has a dimension of four.

4 Subspace

A non-empty subset M of a vector space V is a subspace if, along with every pair x and y of vectors contained in M, every linear combination αx + βy is also contained in M. Since M is a subset of V, it inherits all the properties of V. The 0 of V is also contained in M. The subspace M is itself a vector space.

Remark: The two trivial subspaces of a vector space V are

(i) the origin (or the 0) only.

(ii) the whole vector space V.

Let the dimension of a vector space V be n and let M be a subspace of V. Then the dimension of M is m ≤ n.

4.1 Examples

(i) The real line R is a subspace of the vector space R². Similarly, R² is a subspace of the vector space R³.

(ii) The polynomial set P_n (which contains polynomials of order < n) is a subspace of P.

(iii) The set of 3 × 3 skew-symmetric matrices S is a subspace of the vector space containing all 3 × 3 matrices.
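Example (iii) can be verified numerically: a linear combination of skew-symmetric matrices is again skew-symmetric, and the zero matrix belongs to S. A minimal NumPy sketch (S1, S2 and the scalars are arbitrary illustrative choices):

    import numpy as np

    def is_skew_symmetric(M):
        """A matrix M is skew-symmetric when M^T = -M."""
        return np.allclose(M.T, -M)

    S1 = np.array([[0.0, 1.0, -2.0], [-1.0, 0.0, 3.0], [2.0, -3.0, 0.0]])
    S2 = np.array([[0.0, -4.0, 5.0], [4.0, 0.0, -6.0], [-5.0, 6.0, 0.0]])

    alpha, beta = 2.5, -1.0
    combo = alpha * S1 + beta * S2              # an arbitrary linear combination
    print(is_skew_symmetric(combo))             # True: S is closed under linear combinations
    print(is_skew_symmetric(np.zeros((3, 3))))  # True: the zero matrix is in S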

5 Linear Transformations

A linear transformation (or operator) A on a vector space V is a correspondence that assigns to every vector x in V a vector Ax in V, in such a way that

A(αx + βy) = αAx + βAy

identically in the vectors x and y and the scalars α and β.

5.1 Examples

• For any x ∈ P_n, x(t) = ∑_{j=0}^{n−1} ξ_j t^j, we define (Dx)(t) = ∑_{j=0}^{n−1} j ξ_j t^{j−1}. Here D is a linear transformation which performs the derivative action.

• For any x ∈ P_n, x(t) = ∑_{j=0}^{n−1} ξ_j t^j, we define (Sx)(t) = ∑_{j=0}^{n−1} (ξ_j / (j + 1)) t^{j+1}. Here S is a linear transformation which performs the integration.
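Representing x ∈ P_n by its coefficient vector (ξ_0, . . . , ξ_{n−1}), both D and S act as simple maps on coefficients. A short Python sketch applying the two formulas directly (the helper names are illustrative):

    def differentiate(xi):
        """Coefficients of Dx for x(t) = sum_j xi[j] * t**j (degree drops by one)."""
        return [j * xi[j] for j in range(1, len(xi))]

    def integrate(xi):
        """Coefficients of Sx: constant term 0, then xi[j]/(j+1) as coefficient of t**(j+1)."""
        return [0.0] + [xi[j] / (j + 1) for j in range(len(xi))]

    xi = [5.0, 3.0, 2.0]                 # x(t) = 5 + 3t + 2t^2
    print(differentiate(xi))             # [3.0, 4.0]            -> (Dx)(t) = 3 + 4t
    print(integrate(xi))                 # [0.0, 5.0, 1.5, 0.666...] -> (Sx)(t) = 5t + 1.5t^2 + (2/3)t^3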

6 Transformations as vectors

The set of all linear transformations on a vector space is itself a vector space.

Let A, B and C be three linear transformations on the vector space V. It is easy to see that

• (A + B)x = (B + A)x ∈ V.

• (A + (B + C))x = ((A + B) + C)x.

• (A + 0)x = Ax = (0 + A)x.

• If we define the transformation (−A) such that (−A)x = −(Ax), then A + (−A) = 0.

• For any scalar α and any transformation A, we define the product αA by (αA)x = α(Ax). Then (αA)x ∈ V and (α(βA))x = ((αβ)A)x.

• (1A)x = Ax.

• (α(A + B))x = (αA + αB)x.

• ((α + β)A)x = (αA + βA)x.

A few more properties of linear transformations:

• A0 = 0A = 0

• A1 = 1A = A

• A(B + C) = AB + AC

• (A + B)C = AC + BC

• A(BC) = (AB)C

7 Inverses of Transformations

The transformation A is invertible if and only if

• If x_1 ≠ x_2, then Ax_1 ≠ Ax_2.

• To every vector y there corresponds (at least) one vector x such that Ax = y.


The inverse of A is denoted by A⁻¹. For any invertible A we have

AA⁻¹ = A⁻¹A = 1

A linear transformation A on a finite-dimensional vector space V is invertible if and only if Ax = 0 implies that x = 0, or, alternatively, if and only if every y in V can be written in the form y = Ax.

If A and B are invertible, then AB is invertible and (AB)⁻¹ = B⁻¹A⁻¹. If A is invertible and α ≠ 0, then αA is invertible and (αA)⁻¹ = (1/α)A⁻¹. If A is invertible, then A⁻¹ is invertible and (A⁻¹)⁻¹ = A.
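These identities are easy to spot-check on example matrices. A minimal NumPy sketch (the matrices A and B are arbitrary, shifted to make them safely invertible):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4)) + 4 * np.eye(4)   # diagonally shifted, hence invertible
    B = rng.standard_normal((4, 4)) + 4 * np.eye(4)
    alpha = 2.5

    # (AB)^(-1) = B^(-1) A^(-1)
    print(np.allclose(np.linalg.inv(A @ B), np.linalg.inv(B) @ np.linalg.inv(A)))   # True

    # (alpha A)^(-1) = (1/alpha) A^(-1)
    print(np.allclose(np.linalg.inv(alpha * A), np.linalg.inv(A) / alpha))          # True

    # (A^(-1))^(-1) = A
    print(np.allclose(np.linalg.inv(np.linalg.inv(A)), A))                          # True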

8 Matrix association with a Linear transformation

Let V be an n-dimensional vector space, let X = {x_1, · · · , x_n} be any basis of V, and let A be a linear transformation on V. Since every vector is a linear combination of the x_i, we have in particular

Ax_j = ∑_i α_ij x_i

for j = 1, · · · , n. The matrix associated with the linear transformation A is denoted by [A] and is

[A] =
⎡ α_11  α_12  · · ·  α_1n ⎤
⎢ α_21  α_22  · · ·  α_2n ⎥
⎢  ⋮     ⋮            ⋮   ⎥
⎣ α_n1  α_n2  · · ·  α_nn ⎦

Note that if the basis changes, the matrix [A] associated with the linear transformation A also changes. Let us see an example. Consider the differential transformation D on the space P_n, and the basis {x_1, · · · , x_n} defined by x_i = t^{i−1}, i = 1, · · · , n. We have to find the matrix [D] associated with D. We have

Dx_1 = 0x_1 + 0x_2 + · · · + 0x_{n−1} + 0x_n
Dx_2 = 1x_1 + 0x_2 + · · · + 0x_{n−1} + 0x_n
Dx_3 = 0x_1 + 2x_2 + · · · + 0x_{n−1} + 0x_n
  ⋮
Dx_n = 0x_1 + 0x_2 + · · · + (n − 1)x_{n−1} + 0x_n

So

[D] =
⎡ 0  1  0  · · ·  0   0   ⎤
⎢ 0  0  2  · · ·  0   0   ⎥
⎢ ⋮  ⋮  ⋮         ⋮   ⋮   ⎥
⎢ 0  0  0  · · ·  0  n−1  ⎥
⎣ 0  0  0  · · ·  0   0   ⎦
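The matrix [D] can be built directly from the rule Dx_j = (j − 1)x_{j−1}, and multiplying it with a coefficient vector reproduces the differentiation of Section 5.1. A NumPy sketch for n = 4, with the basis ordered as x_1 = 1, x_2 = t, x_3 = t², x_4 = t³ (the polynomial used is an arbitrary example):

    import numpy as np

    def D_matrix(n):
        """Matrix of the differentiation operator on P_n in the basis {1, t, ..., t**(n-1)}."""
        D = np.zeros((n, n))
        for i in range(n - 1):
            D[i, i + 1] = i + 1      # D t**k = k * t**(k-1)
        return D

    n = 4
    xi = np.array([5.0, 3.0, 2.0, 7.0])     # x(t) = 5 + 3t + 2t^2 + 7t^3
    print(D_matrix(n))
    print(D_matrix(n) @ xi)                 # [ 3.  4. 21.  0.]  -> (Dx)(t) = 3 + 4t + 21t^2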

In a fixed coordinate system X = {x_1, · · · , x_n} and with the knowledge of the matrices [A] = (a_ij) and [B] = (b_ij) of A and B, we have the following:

• If C = αA + βB, then [C] = (γ_ij) where γ_ij = α a_ij + β b_ij.

• If C = AB, then [C] = (γ_ij) where γ_ij = ∑_k a_ik b_kj.

• If [0] = (o_ij) and [1] = (e_ij), then o_ij = 0 and e_ij = δ_ij (Kronecker delta).


Let x ∈ V and x = ∑_j ξ_j x_j. Let y = Ax. Since y ∈ V, y = ∑_i η_i x_i. We have to find the relation between η_i and ξ_j when the matrix [A] is known.

Ax = ∑_j ξ_j Ax_j = ∑_j (ξ_j (∑_i α_ij x_i)) = ∑_i (∑_j α_ij ξ_j) x_i

But we have y = Ax = ∑_i η_i x_i. So we have η_i = ∑_j α_ij ξ_j.
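In matrix form this reads η = [A]ξ: the coordinate vector of y = Ax is the matrix [A] times the coordinate vector of x. A quick NumPy illustration, reusing the matrix [D] from above as [A] (the coordinates ξ are an arbitrary example):

    import numpy as np

    # Matrix of A = D (differentiation on P_4) in the basis {1, t, t^2, t^3}.
    A = np.array([[0.0, 1.0, 0.0, 0.0],
                  [0.0, 0.0, 2.0, 0.0],
                  [0.0, 0.0, 0.0, 3.0],
                  [0.0, 0.0, 0.0, 0.0]])

    xi = np.array([1.0, -2.0, 0.5, 4.0])   # coordinates of x in the basis {1, t, t^2, t^3}
    eta = A @ xi                           # eta_i = sum_j alpha_ij * xi_j
    print(eta)                             # [-2.  1. 12.  0.]: coordinates of y = Ax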

9 Invariance

Let V be a vector space of which M is a subspace. M is invariant under the linear transformation A if ∀x ∈ M, Ax is also in M. Let the basis X = {x_1, · · · , x_n} of V be such that x_1, · · · , x_m are in M and x_{m+1}, · · · , x_n are not. For m + 1 ≤ j ≤ n, Ax_j = ∑_i α_ij x_i. For 1 ≤ j ≤ m, Ax_j ∈ M as M is invariant under A, so Ax_j = ∑_{i=1}^{m} α_ij x_i. Hence, α_ij = 0 for i = m + 1, · · · , n and j = 1, · · · , m, and the matrix [A] of A in this coordinate system will have the form

[A] =
⎡ [A_1]  [B_0] ⎤
⎣  [0]   [A_2] ⎦

where [A_1] is the matrix of A considered as a linear transformation on the space M with respect to the coordinate system {x_1, · · · , x_m}, [A_2] and [B_0] are some arrays of scalars, and [0] denotes the array consisting of zeros only. What are the sizes of these matrices?
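A numerical illustration of the block structure (the blocks below are arbitrary example values with m = 2 and n = 4): any vector whose last n − m coordinates vanish, i.e., a vector of M, is mapped to a vector whose last n − m coordinates again vanish.

    import numpy as np

    m, n = 2, 4
    A1 = np.array([[1.0, 2.0], [0.0, 3.0]])          # m x m block: A restricted to M
    B0 = np.array([[4.0, 5.0], [6.0, 7.0]])          # m x (n - m) block
    A2 = np.array([[8.0, 9.0], [1.0, 2.0]])          # (n - m) x (n - m) block

    # Block upper-triangular matrix of A in the basis {x_1, ..., x_n}.
    A = np.block([[A1, B0], [np.zeros((n - m, m)), A2]])

    x = np.array([1.0, -2.0, 0.0, 0.0])   # coordinates of a vector in M (last n - m entries zero)
    y = A @ x
    print(y)                              # [-3. -6.  0.  0.]: last n - m coordinates stay zero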

10 Change of basis

Let V be an n-dimensional vector space and let X = {x_1, · · · , x_n} and Y = {y_1, · · · , y_n} be two bases of V. If x is in V, x = ∑_i ξ_i x_i = ∑_i η_i y_i, what is the relation between its coordinates (ξ_1, · · · , ξ_n) with respect to X and its coordinates (η_1, · · · , η_n) with respect to Y?

Consider a linear transformation A defined by Ax_i = y_i, i = 1, · · · , n. So

A(∑_i ξ_i x_i) = ∑_i ξ_i y_i.

Let (α_ij) be the matrix of A in the basis X, that is, y_j = Ax_j = ∑_i α_ij x_i. Note that for every y in V there exists a vector x in V such that Ax = y. Since ∑_i ξ_i y_i = 0 ⇒ ξ_i = 0, i = 1, · · · , n, we have Ax = 0 ⇒ x = 0. Hence, A is invertible.

x = ∑_j η_j y_j = ∑_j η_j Ax_j = ∑_j η_j ∑_i α_ij x_i = ∑_i (∑_j α_ij η_j) x_i

Comparing this with x = ∑_i ξ_i x_i gives ξ_i = ∑_j α_ij η_j, i = 1, · · · , n. Hence, ξ = [A]η, where ξ = [ξ_1, · · · , ξ_n]ᵀ and η = [η_1, · · · , η_n]ᵀ.
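Concretely, column j of [A] holds the X-coordinates of the new basis vector y_j, and ξ = [A]η converts Y-coordinates into X-coordinates. A NumPy sketch with a specific pair of bases of R³ (X is taken to be the standard basis; the coordinates η are an arbitrary example):

    import numpy as np

    # Basis X = standard basis of R^3; basis Y given by its coordinates in X.
    y1, y2, y3 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]), np.array([1.0, 2.0, 1.0])
    A = np.column_stack([y1, y2, y3])     # column j holds the X-coordinates of y_j, i.e. (alpha_ij)

    eta = np.array([2.0, -1.0, 3.0])      # coordinates of some x with respect to Y
    xi = A @ eta                          # xi = [A] eta: coordinates of the same x with respect to X
    print(xi)                             # [5. 5. 3.]

    # Going the other way requires inverting [A]:
    print(np.linalg.inv(A) @ xi)          # recovers eta: [ 2. -1.  3.]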

11 Similarity

First we will try to establish a few relations.

Consider a linear transformation B on the n-dimensional vector space V. Let X and Y be the two bases of V. Let the matrices associated with B with respect to X and Y be (β_ij) and (γ_ij), respectively. How are these matrices related to each other?

We have

Bx_j = ∑_i β_ij x_i

and

By_j = ∑_i γ_ij y_i

Let A be a linear transformation on V defined by Ax_i = y_i, i = 1, · · · , n. Now we have

By_j = BAx_j = B(∑_k α_kj x_k)
     = ∑_k α_kj Bx_k = ∑_k α_kj ∑_i β_ik x_i
     = ∑_i (∑_k β_ik α_kj) x_i.

Also

By_j = ∑_k γ_kj y_k = ∑_k γ_kj Ax_k
     = ∑_k γ_kj ∑_i α_ik x_i = ∑_i (∑_k α_ik γ_kj) x_i.

Comparing the equations we get

∑_k α_ik γ_kj = ∑_k β_ik α_kj.

Hence, we have

(α_ij)(γ_ij) = (β_ij)(α_ij)
⇒ (γ_ij) = (α_ij)⁻¹(β_ij)(α_ij)

Two matrices (β_ij) and (γ_ij) are said to be similar if there exists an invertible matrix (α_ij) such that

(γ_ij) = (α_ij)⁻¹(β_ij)(α_ij)
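The relation can be checked numerically: build (γ_ij) = (α_ij)⁻¹(β_ij)(α_ij) and confirm that it reproduces the action of B expressed in the basis Y. A minimal NumPy sketch (the matrices (β_ij) and (α_ij) are arbitrary examples; (α_ij) has the y_j as its columns in X-coordinates, with X the standard basis):

    import numpy as np

    # Matrix of B in the basis X (the standard basis of R^3).
    beta = np.array([[1.0, 2.0, 0.0],
                     [0.0, 1.0, 1.0],
                     [3.0, 0.0, 2.0]])

    # Change-of-basis transformation A: column j holds the X-coordinates of y_j.
    alpha = np.array([[1.0, 0.0, 1.0],
                      [0.0, 1.0, 2.0],
                      [0.0, 0.0, 1.0]])

    # Matrix of the same transformation B with respect to the basis Y.
    gamma = np.linalg.inv(alpha) @ beta @ alpha

    # Check the defining property: B y_j = sum_i gamma_ij * y_i for every j.
    for j in range(3):
        y_j = alpha[:, j]                  # y_j expressed in X-coordinates
        lhs = beta @ y_j                   # B applied to y_j (in X-coordinates)
        rhs = alpha @ gamma[:, j]          # sum_i gamma_ij * y_i (in X-coordinates)
        print(np.allclose(lhs, rhs))       # True for each j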

Next we look into another case.

Let B and C be two linear transformations on the vector space V. Let X and Y be two bases of V. Let the matrix of B with respect to X be (β_ij), and let the matrix of C with respect to Y also be (β_ij). How are the two linear transformations related?

Note that there are two linear transformations, and the matrices of both transformations with respect to the two different bases are the same. So we have Bx_j = ∑_i β_ij x_i and Cy_j = ∑_i β_ij y_i. Consider an invertible transformation A on V defined by Ax_i = y_i, i = 1, · · · , n. We have

Cy_j = CAx_j


and

Cy_j = ∑_i β_ij y_i = ∑_i β_ij Ax_i = A(∑_i β_ij x_i) = ABx_j

So, we have

CA = AB
⇒ C = ABA⁻¹

Two linear transformations B and C are said to be similar if there exists an invertible transformation A such that C = ABA⁻¹.
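This case can also be checked numerically: working in X-coordinates, let B have matrix (β_ij), define C through Cy_j = ∑_i β_ij y_i, and confirm that the matrix of C equals that of ABA⁻¹ (the matrices below are arbitrary examples; X is taken as the standard basis):

    import numpy as np

    # Matrix (beta_ij): the matrix of B in basis X and also the matrix of C in basis Y.
    beta = np.array([[1.0, 2.0, 0.0],
                     [0.0, 1.0, 1.0],
                     [3.0, 0.0, 2.0]])

    # A maps x_i to y_i; in X-coordinates its matrix has the y_j as columns.
    alpha = np.array([[1.0, 0.0, 1.0],
                      [0.0, 1.0, 2.0],
                      [0.0, 0.0, 1.0]])

    # C is defined by C y_j = sum_i beta_ij y_i.  In X-coordinates this reads
    # [C] @ alpha = alpha @ beta, which we solve for [C] without forming A B A^(-1).
    C_X = np.linalg.solve(alpha.T, (alpha @ beta).T).T

    # Claim: C = A B A^(-1).
    print(np.allclose(C_X, alpha @ beta @ np.linalg.inv(alpha)))   # True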

[Diagram: A maps x to u and y to v, while B maps x to y and C maps u to v, so that CA = AB, i.e., C = ABA⁻¹.]