
MAT247: Linear Algebra, Notes on Chapter 2

Kudla

Summary of basic formulas.

1. Coordinates. For a finite dimensional vector space V over a field F, let β = {v_1, ..., v_n} be a basis and write β = (v_1 ... v_n), a row vector with vector entries. Then every vector v ∈ V has a unique expression

(0.1)    v = \sum_i c_i v_i,    c_i ∈ F

(the coordinates of v), with coordinate vector

    [v]_β = \begin{pmatrix} c_1 \\ \vdots \\ c_n \end{pmatrix}.

Then we can write (0.1) as a matrix product:

(0.2)    v = (v_1 ... v_n) \begin{pmatrix} c_1 \\ \vdots \\ c_n \end{pmatrix} = β [v]_β.

Thus a choice of basis β for V gives an isomorphism of vector spaces

    V → F^n,    v ↦ [v]_β,

where the inverse transformation F^n → V is given by

    c = \begin{pmatrix} c_1 \\ \vdots \\ c_n \end{pmatrix} ↦ β c.

2. The matrix for a linear transformation. Let V and W be finite dimensional vector spaces with bases β and γ = {w_1, ..., w_m}, and write γ = (w_1 ... w_m). Suppose that T : V → W is a linear transformation. Write

(0.3)    T(v_j) = \sum_i a_{ij} w_i.

Then the matrix for T with respect to the given bases is [T]_β^γ = A = (a_{ij}). The relation (0.3) can be written as

    T(v_j) = (w_1 ... w_m) \begin{pmatrix} a_{1j} \\ \vdots \\ a_{mj} \end{pmatrix} = γ \begin{pmatrix} a_{1j} \\ \vdots \\ a_{mj} \end{pmatrix}.

Collecting these into a single row of vectors, we have

    (T(v_1) ... T(v_n)) = (w_1 ... w_m) \begin{pmatrix} a_{11} & \cdots & a_{1n} \\ \vdots & & \vdots \\ a_{m1} & \cdots & a_{mn} \end{pmatrix},

or simply

(0.4)    T(β) = γ A = γ [T]_β^γ.
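As a concrete sketch of (0.2) and (0.4), the short Python script below (my own illustration, not from the text) uses a hypothetical map T(x, y) = (x + 2y, 3x + 4y) on R^2, the basis β = {(1, 1), (1, -1)}, and the standard basis for γ; the columns of A = [T]_β^γ are then the γ-coordinate vectors of T(v_j).

```python
from fractions import Fraction as Fr

def T(x, y):
    # hypothetical linear map T(x, y) = (x + 2y, 3x + 4y)
    return (x + 2*y, 3*x + 4*y)

def coords(basis, v):
    """Coordinates of v in a basis {b1, b2} of R^2 (Cramer's rule)."""
    (a, c), (b, d) = basis                  # b1 = (a, c), b2 = (b, d)
    det = Fr(a*d - b*c)
    return ((v[0]*d - b*v[1]) / det, (a*v[1] - c*v[0]) / det)

beta  = [(1, 1), (1, -1)]
gamma = [(1, 0), (0, 1)]                    # standard basis

# (0.2): v = beta [v]_beta, e.g. v = (2, 4) has [v]_beta = (3, -1)
c1, c2 = coords(beta, (2, 4))
assert (c1*1 + c2*1, c1*1 + c2*(-1)) == (2, 4)

# (0.3)/(0.4): columns of A are the gamma-coordinates of T(v_j)
cols = [coords(gamma, T(*vj)) for vj in beta]
print([[int(x) for x in col] for col in cols])   # → [[3, 7], [-1, -1]]
```

So here A has columns (3, 7) and (-1, -1), i.e. A = [[3, -1], [7, -1]].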

Proposition 1. Suppose that T : V → W is a linear transformation and that, for given bases β and γ, relation (0.3), or equivalently (0.4), determines the matrix A = [T]_β^γ. Then

    [T(v)]_γ = A [v]_β.

Proof. Applying T to

    v = \sum_j c_j v_j = β [v]_β,

we can compute in two ways. The first, more traditional, calculation uses (0.3):

    T(v) = \sum_j c_j T(v_j) = \sum_j c_j \sum_i a_{ij} w_i = \sum_i \Big( \sum_j a_{ij} c_j \Big) w_i = (w_1 ... w_m)(A c).

The second calculation is slicker and uses (0.4):

    T(v) = T(β [v]_β) = T(β) [v]_β = (γ A) [v]_β = γ (A [v]_β).

Note that c = [v]_β, so of course the two calculations end at the same place. □

3. Composition. Now suppose that we have vector spaces U, V and W with bases α, β and γ, and linear transformations S : U → V and T : V → W. We consider their composition
    U \xrightarrow{S} V \xrightarrow{T} W.
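Proposition 1 can be sanity-checked numerically. The sketch below uses hypothetical data of my own (not from the text): T(x, y) = (x + 2y, 3x + 4y) on R^2, β = {(1, 1), (1, -1)}, and γ the standard basis, and compares [T(v)]_γ with A [v]_β.

```python
from fractions import Fraction as Fr

def T(x, y):
    # hypothetical linear map T(x, y) = (x + 2y, 3x + 4y)
    return (x + 2*y, 3*x + 4*y)

# A = [T]_beta^gamma for beta = {(1,1), (1,-1)} and gamma the standard
# basis: its columns are T(1,1) = (3,7) and T(1,-1) = (-1,-1).
A = [[3, -1],
     [7, -1]]

def coords_beta(v):
    # [v]_beta: solve c1*(1,1) + c2*(1,-1) = v
    return (Fr(v[0] + v[1], 2), Fr(v[0] - v[1], 2))

v = (2, 4)
c = coords_beta(v)                       # [v]_beta
lhs = T(*v)                              # [T(v)]_gamma (gamma is standard)
rhs = (A[0][0]*c[0] + A[0][1]*c[1],      # A [v]_beta
       A[1][0]*c[0] + A[1][1]*c[1])
print(lhs == rhs)   # → True
```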

Proposition 2. Suppose that A = [T]_β^γ and B = [S]_α^β. Then

    [T ∘ S]_α^γ = [T]_β^γ [S]_α^β = A B,

i.e., composition of linear transformations corresponds to multiplication of matrices.

Proof. I will just give the slick proof; you might want to write out the traditional proof for yourself. The matrices A and B are defined by the relations

    T(β) = γ A    and    S(α) = β B.

Then the analogous relation for the composition T ∘ S is given by

    (T ∘ S)(α) = T(S(α)) = T(β B) = T(β) B = (γ A) B = γ (A B).

This shows that [T ∘ S]_α^γ = A B, as claimed. □

4. Change of basis. Of course, all of this discussion also applies when V = W and T : V → V. For a basis β we write [T]_β instead of [T]_β^β. If β′ = {v′_1, ..., v′_n} is another basis for V, we want to describe the relation between the coordinate vectors [v]_β and [v]_{β′}, and between the matrices [T]_β and [T]_{β′}. The main relation (cf. p. 112 in the text) is

(0.5)    v′_j = \sum_i q_{ij} v_i.

This defines a change of coordinate matrix Q = (q_{ij}). In vector form, the equivalent relation is

(0.6)    (v′_1 ... v′_n) = (v_1 ... v_n) \begin{pmatrix} q_{11} & \cdots & q_{1n} \\ \vdots & & \vdots \\ q_{n1} & \cdots & q_{nn} \end{pmatrix},

that is,

(0.7)    β′ = β Q.

This equation is just a concise way of describing how the basis vectors of β′ are given as linear combinations of the basis β.

Proposition 3. If the bases β and β′ are related by β′ = β Q, then the coordinate vectors of any v ∈ V are related by

    [v]_β = Q [v]_{β′}.

Proof. This is almost immediate:

    β (Q [v]_{β′}) = (β Q) [v]_{β′} = β′ [v]_{β′} = v,

so that, indeed, Q [v]_{β′} is the coordinate vector of v with respect to β. □

Note that, by either (0.5) or (0.7), Q = [I_V]_{β′}^β. Next we want to relate [T]_β and [T]_{β′} for a linear transformation T : V → V.
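Before doing so, Proposition 3 can be checked numerically. In the sketch below (my own hypothetical data), β is the standard basis of R^2 and β′ = {(1, 1), (1, -1)}, so β′ = β Q forces Q to have the vectors of β′ as its columns.

```python
from fractions import Fraction as Fr

# beta = standard basis of R^2, beta' = {(1,1), (1,-1)}; then Q has the
# beta'-vectors as its columns, and [v]_beta is just v itself.
Q = [[1,  1],
     [1, -1]]

def coords_beta_prime(v):
    # [v]_{beta'}: solve c1*(1,1) + c2*(1,-1) = v
    return (Fr(v[0] + v[1], 2), Fr(v[0] - v[1], 2))

v = (5, 3)
c = coords_beta_prime(v)                 # [v]_{beta'}
Qc = (Q[0][0]*c[0] + Q[0][1]*c[1],       # Q [v]_{beta'}
      Q[1][0]*c[0] + Q[1][1]*c[1])
print(Qc == v)   # → True, since [v]_beta = Q [v]_{beta'}
```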

Proposition 4. If the bases β and β′ are related by β′ = β Q, then the matrices [T]_β and [T]_{β′} for T are related by

    [T]_β = Q [T]_{β′} Q^{-1},    or equivalently    [T]_{β′} = Q^{-1} [T]_β Q.

Proof. Again, this should be almost clear. The idea is that [T]_{β′} tells us how to compute T on β′-coordinate vectors, so to compute T on β-coordinate vectors we convert: convert to β′-coordinates, perform T, then convert back to β-coordinates. These steps amount to

    [v]_β ↦ Q^{-1} [v]_β ↦ [T]_{β′} Q^{-1} [v]_β ↦ Q [T]_{β′} Q^{-1} [v]_β,

which, step by step, is

    [v]_β ↦ [v]_{β′} ↦ [T(v)]_{β′} ↦ [T(v)]_β,

which proves the claim. □

Important Remark: Suppose that V = F^n (column vectors) and that β is a basis. We can view β = (v_1 ... v_n) ∈ M_n(F) as an n × n matrix whose columns are the basis vectors. Note that the condition det(β) ≠ 0 is equivalent to the linear independence of the columns. Let β′ be another basis, and again view β′ ∈ M_n(F), with det(β′) ≠ 0. Then the relation (0.7) between the two bases, β′ = β Q, is now an identity of matrices. So it is easy to find Q! We simply have Q = β^{-1} β′.

Example: Suppose we have (cf. p. 112)

    β = \left\{ \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \begin{pmatrix} 1 \\ -1 \end{pmatrix} \right\},    β′ = \left\{ \begin{pmatrix} 2 \\ 4 \end{pmatrix}, \begin{pmatrix} 3 \\ 1 \end{pmatrix} \right\}.

Our change of basis relation β′ = β Q is then

    \begin{pmatrix} 2 & 3 \\ 4 & 1 \end{pmatrix} = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} Q,

so that

    Q = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}^{-1} \begin{pmatrix} 2 & 3 \\ 4 & 1 \end{pmatrix} = \begin{pmatrix} 3 & 2 \\ -1 & 1 \end{pmatrix}.
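The recipe Q = β^{-1} β′ is easy to check by machine; a minimal 2 × 2 sketch with exact arithmetic:

```python
from fractions import Fraction as Fr

def inv2(m):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = m
    det = Fr(a*d - b*c)
    return [[ d/det, -b/det],
            [-c/det,  a/det]]

def mul2(m, n):
    """Product of two 2x2 matrices."""
    return [[sum(m[i][k]*n[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

beta  = [[1,  1], [1, -1]]   # columns (1, 1) and (1, -1)
betap = [[2,  3], [4,  1]]   # columns (2, 4) and (3, 1)

Q = mul2(inv2(beta), betap)  # Q = beta^{-1} beta'
print(Q)   # the matrix [[3, 2], [-1, 1]], printed as Fractions
```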

Example: (cf. problem 6, p. 117) Suppose that

    A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix},    β = \left\{ \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \begin{pmatrix} 1 \\ -1 \end{pmatrix} \right\}.

Find [L_A]_β and an invertible matrix Q such that [L_A]_β = Q^{-1} A Q.

Solution: The transformation L_A : R^2 → R^2 has matrix A with respect to the standard basis e. We want its matrix with respect to the basis β to be Q^{-1} A Q, so Q is supposed to convert β-coordinate vectors to e-coordinate vectors. Thus, by Proposition 3, [v]_e = Q [v]_β, so, according to (0.7), we need β = e Q. This amounts to

    Q = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}.

Then just compute [L_A]_β = Q^{-1} A Q.
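That last step can be carried out in code; the printed matrix is what this arithmetic gives, not a value quoted from the notes, so it is worth double-checking by hand.

```python
from fractions import Fraction as Fr

def inv2(m):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = m
    det = Fr(a*d - b*c)
    return [[d/det, -b/det], [-c/det, a/det]]

def mul2(m, n):
    """Product of two 2x2 matrices."""
    return [[sum(m[i][k]*n[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [3, 4]]
Q = [[1, 1], [1, -1]]        # columns of Q are the basis vectors of beta

LA_beta = mul2(inv2(Q), mul2(A, Q))   # [L_A]_beta = Q^{-1} A Q
print(LA_beta)   # → [[5, -1], [-2, 0]] (as Fractions)
```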
