
From SVD to PCA

Xin Li

April 6, 2009

Outline

SVD Review
  The statement of SVD
  A closer look at SVD

PCA from SVD
  Principal components
  PCA and SVD in Matlab
  Application to face recognition

Recall the SVD

Given a matrix X_{m×n} of rank n, we can find orthogonal matrices U_{m×n} and V_{n×n} and a diagonal matrix D_{n×n} such that

    X = U D V^T

Visualize it as:

    X_{m×n} = U_{m×n} D_{n×n} V^T_{n×n}

Now we derive the SVD when the rank of X is r < n.
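As a quick sanity check of this statement, here is a minimal Matlab sketch (the random X and the sizes are illustrative, not from the slides) that verifies the factorization and the orthonormality of the columns of U and V:

    % Minimal sketch: verify the economy-size SVD X = U*D*V'.
    m = 5; n = 3;
    X = rand(m,n);               % random m-by-n matrix; rank n with probability 1
    [U,D,V] = svd(X,'econ');     % U is m-by-n, D and V are n-by-n
    norm(X - U*D*V','fro')       % ~ machine precision
    norm(U'*U - eye(n),'fro')    % columns of U are orthonormal
    norm(V'*V - eye(n),'fro')    % V is orthogonal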

Deriving the SVD

Find the eigen decomposition of X^T X:

    X^T X = V Λ V^T

where Λ is the diagonal matrix with the eigenvalues λ_i of X^T X on its diagonal.

The matrix V is orthogonal, and its columns v_i are eigenvectors of X^T X:

    X^T X v_i = λ_i v_i,   i = 1 : n

It is also known that λ_i ≥ 0 (Homework: verify it!). If the rank of X is r (r ≤ n), there are exactly r positive eigenvalues, say λ_1, λ_2, ..., λ_r.

For i = 1 : r, define σ_i = √λ_i and u_i = X v_i / σ_i. Then

    X v_i = σ_i u_i,   X^T u_i = σ_i v_i
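The construction above can be carried out directly in Matlab. A minimal sketch, assuming for simplicity that X has full column rank (so all λ_i > 0):

    % Sketch: build the SVD from the eigen decomposition of X'*X,
    % following the derivation above (assumes rank(X) = n).
    m = 6; n = 4;
    X = rand(m,n);
    [V,L] = eig(X'*X);                 % X'*X = V*L*V'
    [lam,idx] = sort(diag(L),'descend');
    V = V(:,idx);                      % reorder eigenpairs, largest first
    sig = sqrt(lam);                   % sigma_i = sqrt(lambda_i)
    U = X*V*diag(1./sig);              % u_i = X*v_i/sigma_i
    norm(X - U*diag(sig)*V','fro')     % ~ machine precision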

Deriving the SVD, continued

Put in matrix notation:

    X [v_1 v_2 ... v_r] = [σ_1 u_1  σ_2 u_2  ...  σ_r u_r]
    X^T [u_1 u_2 ... u_r] = [σ_1 v_1  σ_2 v_2  ...  σ_r v_r]

Note that {v_1, v_2, ..., v_r} is an orthogonal set in R^n. We can extend it to an orthogonal basis.

Solve V_1^T v = 0, where V_1 := [v_1 v_2 ... v_r]. (What is the dimension of the solution space of this homogeneous system?)

Answer: Since V_1 has rank r (Why?), the dimension of the solution space is n − r.

So, there are exactly n − r orthogonal vectors v_{r+1}, v_{r+2}, ..., v_n such that v_1, v_2, ..., v_n form an orthogonal basis of R^n.

Deriving the SVD, continued

Use the extended basis:

    X [v_1 v_2 ... v_n] = [σ_1 u_1  σ_2 u_2  ...  σ_r u_r  X v_{r+1}  ...  X v_n]

Claim: X v_j = 0, j = r + 1 : n (Homework: verify it!). Thus,

    X [v_1 v_2 ... v_n] = [σ_1 u_1  σ_2 u_2  ...  σ_r u_r  0  ...  0]

Let V := [v_1 v_2 ... v_n] and D = diag(σ_1, ..., σ_r). So,

    X V = [u_1 u_2 ... u_r] [ D  0_{r×(n−r)} ]

Thus

    X = [u_1 u_2 ... u_r] [ D  0_{r×(n−r)} ] V^T

Deriving the SVD, continued

Finally, to make the u_i matrix a square matrix, we extend u_1, u_2, ..., u_r to an orthogonal basis u_1, u_2, ..., u_m.

Define U := [u_1 u_2 ... u_m] and fill 0s under [ D  0_{r×(n−r)} ]:

    X = U [ D  0_{r×(n−r)} ; 0_{(m−r)×n} ] V^T

We are done!
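To see the rank-deficient case numerically, here is a small sketch (sizes and construction are illustrative): building X as a product of an m×r and an r×n factor makes rank(X) = r with probability 1, and svd then shows exactly r nonzero singular values, with D padded by zeros as above:

    % Sketch: a rank-deficient X; D is padded with zeros as above.
    m = 8; n = 5; r = 2;
    X = rand(m,r)*rand(r,n);      % rank r < n with probability 1
    [U,D,V] = svd(X);             % U is m-by-m, D is m-by-n, V is n-by-n
    diag(D)'                      % first r values are O(1); the rest ~ 1e-16
    rank(X)                       % = r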

Back to column vector form

Put U and V back into column vector form in the SVD

    X = U [ D  0_{r×(n−r)} ; 0_{(m−r)×n} ] V^T

We have

    X = [u_1 ... u_m] [ D  0_{r×(n−r)} ; 0_{(m−r)×n} ] [ v_1^T ; v_2^T ; ... ; v_n^T ]

So,

    X = [σ_1 u_1 ... σ_r u_r  0 ... 0] [ v_1^T ; v_2^T ; ... ; v_n^T ] = Σ_{i=1}^r σ_i u_i v_i^T
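The rank-one expansion can be checked term by term; a minimal sketch (sizes illustrative):

    % Sketch: X equals the sum of the rank-one terms sigma_i*u_i*v_i'.
    m = 7; n = 5;
    X = rand(m,n);
    [U,D,V] = svd(X);
    Xsum = zeros(m,n);
    for i = 1:n
        Xsum = Xsum + D(i,i)*U(:,i)*V(:,i)';   % accumulate rank-one pieces
    end
    norm(X - Xsum,'fro')                       % ~ machine precision

Truncating the sum after the k largest terms gives a rank-k approximation of X, which is the idea behind the dimension reduction that follows.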

Dimension reduction

Assume we have a collection of data given as column vectors a_1, a_2, ..., a_m of R^n.

When n is large, we want to reduce the dimension of the data. How could this be possible?

This is possible in the case that all data vectors actually come from a subspace U of small dimension (say, r).

Find an orthogonal set of vectors u_1, ..., u_r that forms a basis for U!

If we can do so, then

    a_i = f_{i1} u_1 + f_{i2} u_2 + ... + f_{ir} u_r,   i = 1 : m

for some coefficients f_{ij}.

Note that f_{ij} = a_i^T u_j, the projection of a_i along the direction of u_j.
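In matrix form, with A := [a_1 ... a_m] and U_r := [u_1 ... u_r], the coefficients f_{ij} sit in F := U_r^T A. A minimal sketch (the data here are synthetic, drawn from a random r-dimensional subspace):

    % Sketch: coefficients f_ij = a_i'*u_j reproduce the data exactly
    % when the a_i really lie in an r-dimensional subspace.
    n = 10; r = 3; m = 20;
    B = orth(rand(n,r));      % orthonormal basis of a random r-dim subspace
    A = B*rand(r,m);          % data vectors a_i as columns, all in the subspace
    [U,~,~] = svd(A,'econ');
    Ur = U(:,1:r);            % first r left singular vectors span the subspace
    F = Ur'*A;                % F(j,i) = a_i'*u_j
    norm(A - Ur*F,'fro')      % ~ machine precision: a_i = sum_j F(j,i)*u_j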

How to find the first principal component?

Let us find u_1. A good choice of u_1: the data vectors have the greatest total projection along the direction of u_1:

    max_{‖w‖ = 1} Σ_{i=1}^m |a_i^T w|^2

Note that, writing A := [a_1 a_2 ... a_m],

    Σ_{i=1}^m |a_i^T w|^2 = ‖A^T w‖^2 = (A^T w)^T (A^T w) = w^T A A^T w

So,

    max_{‖w‖ = 1} Σ_{i=1}^m |a_i^T w|^2 = max_{‖w‖ = 1} w^T (A A^T) w = max_{w ≠ 0} (w^T (A A^T) w) / (w^T w)

Here comes the SVD

Let us use the SVD of A: A = U D V^T. Then

    A A^T = U D V^T V D^T U^T = U D D^T U^T

and

    (w^T (A A^T) w) / (w^T w) = ((U^T w)^T D D^T (U^T w)) / ((U^T w)^T (U^T w))

Let x := U^T w and notice that there are only r non-zero entries σ_1, ..., σ_r in D. We have

    (w^T (A A^T) w) / (w^T w) = (σ_1^2 x_1^2 + σ_2^2 x_2^2 + ... + σ_r^2 x_r^2) / (x_1^2 + x_2^2 + ... + x_n^2)

Finally...

We have

    max_{w ≠ 0} (w^T (A A^T) w) / (w^T w) = max_{x ≠ 0} (σ_1^2 x_1^2 + σ_2^2 x_2^2 + ... + σ_r^2 x_r^2) / (x_1^2 + x_2^2 + ... + x_n^2)

Assume σ_1 ≥ σ_2 ≥ ... ≥ σ_r. Then

    max_{x ≠ 0} (σ_1^2 x_1^2 + σ_2^2 x_2^2 + ... + σ_r^2 x_r^2) / (x_1^2 + x_2^2 + ... + x_n^2) = σ_1^2 = λ_1

This is the largest eigenvalue of A A^T.

A vector x that attains the maximum is given by x_1 = 1 and x_i = 0 for i = 2 : n. This corresponds to w = U x = u_1.

So, the first principal component is indeed achieved by the first eigenvector u_1 of A A^T.
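A quick numerical check of this conclusion (random data, so only illustrative): u_1 from the SVD attains σ_1^2 on the Rayleigh quotient, and a random unit vector does no better:

    % Sketch: u_1 maximizes w'*(A*A')*w over unit vectors w.
    n = 5; m = 50;
    A = rand(n,m);
    [U,D,~] = svd(A,'econ');
    u1 = U(:,1);
    u1'*(A*A')*u1                  % equals sigma_1^2 ...
    D(1,1)^2                       % ... the largest eigenvalue of A*A'
    w = randn(n,1); w = w/norm(w); % any other unit vector
    w'*(A*A')*w                    % <= sigma_1^2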

How about the 2nd, 3rd, ... principal components?

Using the same idea, the second principal component is the direction that is orthogonal to the first and maximizes the total projection:

    max_{‖w‖ = 1, w^T u_1 = 0} Σ_{i=1}^m |a_i^T w|^2

This is the same as

    max_{w ≠ 0, w^T u_1 = 0} (w^T (A A^T) w) / (w^T w)
      = max_{x ≠ 0, x^T U^T u_1 = 0} (σ_1^2 x_1^2 + σ_2^2 x_2^2 + ... + σ_r^2 x_r^2) / (x_1^2 + x_2^2 + ... + x_n^2)
      = max_{x ≠ 0, x_1 = 0} (σ_1^2 x_1^2 + σ_2^2 x_2^2 + ... + σ_r^2 x_r^2) / (x_1^2 + x_2^2 + ... + x_n^2)
      = σ_2^2 = λ_2

This is the second largest eigenvalue of A A^T.

Finding the 2nd, 3rd, ... principal components

A vector x that attains the maximum is given by x_2 = 1 and x_i = 0 for i = 1, 3 : n. This corresponds to w = U x = u_2.

So, the second principal component is indeed achieved by the second eigenvector u_2 of A A^T.

Now you see the pattern: u_i is the i-th principal component.
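Since the pattern gives all components at once, the principal components are just the columns of U from the SVD of A, equivalently the eigenvectors of A*A'. A minimal check (random data, illustrative only):

    % Sketch: columns of U from svd(A) match the eigenvectors of A*A'.
    n = 4; m = 30;
    A = rand(n,m);
    [U,D,~] = svd(A,'econ');
    [W,L] = eig(A*A');
    [lam,idx] = sort(diag(L),'descend');
    W = W(:,idx);
    abs(U'*W)                 % ~ identity: same directions up to sign
    diag(D).^2 - lam          % sigma_i^2 = lambda_i, up to roundoff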

Sample code

Let us randomly choose p = 100 points on a line in 3-dimensional space.

    p=100;r=rand(1,p);x=2*r-1;y=r+2;z=3-r;
    scatter3(x,y,z,3*ones(1,p),1:p);colorbar;view(-30,10);

We want to find the first principal component. First, we center the data by removing the mean of all points.

    x=x-mean(x);y=y-mean(y);z=z-mean(z);
    hold on;scatter3(x,y,z,3*ones(1,p),1:p);

Now, we use PCA to find the principal direction of these points:

    A=[x;y;z];[U,S,V]=svds(A);
    scatter3(x,y,z,3*ones(1,p),U(:,1)'*A);

Color is according to the projection onto the first principal direction.

The m-file and plot

    p=100;r=rand(1,p);x=2*r-1;y=r+2;z=3-r;       % p random points on a line in R^3
    scatter3(x,y,z,3*ones(1,p),2*[1:p]/p-1);view(-30,10);
    x=x-mean(x);y=y-mean(y);z=z-mean(z);         % center the data
    A=[x;y;z];[U,S,V]=svds(A);                   % SVD of the 3-by-p data matrix
    hold on;
    scatter3(x,y,z,3*ones(1,p),U(:,1)'*A);       % color by projection onto u_1

Images as column vectors

Assume we have a set of n images of the same size (say, p × q): I_1, I_2, ..., I_n.

We transform the p × q image matrix I_i to a column vector of size m := pq by stacking its columns. For simplicity, we use the same notation to denote the column vector.

Now, as a standard step, we take out the mean image Ī := (1/n) Σ_{i=1}^n I_i from every image:

    Ĩ_i := I_i − Ī,   i = 1 : n

This step guarantees the Ĩ_i are centered at the origin (and hence contained in a true subspace).

Form the m × n data matrix

    X := [Ĩ_1  Ĩ_2  ...  Ĩ_n]
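A minimal eigenfaces-style sketch along these lines; the image stack imgs below is a random stand-in, and the variable names and sizes are illustrative, not from the slides:

    % Sketch: principal components ("eigenfaces") of a stack of images.
    imgs = rand(12,10,30);              % stand-in for n = 30 grayscale p-by-q images
    [p,q,n] = size(imgs);
    X0 = reshape(double(imgs), p*q, n); % each column is one vectorized image
    Ibar = mean(X0,2);                  % the mean image
    X = X0 - Ibar;                      % centered images (implicit expansion, R2016b+)
    [U,S,V] = svds(X, 10);              % top 10 principal components (assumes n >= 10)
    eigface1 = reshape(U(:,1), p, q);   % first principal component, back in image shape
    coeffs = U'*X;                      % coordinates of each image in this basis

Recognition then typically compares the coefficient vector of a new (centered) image against those of the stored images, for example by nearest neighbor.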
