Recitation: SVD and dimensionality reduction



1
Recitation: SVD and dimensionality reduction
Zhenzhen Kou Thursday, April 21, 2005
2
SVD
  • Intuition: find the axis that shows the greatest
    variation, and project all points onto this axis

(Figure: 2-D data points with original axes f1, f2 and principal axes e1, e2)
3
SVD: Mathematical Background
4
SVD: The mathematical formulation
  • Let X be the M x N matrix of M N-dimensional
    points
  • SVD decomposition: X = U S V^T
  • U (M x M)
  • U is orthogonal: U^T U = I
  • columns of U are the orthonormal eigenvectors of
    XX^T
  • called the left singular vectors of X
  • V (N x N)
  • V is orthogonal: V^T V = I
  • columns of V are the orthonormal eigenvectors of
    X^T X
  • called the right singular vectors of X
  • S (M x N)
  • diagonal matrix with r non-zero values on the
    diagonal, in descending order
  • these are the square roots of the eigenvalues of
    XX^T (or X^T X)
  • r is the rank of X (and of XX^T and X^T X)
  • called the singular values
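The properties above can be checked numerically. A minimal NumPy sketch, using a small made-up matrix (the data values are illustrative):

```python
import numpy as np

# Hypothetical example: M = 4 points in N = 3 dimensions.
X = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.1],
              [0.5, 1.1, 1.4],
              [3.0, 5.9, 9.2]])

# full_matrices=True gives U (M x M) and V^T (N x N), as on the slide.
U, s, Vt = np.linalg.svd(X, full_matrices=True)

# U and V are orthogonal: U^T U = I, V^T V = I.
assert np.allclose(U.T @ U, np.eye(U.shape[0]))
assert np.allclose(Vt @ Vt.T, np.eye(Vt.shape[0]))

# Singular values are the square roots of the eigenvalues of X^T X.
eigvals = np.linalg.eigvalsh(X.T @ X)[::-1]          # descending order
assert np.allclose(s, np.sqrt(np.clip(eigvals, 0, None)))

# Reconstruct X: embed s in an M x N "diagonal" matrix S.
S = np.zeros(X.shape)
np.fill_diagonal(S, s)
assert np.allclose(X, U @ S @ Vt)
```

`np.linalg.svd` returns the singular values already sorted in descending order, matching the convention on the slide.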

5
SVD - Interpretation
6
SVD - Interpretation
  • X = U S V^T: an example

7
SVD - Interpretation
  • X = U S V^T: an example

(Figure: the first singular value captures the variance (spread) along the v1 axis)

8
SVD - Interpretation
  • X = U S V^T: an example
  • U S (= X V) gives the coordinates of the points in
    the projection axes

9
Dimensionality reduction
  • set the smallest singular values to zero

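Zeroing the smallest singular values yields the best rank-k approximation of X (the Eckart-Young theorem). A sketch with illustrative random data:

```python
import numpy as np

# Sketch: dimensionality reduction by setting the smallest
# singular values to zero, keeping only the top k.
rng = np.random.default_rng(3)
X = rng.standard_normal((6, 4))   # made-up data, 6 points in 4-D

U, s, Vt = np.linalg.svd(X, full_matrices=False)

k = 2
s_trunc = s.copy()
s_trunc[k:] = 0.0                 # zero out the smallest singular values

X_k = U @ np.diag(s_trunc) @ Vt   # rank-k approximation of X
assert np.linalg.matrix_rank(X_k) == k
```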

10
Dimensionality reduction

11
Dimensionality reduction

12
Dimensionality reduction

13
Dimensionality reduction

14
Dimensionality reduction
  • Equivalent: the spectral decomposition of the
    matrix

15
Dimensionality reduction
  • Equivalent: the spectral decomposition of the
    matrix

X = λ1 u1 v1^T + λ2 u2 v2^T + ...
16
Dimensionality reduction
  • spectral decomposition of the matrix

X = λ1 u1 v1^T + λ2 u2 v2^T + ... + λr ur vr^T
(a sum of r rank-1 terms, each the outer product of a column vector ui and a row vector vi^T)
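The rank-1 sum can be verified directly; a minimal sketch with illustrative random data:

```python
import numpy as np

# Sketch: rebuild X as a sum of r rank-1 terms lambda_i * u_i * v_i^T.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 4))   # made-up data

U, s, Vt = np.linalg.svd(X, full_matrices=False)
r = np.linalg.matrix_rank(X)

# np.outer(u_i, v_i) is the (column x row) outer product of the i-th
# left and right singular vectors.
X_sum = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(r))
assert np.allclose(X, X_sum)
```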
17
Dimensionality reduction
  • approximation / dim. reduction
  • by keeping the first few terms (Q: how many?)

X ≈ λ1 u1 v1^T + λ2 u2 v2^T + ... (first k terms)
assume λ1 ≥ λ2 ≥ ...
18
Dimensionality reduction
  • A heuristic: keep 80-90% of the "energy" (the sum
    of squares of the λi's)

assume λ1 ≥ λ2 ≥ ...
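The energy heuristic can be sketched as a small helper; `choose_k` and the sample singular values below are illustrative, not from the slides:

```python
import numpy as np

# Sketch of the 80-90% energy heuristic: keep the smallest k such that
# the first k squared singular values cover the target fraction.
def choose_k(singular_values, energy=0.9):
    sq = np.asarray(singular_values) ** 2
    cumulative = np.cumsum(sq) / sq.sum()   # ascending fractions of energy
    return int(np.searchsorted(cumulative, energy) + 1)

s = np.array([10.0, 6.0, 2.0, 0.5, 0.1])   # assumed sorted descending
# squared: 100, 36, 4, 0.25, 0.01 -> cumulative fractions ~0.71, 0.97, ...
print(choose_k(s, energy=0.9))  # 2 components already cover 90%
```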
19
Another example: Eigenfaces
  • The PCA problem in HW5
  • Face data X
  • Eigenvectors associated with the first few large
    eigenvalues of XX^T look like face images

20
Dimensionality reduction
  • Matrix V in the SVD decomposition
    (X = U S V^T) is used to transform the data.
  • XV (= U S) defines the transformed dataset.
  • For a new data element x, xV defines the
    transformed data.
  • Keeping the first k (k < n) dimensions amounts
    to keeping only the first k columns of V.
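The steps above can be sketched as follows; the data and the choice k = 2 are illustrative:

```python
import numpy as np

# Sketch: use V from X = U S V^T to project onto the first k dimensions.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 5))   # made-up dataset, 100 points in 5-D

U, s, Vt = np.linalg.svd(X, full_matrices=False)
V = Vt.T

k = 2
Vk = V[:, :k]                    # keep only the first k columns of V

# XV = US, so X @ Vk equals (U S) restricted to its first k columns.
X_transformed = X @ Vk
assert np.allclose(X_transformed, U[:, :k] * s[:k])

# A new data element x is transformed the same way.
x_new = rng.standard_normal(5)
x_transformed = x_new @ Vk       # k-dimensional representation
assert x_transformed.shape == (k,)
```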

21
Principal Components Analysis (PCA)
  • Center the dataset by subtracting the means;
    let matrix X be the result.
  • Compute the matrix X^T X.
  • This is the covariance matrix, up to a constant
    factor.
  • Project the dataset along a subset of the
    eigenvectors of X^T X.
  • Matrix V in the SVD decomposition
    (X = U S V^T) contains the eigenvectors of X^T X.
  • Also known as the Karhunen-Loève (K-L) transform.
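The PCA recipe above, done via the SVD; a sketch with made-up data (the scaling matrix just gives the axes different variances):

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.standard_normal((200, 3)) @ np.diag([3.0, 1.0, 0.1])

# 1. Center the data by subtracting the column means.
X = data - data.mean(axis=0)

# 2. X^T X is the covariance matrix up to a constant factor.
cov = X.T @ X / (X.shape[0] - 1)

# 3. The columns of V from X = U S V^T are the eigenvectors of X^T X,
#    so projecting onto the first k of them performs PCA.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
projected = X @ Vt[:k].T         # dataset in the top-k principal subspace

# Eigenvalues of the covariance matrix match s^2 / (n - 1).
eigvals = np.linalg.eigvalsh(cov)[::-1]
assert np.allclose(eigvals, s**2 / (X.shape[0] - 1))
```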