Title: Manifold learning: Laplacian Eigenmaps
1. Manifold learning: Laplacian Eigenmaps
Jieping Ye, Department of Computer Science and Engineering, Arizona State University
http://www.public.asu.edu/~jye02
2. Overview
- Isomap and LLE
  - Local geometry derived from k-nearest neighbors
  - Require dense data points on the manifold for good estimation
- Isomap
  - Global approach
  - Preserves the geodesic distance
- LLE
  - Local approach
  - Preserves linear combination weights
3. Outline of lecture
- Laplacian Eigenmaps
  - Problem definition
  - Algorithms
  - Justification
- Locality preserving projection (LPP)
4. Problem definition
The optimality of the low-dimensional embedding will become clear later.
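Following Belkin and Niyogi's formulation: given $k$ points $x_1, \dots, x_k \in \mathbb{R}^\ell$, assumed to lie on or near a low-dimensional manifold $\mathcal{M}$ embedded in $\mathbb{R}^\ell$, find $y_1, \dots, y_k \in \mathbb{R}^m$ with $m \ll \ell$ such that $y_i$ represents $x_i$.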
5. Laplacian Eigenmaps
- "Laplacian Eigenmaps for Dimensionality Reduction and Data Representation", M. Belkin and P. Niyogi, Neural Computation, 15(6):1373-1396, June 2003.
- Key steps:
  - Build the adjacency graph
  - Choose the weights for edges in the graph
  - Eigen-decomposition of the graph Laplacian
  - Form the low-dimensional embedding
6. Step 1: Adjacency graph construction
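The paper offers two ways to decide which nodes $i$ and $j$ to connect: $\varepsilon$-neighborhoods, adding an edge whenever $\|x_i - x_j\|^2 < \varepsilon$, or $k$ nearest neighbors, connecting $i$ and $j$ if $i$ is among the $k$ nearest neighbors of $j$ or vice versa (which makes the relation symmetric).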
7. Step 2: Choosing the weights
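Again two choices from the paper: the heat kernel, $W_{ij} = e^{-\|x_i - x_j\|^2 / t}$ for connected nodes and $W_{ij} = 0$ otherwise, or the simple-minded variant $W_{ij} = 1$ for connected nodes (the $t \to \infty$ limit), which avoids having to select $t$.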
8. Step 3: Eigen-decomposition
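With the degree matrix $D_{ii} = \sum_j W_{ji}$ and the graph Laplacian $L = D - W$, solve the generalized eigenvalue problem $L f = \lambda D f$, ordering the solutions $0 = \lambda_0 \le \lambda_1 \le \dots \le \lambda_{k-1}$.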
9. Step 4: Embedding
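The embedding discards the constant eigenvector $f_0$ (eigenvalue 0) and maps $x_i \mapsto (f_1(i), \dots, f_m(i))$. A minimal NumPy/SciPy sketch of all four steps follows; it assumes the k-nearest-neighbor rule with heat-kernel weights, and the parameter names (k, t, n_components) are illustrative, not from the slides.

import numpy as np
from scipy.linalg import eigh

def laplacian_eigenmaps(X, k=10, t=1.0, n_components=2):
    """X: (n_samples, n_features) data; returns the (n_samples, n_components) embedding."""
    n = X.shape[0]
    # Step 1: adjacency graph from k nearest neighbors, symmetrized.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    idx = np.argsort(d2, axis=1)[:, 1:k + 1]         # position 0 is the point itself
    A = np.zeros((n, n), dtype=bool)
    A[np.repeat(np.arange(n), k), idx.ravel()] = True
    A = A | A.T                                      # connect i,j if either is a neighbor
    # Step 2: heat-kernel weights on the edges, zero elsewhere.
    W = np.where(A, np.exp(-d2 / t), 0.0)
    # Step 3: generalized eigenproblem L f = lambda D f, with L = D - W.
    D = np.diag(W.sum(axis=1))
    vals, vecs = eigh(D - W, D)                      # eigenvalues in ascending order
    # Step 4: skip the constant eigenvector (lambda_0 = 0), keep the next m.
    return vecs[:, 1:n_components + 1]

# Example: a noisy circle in R^3 embedded into R^2.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 300)
X = np.c_[np.cos(theta), np.sin(theta), 0.05 * rng.standard_normal(300)]
print(laplacian_eigenmaps(X, k=8, t=0.5).shape)      # (300, 2)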
10. Laplacian Eigenmaps and spectral clustering
- They involve the same computations.
- Laplacian Eigenmaps only computes the embedding, i.e., the dimension reduction.
- Spectral clustering not only computes the embedding, but also computes a clustering in the embedded space.
11. Justification
Consider the problem of mapping the weighted graph to a line so that pairs of points with large similarity (weight) stay as close together as possible. A reasonable criterion for choosing the map $y = (y_1, \dots, y_k)^T$ is to minimize $\sum_{i,j} (y_i - y_j)^2 W_{ij}$, which incurs a heavy penalty when neighboring points are mapped far apart.
12. Justification
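Following the paper's calculation, with $D_{ii} = \sum_j W_{ij}$ and $L = D - W$:

$\frac{1}{2} \sum_{i,j} (y_i - y_j)^2 W_{ij} = \sum_i y_i^2 D_{ii} - \sum_{i,j} y_i y_j W_{ij} = y^T L y.$

The constraint $y^T D y = 1$ removes an arbitrary scaling, so the minimizer is the eigenvector of $L f = \lambda D f$ with the smallest nonzero eigenvalue; the constant vector $\mathbf{1}$ (eigenvalue 0) is excluded because it maps all points to a single value.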
13. General embedding
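For an $m$-dimensional embedding, stack the coordinates of point $i$ as row $Y_i$ of $Y \in \mathbb{R}^{k \times m}$. The criterion generalizes (per the paper) to minimizing $\frac{1}{2} \sum_{i,j} \|Y_i - Y_j\|^2 W_{ij} = \mathrm{tr}(Y^T L Y)$ subject to $Y^T D Y = I$, and the solution is given by the generalized eigenvectors $f_1, \dots, f_m$ for the $m$ smallest nonzero eigenvalues of $L f = \lambda D f$.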
14. Overview of spectral clustering
15. Overview of spectral clustering
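The connection in code: spectral clustering reuses the same embedding and adds a clustering step in the embedded space. This sketch calls the laplacian_eigenmaps helper defined earlier; k-means is one common choice for the final step, not the only one.

from sklearn.cluster import KMeans

def spectral_clustering(X, n_clusters=3, k=10, t=1.0):
    # Same computation as Laplacian Eigenmaps ...
    Y = laplacian_eigenmaps(X, k=k, t=t, n_components=n_clusters)
    # ... followed by clustering in the embedded space.
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(Y)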
16. The Laplace-Beltrami operator
17. The Laplace-Beltrami operator
graph Laplacian (discrete) ↔ Laplace-Beltrami operator (continuous)
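The correspondence, as the paper develops it: the graph Laplacian acting on a function $f$ defined on the vertices, $Lf(i) = \sum_j W_{ij} (f(i) - f(j))$, is the empirical counterpart of the Laplace-Beltrami operator $\Delta f = -\mathrm{div}(\nabla f)$ on the manifold $\mathcal{M}$. Minimizing $y^T L y$ is the discrete analogue of minimizing $\int_{\mathcal{M}} \|\nabla f\|^2$, whose minimizers are eigenfunctions of $\Delta$; the heat-kernel weights arise because the heat kernel on $\mathcal{M}$ is approximately Gaussian for small $t$.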
18. Connection with LLE
LLE computes the eigenvectors of $(I - W)^T (I - W)$, where $W$ is the matrix of LLE reconstruction weights. It can be shown that, under certain conditions, $(I - W)^T (I - W) f \approx \frac{1}{2} \mathcal{L}^2 f$, so the eigenvectors computed by LLE approximate eigenfunctions of the iterated Laplacian.
19. An example
20. Locality preserving projection
- Face recognition:
  - Laplacianface (LPP)
  - Eigenface (PCA)
  - Fisherface (LDA)
- PCA and LDA apply global dimension reduction.
- LPP aims to preserve the local structure of the data.
- LPP applies the key idea from Laplacian Eigenmaps.
21. LPP
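Concretely, following He and Niyogi's LPP formulation (with samples as the columns of $X$): find a projection vector $a$, giving $y_i = a^T x_i$, that minimizes the same graph objective $\frac{1}{2} \sum_{i,j} (a^T x_i - a^T x_j)^2 W_{ij} = a^T X L X^T a$ subject to $a^T X D X^T a = 1$. The minimizers are the bottom generalized eigenvectors of $X L X^T a = \lambda X D X^T a$.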
22. LPP
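A minimal LPP sketch under the conventions of the earlier code (rows of X are samples, so the $X L X^T$ above becomes $X^T L X$ here); the regularization term is an added assumption for numerical stability, not part of the original formulation.

import numpy as np
from scipy.linalg import eigh

def lpp(X, W, n_components=2, reg=1e-6):
    """X: (n_samples, n_features); W: affinity matrix as in Laplacian Eigenmaps."""
    D = np.diag(W.sum(axis=1))
    L = D - W
    A = X.T @ L @ X                              # the objective's quadratic form
    B = X.T @ D @ X + reg * np.eye(X.shape[1])   # regularized to stay positive definite
    vals, vecs = eigh(A, B)                      # ascending generalized eigenvalues
    return vecs[:, :n_components]                # columns are projection directions

# Unlike Laplacian Eigenmaps, the map is explicit and linear:
# P = lpp(X, W); any new point x is embedded as x @ P.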
23. Laplacian Eigenmaps versus LPP
- Both apply the same idea to compute a low-dimensional representation.
- Laplacian Eigenmaps does not form an explicit transformation; the embedding is defined only at the training points.
- LPP computes an explicit linear transformation, so new points can be projected directly.
24. Next class
- Topics
  - Nyström's method
  - A unified view of manifold learning
- Readings
  - "Geometric Methods for Feature Extraction and Dimensional Reduction", http://www.public.asu.edu/~jye02/CLASSES/Fall-2005/PAPERS/Burge-featureextraction.pdf
  - "Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering", http://www.iro.umontreal.ca/~lisa/pointeurs/bengio_extension_nips_2003.pdf