Manifold learning: Laplacian Eigenmaps
1
Manifold learning: Laplacian Eigenmaps
Jieping Ye, Department of Computer Science and Engineering, Arizona State University
http://www.public.asu.edu/jye02
2
Overview
  • Isomap and LLE
  • Local geometry is derived from k-nearest neighbors
  • Both require densely sampled data points on the manifold for good estimation
  • Isomap
  • Global approach: preserves geodesic distances
  • LLE
  • Local approach: preserves linear reconstruction weights

3
Outline of lecture
  • Laplacian Eigenmaps
  • Problem definition
  • Algorithms
  • Justification
  • Locality preserving projection (LPP)

4
Problem definition
Given points x1, ..., xk sampled from a manifold M embedded in R^l, find points y1, ..., yk in R^m (m << l) that represent the xi while preserving local structure. The sense in which this low-dimensional embedding is optimal will be clear later.
5
Laplacian Eigenmaps
  • Laplacian Eigenmaps for Dimensionality Reduction
    and Data Representation  
  • M. Belkin, P. Niyogi
  • Neural Computation, June 2003; 15(6): 1373-1396.
  • Key steps
  • Build the adjacency graph
  • Choose the weights for edges in the graph
  • Eigen-decomposition of the graph laplacian
  • Form the low-dimensional embedding

6
Step 1: Adjacency graph construction
Put an edge between nodes i and j if xi and xj are close: either ||xi - xj||^2 < ε (ε-neighborhoods) or node i is among the k nearest neighbors of node j (or vice versa).
7
Step 2: Choosing the weights
Heat kernel: Wij = exp(-||xi - xj||^2 / t) if nodes i and j are connected, and Wij = 0 otherwise; a simple-minded alternative sets Wij = 1 for connected nodes.
8
Step 3: Eigen-decomposition
Form the diagonal degree matrix Dii = Σj Wji and the graph Laplacian L = D - W, then solve the generalized eigenvalue problem L f = λ D f.
9
Step 4: Embedding
Order the eigenvectors f0, ..., fk-1 by eigenvalue, 0 = λ0 ≤ λ1 ≤ ... ≤ λk-1. Discard f0 (the constant eigenvector) and embed xi ↦ (f1(i), ..., fm(i)).
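The four steps above can be sketched in Python. This is a minimal dense-matrix illustration with NumPy/SciPy; the function name, parameter defaults, and toy data are assumptions for the sketch, not from the slides:

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

def laplacian_eigenmaps(X, n_components=2, k=5, t=1.0):
    """Sketch of the four steps: kNN graph, heat-kernel weights,
    generalized eigenproblem L f = lam D f, then the embedding."""
    n = X.shape[0]
    D2 = cdist(X, X, 'sqeuclidean')
    # Step 1: adjacency graph via k nearest neighbors (symmetrized)
    idx = np.argsort(D2, axis=1)[:, 1:k + 1]   # column 0 is the point itself
    A = np.zeros((n, n), dtype=bool)
    A[np.repeat(np.arange(n), k), idx.ravel()] = True
    A |= A.T
    # Step 2: heat-kernel weights on the edges
    W = np.where(A, np.exp(-D2 / t), 0.0)
    # Step 3: eigen-decomposition of the graph Laplacian L = D - W
    Dg = np.diag(W.sum(axis=1))
    L = Dg - W
    vals, vecs = eigh(L, Dg)                   # solves L f = lam D f
    # Step 4: drop the constant eigenvector (lam = 0), keep the next m
    return vecs[:, 1:n_components + 1]

# toy usage: a noisy circle in 3-D mapped to 2-D
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 60)
X = np.column_stack([np.cos(theta), np.sin(theta),
                     0.1 * rng.normal(size=60)])
Y = laplacian_eigenmaps(X, n_components=2, k=6)
print(Y.shape)  # (60, 2)
```

For clarity this uses dense matrices; on large data one would use sparse graphs and a sparse eigensolver instead.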
10
Laplacian Eigenmaps and spectral clustering
  • They involve the same computations.
  • Laplacian Eigenmaps only computes the embedding, i.e., the dimension reduction.
  • Spectral clustering not only computes the embedding, but also clusters the data in the embedded space.
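The extra spectral-clustering step can be sketched as follows: given an embedding Y (for instance from Laplacian Eigenmaps), run a clustering algorithm in the embedded space. The tiny k-means below and the stand-in data are illustrative assumptions, not from the slides:

```python
import numpy as np

def kmeans(Y, k, iters=50, seed=0):
    """Minimal Lloyd's k-means on the embedded points Y (n x m)."""
    rng = np.random.default_rng(seed)
    centers = Y[rng.choice(len(Y), size=k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest center
        labels = np.argmin(((Y[:, None, :] - centers[None]) ** 2).sum(-1),
                           axis=1)
        # move each center to the mean of its assigned points
        for c in range(k):
            if np.any(labels == c):
                centers[c] = Y[labels == c].mean(axis=0)
    return labels

# stand-in for an embedding: two well-separated 2-D blobs
rng = np.random.default_rng(0)
Y = np.vstack([rng.normal(0.0, 0.1, size=(20, 2)),
               rng.normal(3.0, 0.1, size=(20, 2))])
labels = kmeans(Y, k=2)
print(labels.shape)  # (40,)
```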

11
Justification
Consider the problem of mapping the graph to a line so that pairs of points with large similarity (weight) stay as close as possible. A reasonable criterion for choosing the mapping y = (y1, ..., yn) is to minimize Σij (yi - yj)^2 Wij.
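This criterion reduces to a quadratic form in the graph Laplacian: with symmetric weights W, Σij Wij (yi - yj)^2 = 2 y^T L y, where L = D - W. A quick NumPy check of the identity (the random W and y are illustrative):

```python
import numpy as np

# verify: sum_ij W_ij (y_i - y_j)^2 = 2 y^T L y, with L = D - W
rng = np.random.default_rng(1)
W = rng.random((5, 5))
W = (W + W.T) / 2                       # weights must be symmetric
y = rng.normal(size=5)
L = np.diag(W.sum(axis=1)) - W          # graph Laplacian
lhs = sum(W[i, j] * (y[i] - y[j]) ** 2
          for i in range(5) for j in range(5))
rhs = 2 * y @ L @ y
print(np.isclose(lhs, rhs))  # True
```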
12
Justification
13
General embedding
14
Overview of spectral clustering
15
Overview of spectral clustering
16
The Laplace Beltrami Operator
17
The Laplace Beltrami Operator
discrete: graph Laplacian ↔ continuous: Laplace-Beltrami operator
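The discrete-continuous correspondence can be illustrated on a 1-D grid graph, where the graph Laplacian reduces to (minus) the standard second-difference operator. The grid setup below is an illustrative assumption:

```python
import numpy as np

# On a 1-D grid graph with unit weights, the graph Laplacian acts like
# the negative second derivative: (L f)_i = 2 f_i - f_{i-1} - f_{i+1}
# ≈ -h^2 f''(x_i) at interior nodes.
n, h = 200, 0.01
x = np.arange(n) * h
f = np.sin(x)                    # test function, -f'' = sin
W = np.zeros((n, n))
idx = np.arange(n - 1)
W[idx, idx + 1] = W[idx + 1, idx] = 1.0   # path graph, unit weights
L = np.diag(W.sum(axis=1)) - W
interior = slice(1, n - 1)
approx = (L @ f)[interior] / h ** 2       # ≈ -f''(x) = sin(x)
print(np.allclose(approx, np.sin(x[interior]), atol=1e-3))  # True
```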
18
Connection with LLE
LLE computes the eigenvectors of (I - W)^T (I - W).
It can be shown under certain conditions that (I - W)^T (I - W) ≈ (1/2) L^2, so the eigenvectors found by LLE approximate those of the graph Laplacian.
19
An example
20
Locality preserving projection
  • Face recognition
  • Laplacianface (LPP)
  • Eigenface (PCA)
  • Fisherface (LDA)
  • PCA and LDA apply global dimension reduction.
  • LPP aims to preserve the local structure of the data.
  • Applies the key idea from Laplacian Eigenmaps.

21
LPP
22
LPP
23
Laplacian Eigenmaps versus LPP
  • Both apply the same idea for computing a low-dimensional representation.
  • Laplacian Eigenmaps does not form an explicit transformation; it only produces the embedding of the training points.
  • LPP computes an explicit linear transformation, so new points can be projected directly.
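The linear variant can be sketched the same way: LPP builds the same graph and weights as Laplacian Eigenmaps, but solves for a projection matrix a via the generalized problem X L X^T a = λ X D X^T a (data points as columns of X). The function below is a minimal dense sketch under those assumptions; the names, regularization, and toy data are not from the slides:

```python
import numpy as np
from scipy.linalg import eigh

def lpp(X, n_components=2, k=5, t=1.0):
    """Locality Preserving Projection sketch: kNN graph + heat-kernel
    weights, then solve X L X^T a = lam X D X^T a for a linear map a."""
    n = X.shape[1]                                   # points are columns
    D2 = ((X[:, :, None] - X[:, None, :]) ** 2).sum(axis=0)
    idx = np.argsort(D2, axis=1)[:, 1:k + 1]
    A = np.zeros((n, n), dtype=bool)
    A[np.repeat(np.arange(n), k), idx.ravel()] = True
    A |= A.T
    W = np.where(A, np.exp(-D2 / t), 0.0)
    Dg = np.diag(W.sum(axis=1))
    L = Dg - W
    # smallest generalized eigenvectors give the projection directions
    M1 = X @ L @ X.T
    M2 = X @ Dg @ X.T + 1e-9 * np.eye(X.shape[0])    # regularize for stability
    vals, vecs = eigh(M1, M2)
    return vecs[:, :n_components]                    # y = a^T x for new x

rng = np.random.default_rng(0)
X = rng.normal(size=(3, 50))          # 3-D data, 50 points as columns
A_proj = lpp(X, n_components=2, k=6)
print(A_proj.shape)  # (3, 2)
```

Because the result is a matrix rather than a fixed embedding, any new point x can be mapped by A_proj.T @ x, which is exactly the out-of-sample advantage listed above.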

24
Next class
  • Topics
  • Nyström method
  • A unified view of manifold learning
  • Readings
  • Geometric Methods for Feature Extraction and Dimensional Reduction
  • http://www.public.asu.edu/jye02/CLASSES/Fall-2005/PAPERS/Burge-featureextraction.pdf
  • Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering
  • http://www.iro.umontreal.ca/lisa/pointeurs/bengio_extension_nips_2003.pdf