1
Manifold Learning Using Geodesic Entropic Graphs
  • Alfred O. Hero and Jose Costa
  • Dept. EECS, Dept. Biomed. Eng., Dept. Statistics
  • University of Michigan - Ann Arbor
    hero@eecs.umich.edu
  • http://www.eecs.umich.edu/hero

Research supported in part by ARO-DARPA MURI
DAAD19-02-1-0262
  1. Manifold Learning and Dimension Reduction
  2. Entropic Graphs
  3. Examples

2
1. Dimension Reduction and Pattern Matching
  • 128x128 images of three vehicles over 1 deg
    increments of 360 deg azimuth at 0 deg elevation
  • The 3 x 360 = 1080 images evolve on a lower
    dimensional embedded manifold in R^16384

Images: HMMWV, T62, Truck (courtesy of Center for Imaging Science, JHU)
3
Land Vehicle Image Manifold
Quantities of interest (the entropy in question is made precise below):
  • Embedding (extrinsic) dimension D
  • Manifold (intrinsic) dimension d
  • Entropy
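The entropy of interest in the entropic-graph method is the Rényi alpha-entropy of the sampling density (cf. the "alpha-entropy estimate" on the later slide). Its standard definition, stated here for reference rather than copied from the slide, is

  \[
    H_\alpha(f) \;=\; \frac{1}{1-\alpha}\,\log \int_{\mathcal{M}} f^{\alpha}(y)\, d\mu(y),
    \qquad 0 < \alpha < 1,
  \]

where f is the sampling density on the manifold and mu its volume measure; Shannon entropy is recovered in the limit as alpha tends to 1.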
4
Sampling on a Domain Manifold
Figure: a 2-dim domain manifold, its embedding, the sampling distribution, and the resulting statistical sample
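A minimal notational sketch of the sampling model the figure depicts (the symbols are assumptions, not taken from the slide): a d-dimensional domain is mapped into R^D by the embedding, and the observed sample is the image of points drawn from the sampling distribution,

  \[
    \varphi : \Omega \subset \mathbb{R}^{d} \;\to\; \mathcal{M} = \varphi(\Omega) \subset \mathbb{R}^{D},
    \qquad
    X_i = \varphi(Y_i), \quad Y_1,\dots,Y_n \ \text{i.i.d.} \sim f .
  \]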
5
Background on Manifold Learning
  1. Manifold intrinsic dimension estimation
    • Local KLE, Fukunaga, Olsen (1971)
    • Nearest neighbor algorithm, Pettis, Bailey, Jain, Dubes (1971)
    • Fractal measures, Camastra and Vinciarelli (2002)
    • Packing numbers, Kegl (2002)
  2. Manifold Reconstruction
    • Isomap-MDS, Tenenbaum, de Silva, Langford (2000)
    • Locally Linear Embeddings (LLE), Roweis, Saul (2000)
    • Laplacian eigenmaps (LE), Belkin, Niyogi (2002)
    • Hessian eigenmaps (HE), Grimes, Donoho (2003)
  3. Characterization of sampling distributions on manifolds
    • Statistics of directional data, Watson (1956), Mardia (1972)
    • Data compression on 3D surfaces, Kolarov, Lynch (1997)
    • Statistics of shape, Kendall (1984), Kent, Mardia (2001)

6
2. Entropic Graphs: A Planar Sample and its
Euclidean MST
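The figure itself is not reproduced here; the following is a minimal Python sketch (assuming numpy, scipy and matplotlib are available) of how such a planar sample and its Euclidean MST can be generated and drawn.

import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
pts = rng.uniform(size=(100, 2))              # a random planar sample on the unit square

dists = squareform(pdist(pts))                # pairwise Euclidean distances
mst = minimum_spanning_tree(dists).tocoo()    # sparse matrix of MST edges

plt.scatter(pts[:, 0], pts[:, 1], s=10)
for i, j in zip(mst.row, mst.col):            # draw each MST edge as a line segment
    plt.plot(pts[[i, j], 0], pts[[i, j], 1], "k-", lw=0.5)
plt.title("Planar sample and its Euclidean MST")
plt.show()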
7
MST and Geodesic MST
  • For a set of points X_n = {X_1, ..., X_n} in D-dimensional
    Euclidean space, the Euclidean MST with edge power weighting
    gamma is defined as
      L_gamma(X_n) = min_T sum_{e in T} |e|^gamma,
    where the |e| are the edge lengths of a spanning tree T over
    X_n (a Python sketch of this weighted length follows the
    bullets below)
  • When pairwise distances are geodesic distances on the
    manifold M, one obtains the Geodesic MST (GMST)
  • For dense samplings, GMST length ≈ MST length
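A minimal Python sketch of the power-weighted MST length defined above (an illustration, not the authors' code). Feeding shortest-path approximations of geodesic distances into the same routine yields the GMST length instead.

import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform

def mst_length(points, gamma=1.0, pairwise_dists=None):
    """Power-weighted MST length: sum over MST edges of |e|**gamma.

    If pairwise_dists is given (e.g. graph shortest-path estimates of
    geodesic distances), it is used in place of Euclidean distances,
    which gives the geodesic MST (GMST) length.
    """
    if pairwise_dists is None:
        pairwise_dists = squareform(pdist(np.asarray(points, dtype=float)))
    tree = minimum_spanning_tree(pairwise_dists)   # sparse matrix of MST edge weights
    return float(np.sum(tree.data ** gamma))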

8
Convergence of Euclidean MST
Beardwood, Halton, Hammersley Theorem
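The theorem appeared as an image on the slide; the standard statement for the power-weighted Euclidean MST, paraphrased here rather than reproduced, is: for X_1, ..., X_n i.i.d. with Lebesgue density f on R^d, d >= 2, and 0 < gamma < d,

  \[
    \lim_{n\to\infty} \frac{L_\gamma(\mathcal{X}_n)}{n^{(d-\gamma)/d}}
    \;=\; \beta_{d,\gamma} \int f(x)^{(d-\gamma)/d}\, dx
    \quad \text{a.s.},
  \]

where beta_{d,gamma} is a constant depending only on d and gamma.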
9
Convergence Theorem for GMST
Ref.: Costa & Hero, IEEE T-SP (2003)
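The statement was likewise given as an image; by analogy with the BHH limit and following the cited paper (the wording here is a paraphrase, not the slide's), for n samples drawn from a density f on a compact d-dimensional manifold M,

  \[
    \frac{L_\gamma(\mathrm{GMST})}{n^{(d-\gamma)/d}}
    \;\longrightarrow\; \beta_{d,\gamma} \int_{\mathcal{M}} f(y)^{(d-\gamma)/d}\, d\mu(y)
    \quad \text{a.s.}
  \]

The growth exponent thus reveals the intrinsic dimension d, and the limiting constant carries the alpha-entropy of f with alpha = (d - gamma)/d; normalizing by the wrong dimension drives the ratio to 0 or to infinity.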
10
Special Cases
  • Isometric embedding (distance preserving)
  • Conformal embedding (angle preserving)

11
Joint Estimation Algorithm
  • The convergence theorem suggests the log-linear model
      log L_n = a log n + b + error
  • Use bootstrap resampling to estimate the mean MST length,
    then apply least squares (LS) to jointly estimate the slope a
    and intercept b from the sequence of (log n, log length) pairs
  • Extract d and H from the slope and intercept (a sketch of the
    procedure is given below)
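A minimal Python sketch of this procedure under stated assumptions (it reuses the mst_length helper sketched earlier; the relation d = gamma/(1 - slope) follows from the exponent in the convergence theorem; the entropy step would also need the constant beta_{d,gamma} and is omitted). This is an illustration, not the authors' implementation.

import numpy as np

def estimate_dimension(X, gamma=1.0, sizes=None, n_boot=20, seed=0):
    """LS fit of log(mean MST length) against log(sample size);
    returns the implied intrinsic-dimension estimate with slope and intercept."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    n = len(X)
    if sizes is None:                       # a few subsample sizes up to n
        sizes = np.linspace(n // 4, n, 8, dtype=int)
    log_mean_len = []
    for p in sizes:
        # bootstrap estimate of the mean MST length at sample size p
        lengths = [mst_length(X[rng.choice(n, size=p, replace=False)], gamma)
                   for _ in range(n_boot)]
        log_mean_len.append(np.log(np.mean(lengths)))
    # least-squares fit of the log-linear model  log L ~ a*log n + b
    a, b = np.polyfit(np.log(sizes), log_mean_len, deg=1)
    d_hat = gamma / (1.0 - a)               # slope a = (d - gamma)/d
    # an alpha-entropy estimate would additionally use b and log(beta_{d,gamma})
    return d_hat, a, b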

12
3. Examples: Random Samples on the Swiss Roll
  • Ref.: Tenenbaum et al. (2000)
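A hypothetical usage of the sketches above on Swiss-roll samples (scikit-learn's generator is assumed to be available; this is not the experiment reported on the following slides).

from sklearn.datasets import make_swiss_roll

X, _ = make_swiss_roll(n_samples=1000, random_state=0)    # a 2-d manifold embedded in R^3
d_hat, slope, intercept = estimate_dimension(X, gamma=1.0)
print(f"estimated intrinsic dimension: {d_hat:.2f}")       # expect a value near 2 for dense samples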

13
Bootstrap Estimates of GMST Length
Bootstrap SE bar (83% CI)
14
log-log Linear Fit to GMST Length
15
Dimension and Entropy Estimates
  • From the LS fit find (paraphrased expressions below):
    • Intrinsic dimension estimate
    • Alpha-entropy estimate
  • Ground truth
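The expressions stripped from this slide can be reconstructed from the log-linear model; the following is a paraphrase consistent with the convergence theorem, not the slide's own notation. With LS slope and intercept estimates,

  \[
    \hat{a} = \frac{d-\gamma}{d} \;\Rightarrow\; \hat{d} = \frac{\gamma}{1-\hat{a}},
    \qquad
    \hat{H}_\alpha = \frac{\hat{d}}{\gamma}\Bigl(\hat{b} - \log\beta_{\hat{d},\gamma}\Bigr),
    \qquad \alpha = \frac{d-\gamma}{d}.
  \]

For the Swiss roll the ground-truth intrinsic dimension is d = 2.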

16
Dimension Estimation Comparisons
17
Application to Faces
  • Yale Face Database B
  • Photographic folios of many people's faces
  • Each face folio contains images at 585 different
    illumination/pose conditions
  • Subsampled to 64 x 64 pixels (4096 extrinsic
    dimensions)
  • Objective: determine the intrinsic dimension and
    entropy of a typical face folio (see the sketch below)
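A hypothetical sketch of that objective using the earlier routines; the folder path and file pattern below are placeholders, not the actual database layout, and Pillow is assumed to be available for image loading.

import numpy as np
from pathlib import Path
from PIL import Image

# Load one face folio: each image converted to grayscale, resized to 64x64
# and flattened into a 4096-dimensional vector (one row per condition).
folio_dir = Path("yale_folio_01/")                # placeholder path
X = np.stack([
    np.asarray(Image.open(p).convert("L").resize((64, 64)), dtype=float).ravel()
    for p in sorted(folio_dir.glob("*.pgm"))      # placeholder file pattern
])

d_hat, slope, intercept = estimate_dimension(X, gamma=1.0)
print(f"face folio: n={len(X)}, estimated intrinsic dimension {d_hat:.1f}")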

18
GMST for 3 Face Folios
Ref.: Costa & Hero (2003)
19
Conclusions
Advantages of Geodesic Entropic Graph Methods
  • Characterizing high-dimensional sampling distributions
    • Standard techniques (histogram, density estimation) fail
      due to the curse of dimensionality
    • Entropic graphs can be used to construct consistent
      estimators of entropy and information divergence
    • Robustification to outliers via pruning
  • Manifold learning and model reduction
    • LLE, LE, HE estimate d by finding a local linear
      representation of the manifold
    • Entropic graphs estimate d from global resampling
    • Computational complexity of the MST is only O(n log n)

20
References
  • A. O. Hero, B. Ma, O. Michel and J. D. Gorman,
    "Applications of entropic spanning graphs," IEEE Signal
    Processing Magazine, Sept. 2002.
  • H. Neemuchwala, A. O. Hero and P. Carson,
    "Entropic graphs for image registration," to appear in
    European Journal of Signal Processing, 2003.
  • J. Costa and A. O. Hero, "Manifold learning with
    geodesic minimal spanning trees," accepted in
    IEEE T-SP (Special Issue on Machine Learning), 2004.
  • A. O. Hero, J. Costa and B. Ma, "Convergence rates of
    minimal graphs with random vertices," submitted to
    IEEE T-IT, March 2001.
  • J. Costa, A. O. Hero and C. Vignat, "On solutions to
    multivariate maximum alpha-entropy problems," in Energy
    Minimization Methods in Computer Vision and Pattern
    Recognition (EMMCVPR), Eds. M. Figueiredo, A. Rangarajan,
    J. Zerubia, Springer-Verlag, 2003.