PCA Extension

Transcript and Presenter's Notes
1
PCA Extension
  • By Jonash

2
Outline
  • Robust PCA
  • Generalized PCA
  • Clustering points on a line
  • Clustering lines on a plane
  • Clustering hyperplanes in a space

3
Robust PCA
  • Robust Principal Component Analysis for Computer
    Vision
  • Fernando De la Torre
  • Michael J. Black
  • CS, Brown University

4
PCA is a Least-Squares Fit
5
PCA is a Least-Squares Fit
6
Robust Statistics
  • Recover the best fit for the majority of the data
  • Detect and reject outliers

7
Robust PCA
8
Robust PCA
9
Robust PCA
  • Training images

10
Robust PCA
  • Naïve PCA
  • Simply reject
  • Robust PCA

11
RPCA
  • In traditional PCA, we minimize
    $\sum_{i=1}^{n} \|d_i - B B^T d_i\|^2 = \sum_{i=1}^{n} \|d_i - B c_i\|^2$
  • EM PCA: $\lim_{\sigma \to 0}$ of the model $D = BC + \varepsilon$,
    $\varepsilon \sim \mathcal{N}(0, \sigma^2 I)$
  • E-step: $C = (B^T B)^{-1} B^T D$
  • M-step: $B = D C^T (C C^T)^{-1}$ (a numpy sketch of this iteration
    follows below)

[Figure: $d_i$ projected onto the subspace spanned by $B$; coefficients $c_i = B^T d_i$, reconstruction $B B^T d_i$]
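As a concrete illustration, here is a minimal numpy sketch of the alternating E/M steps above; the function name `em_pca`, the random initialization, and the final orthonormalization are my additions, not from the slides.

```python
import numpy as np

def em_pca(D, k, n_iter=100, seed=0):
    """E/M iteration for PCA in the zero-noise limit (sigma -> 0).

    D : (d, n) centered data, one observation per column; k : # components.
    """
    rng = np.random.default_rng(seed)
    B = rng.standard_normal((D.shape[0], k))   # random initial basis
    for _ in range(n_iter):
        C = np.linalg.solve(B.T @ B, B.T @ D)  # E-step: C = (B^T B)^{-1} B^T D
        B = D @ C.T @ np.linalg.inv(C @ C.T)   # M-step: B = D C^T (C C^T)^{-1}
    Q, _ = np.linalg.qr(B)                     # orthonormal basis of the fitted subspace
    return Q
```

In the zero-noise limit this alternation is exactly the two steps on the slide, and the returned basis spans the same subspace as the top-$k$ principal components.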
12
RPCA
  • Xu and Yuille (1995) try to minimize
    $\sum_{i=1}^{n} \left[ V_i \|d_i - B c_i\|^2 + \eta (1 - V_i) \right]$,
    where $V_i \in \{0, 1\}$ flags each sample as inlier or outlier
  • Hard to solve (mixed continuous/discrete optimization)

13
RPCA
  • Gabriel and Zamir (1979) try to minimize
    $\sum_{i=1}^{n} \sum_{p=1}^{d} w_{pi}\,(d_{pi} - \mathbf{b}_p \mathbf{c}_i)^2$,
    with $\mathbf{b}_p$ the $p$-th row of $B$
  • Impractical for high dimensions (see the alternating sketch below)
  • "Lower rank approximation of matrices by least
    squares with any choice of weights" (1979)

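A minimal sketch of the alternating weighted least squares this criterion suggests; the function name and loop structure are my own illustration, not Gabriel and Zamir's algorithm. The per-column and per-row solves are what make it expensive when the dimension $d$ is large.

```python
import numpy as np

def weighted_low_rank(D, W, k, n_iter=30, seed=0):
    """Alternating weighted least squares for
    min_{B,C} sum_{p,i} W[p,i] * (D[p,i] - (B @ C)[p,i])**2.

    D, W : (d, n) data and nonnegative weights; k : target rank.
    """
    rng = np.random.default_rng(seed)
    d, n = D.shape
    B = rng.standard_normal((d, k))
    C = np.zeros((k, n))
    for _ in range(n_iter):
        for i in range(n):                  # fix B, solve each column c_i
            Wi = W[:, i]
            A = B.T @ (B * Wi[:, None])     # B^T diag(W_i) B
            C[:, i] = np.linalg.solve(A, B.T @ (Wi * D[:, i]))
        for p in range(d):                  # fix C, solve each row b_p
            Wp = W[p, :]
            A = C @ (C.T * Wp[:, None])     # C diag(W_p) C^T
            B[p, :] = np.linalg.solve(A, C @ (Wp * D[p, :]))
    return B, C
```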
14
RPCA
  • The idea is to use a robust function $\rho$
  • Geman-McClure: $\rho(x, \sigma) = x^2 / (x^2 + \sigma^2)$
  • Minimize $\sum_{i=1}^{n} \sum_{p=1}^{d} \rho\!\left(d_{pi} - \mu_p - \sum_{j=1}^{k} b_{pj} c_{ji},\; \sigma_p\right)$
  • Approximated by a local quadratic function
  • Use gradient descent (see the sketch below)
  • The rest is nothing but heuristics

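A toy sketch of minimizing this robust error with plain gradient descent, assuming a single scalar scale `s` for simplicity; the actual paper uses per-row scales $\sigma_p$, a local quadratic approximation, and annealing heuristics, all of which this sketch omits. All names here are mine.

```python
import numpy as np

def psi(x, s):
    # Derivative of the Geman-McClure rho(x, s) = x^2 / (x^2 + s^2)
    return 2.0 * x * s**2 / (x**2 + s**2) ** 2

def robust_pca_gd(D, k, s=1.0, lr=1e-2, n_iter=1000, seed=0):
    """Gradient descent on sum_{p,i} rho(d_pi - mu_p - sum_j b_pj c_ji, s)."""
    rng = np.random.default_rng(seed)
    d, n = D.shape
    mu = np.median(D, axis=1, keepdims=True)   # robust initial mean
    B = rng.standard_normal((d, k)) * 0.01
    C = rng.standard_normal((k, n)) * 0.01
    for _ in range(n_iter):
        E = D - mu - B @ C                     # residuals
        G = psi(E, s)                          # per-entry influence
        B, C, mu = (B + lr * G @ C.T,          # all gradients use the same G
                    C + lr * B.T @ G,
                    mu + lr * G.sum(axis=1, keepdims=True))
    return mu, B, C
```

Because $\psi$ saturates for large residuals, gross outliers contribute almost nothing to the updates, which is what distinguishes this from the least-squares fit.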
15
RPCA
16
Robust PCA - Experiment
  • 256 training images (120x160)
  • Obtain 20 RPCA basis vectors
  • 3 hrs on a 900 MHz Pentium III in Matlab

17
Outline
  • Robust PCA
  • Generalized PCA
  • Clustering points on a line
  • Clustering lines on a plane
  • Clustering hyperplanes in a space

18
Generalized PCA
  • Generalized Principal Component Analysis
  • René Vidal
  • Yi Ma
  • Shankar Sastry
  • UC Berkeley and UIUC

19
GPCA
20
GPCA Example 1
21
GPCA Example 2
22
GPCA Example 3
23
GPCA Goals
  1. Number of subspaces and their dimensions
  2. A basis for each subspace
  3. Segmentation of the data

24
GPCA Ideas
  • A union of subspaces is the zero set of certain polynomials

25
Outline
  • Robust PCA
  • Generalized PCA
  • Clustering points on a line
  • Clustering lines on a plane
  • Clustering hyperplanes in a space

26
GPCA 1D Case
27
GPCA 1D Case Contd
28
GPCA 1D Case Contd
$M_n = n + 1$ unknowns (the coefficients of a degree-$n$ polynomial)
To have a unique solution, $\mathrm{rank}(V_n) = n = M_n - 1$
29
GPCA 1D Example
  • $n = 2$ groups
  • $p_n(x) = (x - \mu_1)(x - \mu_2)$
  • No polynomial of degree 1 vanishes on all the data
  • Infinitely many polynomials of degree 3 do
  • $p_n(x) = x^2 + c_1 x + c_2$ $\Rightarrow$ factor the polynomial to
    recover the $\mu_j$ (a numpy sketch follows below)

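A small numpy sketch of the whole 1-D pipeline, with the rank test from the previous slide used to pick $n$; the helper names `estimate_n` and `gpca_1d` and the tolerance are my own choices.

```python
import numpy as np

def estimate_n(x, max_n=5, tol=1e-2):
    # Smallest degree j at which V_j drops rank: rank(V_j) = j = M_j - 1.
    # The tolerance must absorb noise in the data.
    for j in range(1, max_n + 1):
        s = np.linalg.svd(np.vander(x, j + 1), compute_uv=False)
        if s[-1] < tol * s[0]:
            return j
    return max_n

def gpca_1d(x, n):
    """Cluster 1-D points into n groups via p_n(x) = prod_j (x - mu_j)."""
    V = np.vander(x, n + 1)               # rows [x^n, ..., x, 1]
    _, _, Vt = np.linalg.svd(V)
    c = Vt[-1]                            # null vector = coefficients of p_n
    mu = np.roots(c).real                 # group centers = roots of p_n
    labels = np.argmin(np.abs(x[:, None] - mu[None, :]), axis=1)
    return mu, labels

# Example: n = 2 groups around 0 and 5
rng = np.random.default_rng(0)
x = np.concatenate([0.05 * rng.standard_normal(20),
                    5 + 0.05 * rng.standard_normal(20)])
mu, labels = gpca_1d(x, estimate_n(x))
```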
30
Outline
  • Robust PCA
  • Generalized PCA
  • Clustering points on a line
  • Clustering lines on a plane
  • Clustering hyperplanes in a space

31
GPCA 2D Case
  • $L_j = \{\, \mathbf{x} = [x, y]^T : b_{j1} x + b_{j2} y = 0 \,\}$
  • $(b_{11} x + b_{12} y = 0)$ or $(b_{21} x + b_{22} y = 0)$

32
GPCA 2D Case Contd
  • $(b_{11} x + b_{12} y = 0)$ or $(b_{21} x + b_{22} y = 0)$
  • $p_n(\mathbf{x}) = (b_{11} x + b_{12} y) \cdots (b_{n1} x + b_{n2} y) = 0$
  • $= \sum_k c_k \, x^{n-k} y^k$

33
GPCA 2D Case Contd
  • Take $n = 2$ for example
  • $p_2(\mathbf{x}) = (b_{11} x + b_{12} y)(b_{21} x + b_{22} y)$
  • $\nabla p_2(\mathbf{x}) = (b_{21} x + b_{22} y)\, \mathbf{b}_1 + (b_{11} x + b_{12} y)\, \mathbf{b}_2$,
    where $\mathbf{b}_j = [b_{j1}, b_{j2}]^T$
  • If $\mathbf{x} \in L_1$, then $\nabla p_2(\mathbf{x}) \propto \mathbf{b}_1$; otherwise $\propto \mathbf{b}_2$

34
GPCA 2D Case Contd
  • Given a point $y_j \in L_j$, the normal vector of $L_j$ is
    $\mathbf{b}_j \sim \nabla p_n(y_j)$
  • Three things to do (all three appear in the sketch below):
  • Determine $n$ as $\min\{\, j : \mathrm{rank}(V_j) = j \,\}$
  • Solve $V_n \mathbf{c}_n = 0$ for $\mathbf{c}_n$
  • Find the normal vectors $\mathbf{b}_j$

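Putting the steps together for lines through the origin in the plane: a minimal numpy sketch (the function name and the greedy choice of representative points are mine) that fits $p_n$ from the nullspace of the Veronese embedding and reads the normals $\mathbf{b}_j$ off $\nabla p_n$.

```python
import numpy as np

def gpca_lines(X, n):
    """Segment 2-D points drawn from n lines through the origin.

    Fit p_n(x, y) = sum_k c_k x^{n-k} y^k from the nullspace of the
    degree-n Veronese embedding, then read each line's normal b_j off
    the gradient of p_n at one representative point per line.
    """
    x, y = X[:, 0].astype(float), X[:, 1].astype(float)
    V = np.stack([x ** (n - k) * y ** k for k in range(n + 1)], axis=1)
    _, _, Vt = np.linalg.svd(V)
    c = Vt[-1]                                   # coefficients c_k of p_n

    # Gradient of p_n at every data point
    gx = np.stack([(n - k) * x ** max(n - k - 1, 0) * y ** k
                   for k in range(n + 1)], axis=1) @ c
    gy = np.stack([k * x ** (n - k) * y ** max(k - 1, 0)
                   for k in range(n + 1)], axis=1) @ c
    G = np.stack([gx, gy], axis=1)
    G /= np.linalg.norm(G, axis=1, keepdims=True) + 1e-12   # unit normals

    # One representative normal per line: start far from the origin, then
    # keep taking the point farthest (in |b^T x|) from the lines found so far
    B = [G[np.argmax(np.linalg.norm(X, axis=1))]]
    for _ in range(1, n):
        dist = np.min(np.abs(X @ np.array(B).T), axis=1)
        B.append(G[np.argmax(dist)])
    B = np.array(B)                              # (n, 2) normals b_j
    labels = np.argmin(np.abs(X @ B.T), axis=1)  # nearest line wins
    return B, labels
```

For a point on $L_1$, the factor $b_1^T \mathbf{x}$ vanishes, so the gradient collapses to a multiple of $\mathbf{b}_1$, which is why reading normals off $\nabla p_n$ works.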
35
Outline
  • Robust PCA
  • Generalized PCA
  • Clustering points on a line
  • Clustering lines on a plane
  • Clustering hyperplanes in a space

36
GPCA Hyperplanes
  • Still assume $d_1 = \cdots = d_n = d = D - 1$
  • $S_j = \{\, \mathbf{x} : \mathbf{b}_j^T \mathbf{x} = b_{j1} x_1 + b_{j2} x_2 + \cdots + b_{jD} x_D = 0 \,\}$

37
GPCA Hyperplanes
$M_n = \binom{D + n - 1}{D - 1}$ (the number of degree-$n$ monomials in $D$ variables)
38
GPCA Hyperplanes
39
GPCA Hyperplanes
  • Since we know $n$, we can solve for the coefficients $c_k$
  • $c_k \Rightarrow \mathbf{b}_k$ via $\nabla p_n(\mathbf{x})$
  • If we know a point $y_j$ on each $S_j$, finding $\mathbf{b}_j$ will be easy

40
GPCA Hyperplanes
  • One point $y_j$ on each hyperplane $S_j$
  • Consider a random line $L = \{\, t \mathbf{v} + \mathbf{x}_0 \,\}$
  • Obtain $y_j$ by intersecting $L$ and $S_j$
  • $y_j = t_j \mathbf{v} + \mathbf{x}_0$
  • Find the roots $t_j$ of $p_n(t \mathbf{v} + \mathbf{x}_0) = 0$

41
GPCA Hyperplanes
  • To summarize (a sketch of the random-line trick follows below):
  • We want to find $n$, then solve for $\mathbf{c}$
  • To get $\mathbf{b}_j$ (the normal) of each $S_j$, compute $\nabla p_n(\mathbf{x})$
  • To get the label $j$, solve $p_n(t \mathbf{v} + \mathbf{x}_0) = 0$ for the roots $t_j$,
    giving $y_j = t_j \mathbf{v} + \mathbf{x}_0$

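A sketch of the random-line trick, assuming $p_n$ is available as a callable (e.g., built from the estimated coefficients $\mathbf{c}$). Restricted to the line, $p_n$ becomes a univariate degree-$n$ polynomial, which $n + 1$ samples pin down exactly; its real roots give one point per hyperplane. All names here are illustrative.

```python
import numpy as np

def points_on_hyperplanes(p_n, n, D, seed=0):
    """One point y_j on each of the n hyperplanes in {x : p_n(x) = 0}.

    p_n : callable evaluating the degree-n polynomial on R^D.
    """
    rng = np.random.default_rng(seed)
    v, x0 = rng.standard_normal(D), rng.standard_normal(D)
    ts = np.linspace(-1.0, 1.0, n + 1)
    q = np.polyfit(ts, [p_n(t * v + x0) for t in ts], n)  # coeffs of q(t) = p_n(t v + x0)
    return [t.real * v + x0 for t in np.roots(q) if abs(t.imag) < 1e-8]

# Example: two planes in R^3 with (illustrative) normals b1, b2
b1, b2 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 1.0])
p2 = lambda x: (b1 @ x) * (b2 @ x)
for y in points_on_hyperplanes(p2, n=2, D=3):
    print(y, p2(y))          # p2(y) ~ 0: y lies on one of the planes
```

Evaluating $\nabla p_n$ at each returned $y_j$ then gives the normal $\mathbf{b}_j$, exactly as in the line-clustering case.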
42
(No Transcript)
43
  • One More Thing

44
One More Thing
  • Previously we assumed $d_1 = \cdots = d_n = D - 1$
  • In general we cannot assume that
  • Please read Sections 4.2 and 4.3 on your own
  • They discuss how to recursively reduce the dimension

45
(No Transcript)