Transcript and Presenter's Notes

Title: Image Classification using Sparse Coding: Advanced Topics


1
Part 3: Image Classification using Sparse Coding: Advanced Topics
Kai Yu, Dept. of Media Analytics, NEC Laboratories America
Andrew Ng, Computer Science Dept., Stanford University
2
Outline of Part 3
  • Why can sparse coding learn good features?
  • Intuition, topic model view, and geometric view
  • A theoretical framework: local coordinate coding
  • Two practical coding methods
  • Recent advances in sparse coding for image
    classification

3
Outline of Part 3
  • Why can sparse coding learn good features?
  • Intuition, topic model view, and geometric view
  • A theoretical framework: local coordinate coding
  • Two practical coding methods
  • Recent advances in sparse coding for image
    classification

4
Intuition: why does sparse coding help classification?
Figure from http://www.dtreg.com/svm.htm
  • The coding is a nonlinear feature mapping
  • It represents data in a higher-dimensional space
  • Sparsity makes prominent patterns more distinctive
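For reference, this is the standard sparse coding objective assumed throughout this part (a sketch in the notation of Lee et al., NIPS 2007; the lambda swept in the MNIST slides below is the sparsity weight here):

    \min_{B,\, \{a_i\}} \; \sum_{i} \Big\| x_i - \sum_{k=1}^{K} a_{i,k}\, b_k \Big\|_2^2 \;+\; \lambda \sum_{i} \| a_i \|_1
    \quad \text{s.t.} \quad \| b_k \|_2 \le 1,\; k = 1, \ldots, K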

5
A topic-model view of sparse coding
  • Each basis is a direction or a topic.
  • Sparsity: each datum is a linear combination of only a few bases.
  • Applicable to image denoising, inpainting, and super-resolution.

6
A geometric view of sparse coding
[Figure: data manifold with anchor points]
  • Each basis is somewhat like a pseudo data point, an anchor point.
  • Sparsity: each datum is a sparse combination of neighboring anchors.
  • The coding scheme exploits the manifold structure of the data.

7
MNIST Experiment: Classification using Sparse Coding (SC)
  • 60K training images, 10K test images
  • Dictionary size k = 512
  • Linear SVM on the sparse codes
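A minimal sketch of this pipeline (not the authors' code; it uses scikit-learn on raw pixels purely for illustration, and the exact solver and sparsity weight used on the slides are not specified here):

    # Sketch: learn a 512-atom dictionary, sparse-code each image, train a linear SVM.
    # Illustrative only; this is slow at full MNIST scale.
    from sklearn.datasets import fetch_openml
    from sklearn.decomposition import MiniBatchDictionaryLearning, SparseCoder
    from sklearn.svm import LinearSVC

    X, y = fetch_openml("mnist_784", version=1, return_X_y=True, as_frame=False)
    X = X / 255.0                                  # scale pixels to [0, 1]
    X_train, y_train = X[:60000], y[:60000]
    X_test,  y_test  = X[60000:], y[60000:]

    # Step 1: learn the dictionary (k = 512) from unlabeled training images.
    dico = MiniBatchDictionaryLearning(n_components=512, alpha=0.05, batch_size=256)
    D = dico.fit(X_train).components_

    # Step 2: sparse-code both splits with the learned dictionary.
    coder = SparseCoder(dictionary=D, transform_algorithm="lasso_lars",
                        transform_alpha=0.05)
    A_train, A_test = coder.transform(X_train), coder.transform(X_test)

    # Step 3: linear SVM on the sparse codes.
    clf = LinearSVC().fit(A_train, y_train)
    print("test error:", 1.0 - clf.score(A_test, y_test))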

8
MNIST Experiment: Lambda = 0.0005
Each basis is like a part or direction.
9
MNIST Experiment: Lambda = 0.005
Again, each basis is like a part or direction.
10
MNIST Experiment: Lambda = 0.05
Now, each basis is more like a digit!
11
MNIST Experiment: Lambda = 0.5
Like clustering now!
12
Geometric view of sparse coding
[Figure: learned bases under three settings, with classification errors 4.54, 3.75, and 2.64]
  • When SC achieves the best classification accuracy, the learned bases are like digits: each basis has a clear local class association.
  • Implication: exploring data geometry may be useful for classification.

13
Distribution of coefficients (MNIST)
Neighboring bases tend to get nonzero coefficients.
14
Distribution of coefficients (SIFT, Caltech-101)
Similar observation here!
15
Recap: two different views of sparse coding
  • View 1: discover topic components
    • Each basis is a direction
    • Sparsity: each datum is a linear combination of several bases.
    • Related to topic models
  • View 2: geometric structure of the data manifold
    • Each basis is an anchor point
    • Sparsity: each datum is a linear combination of neighboring anchors.
    • Somewhat like a soft VQ (links to BoW)
  • Either view can be valid for sparse coding under certain circumstances.
  • View 2 seems to be more helpful for classifying sensory data.

16
Outline of Part 3
  • Why can sparse coding learn good features?
  • Intuition, topic model view, and geometric view
  • A theoretical framework: local coordinate coding
  • Two practical coding methods
  • Recent advances in sparse coding for image
    classification

17
Key theoretical question
  • Why can unsupervised feature learning via sparse coding help classification?

18
The image classification setting for analysis
Implication: learning an image classifier is a matter of learning nonlinear functions on patches.
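Written out (a sketch consistent with the super-vector coding formulation later in this part, not copied from the slide), the image-level decision function is an average of a nonlinear function f over the image's local descriptors:

    F(I) \;=\; \frac{1}{|I|} \sum_{z \in I} f(z)

so learning the classifier F reduces to learning the per-patch function f.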
19
Illustration: nonlinear learning via local coding
[Figure: data points and bases (anchor points) for local coding]
20
How to learn a nonlinear function?
Step 1: Learning the dictionary from unlabeled data
21
How to learn a nonlinear function?
Step 2: Use the dictionary to encode data
22
How to learn a nonlinear function?
Step 3: Estimate parameters of a linear model on the sparse codes of the data
  • Nonlinear local learning via learning a global linear function.
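In symbols (a sketch of what the three steps amount to): with a(x) the sparse code of a patch x from Step 2, Step 3 fits a global linear model on the codes,

    \hat{f}(x) \;=\; w^{\top} a(x) \;=\; \sum_{k=1}^{K} w_k\, a_k(x),

which is linear in a(x) but nonlinear in x, hence "nonlinear local learning via a global linear function".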

23
Local Coordinate Coding (LCC): connecting coding to nonlinear function learning
Yu et al., NIPS 2009
If f(x) is (alpha, beta)-Lipschitz smooth, the function approximation error is bounded by a coding error term plus a locality term (written out below).
The key message: a good coding scheme should (1) have a small coding error, and (2) also be sufficiently local.
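The bound behind this slide, as recalled from Yu, Zhang & Gong (NIPS 2009) and stated here only as a sketch (see the paper for the exact constants and conditions): for a coding gamma(x) on anchor set C and an (alpha, beta)-Lipschitz smooth f,

    \underbrace{\Big| f(x) - \sum_{v \in C} \gamma_v(x)\, f(v) \Big|}_{\text{function approximation error}}
    \;\le\;
    \alpha \underbrace{\Big\| x - \sum_{v \in C} \gamma_v(x)\, v \Big\|}_{\text{coding error}}
    \;+\;
    \beta \underbrace{\sum_{v \in C} |\gamma_v(x)|\, \Big\| v - \sum_{v' \in C} \gamma_{v'}(x)\, v' \Big\|^2}_{\text{locality term}}

A linear function of the codes can therefore approximate f well exactly when both terms on the right are small.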
24
Outline of Part 3
  • Why can sparse coding learn good features?
  • Intuition, topic model view, and geometric view
  • A theoretical framework: local coordinate coding
  • Two practical coding methods
  • Recent advances in sparse coding for image
    classification

25
Application of the LCC theory
  • Fast implementation with a large dictionary (Wang et al., CVPR 2010)
  • A simple geometric way to improve BoW (Zhou et al., ECCV 2010)
26
Application of the LCC theory
  • Fast implementation with a large dictionary
  • A simple geometric way to improve BoW

27
The larger the dictionary, the higher the accuracy, but also the higher the computation cost
Yu et al., NIPS 2009
Yang et al., CVPR 2009
The same observation holds for Caltech-256, PASCAL, ImageNet, ...
28
Locality-constrained linear coding: a fast implementation of LCC
Wang et al., CVPR 2010
  • Dictionary learning: k-means (or hierarchical k-means)
  • Coding for x:
  • Step 1 (ensure locality): find the K nearest bases
  • Step 2 (ensure low coding error): reconstruct x from those bases by constrained least squares
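A minimal sketch of the approximated LLC coding step for a single descriptor (my reading of the two steps above, following Wang et al., CVPR 2010; not the authors' released code, and the regularization constant is illustrative):

    import numpy as np

    def llc_code(x, B, knn=5, reg=1e-4):
        """Approximated LLC coding of one descriptor x (d,) given a
        dictionary B of shape (K, d)."""
        # Step 1 (locality): pick the knn nearest bases.
        dists = np.sum((B - x) ** 2, axis=1)
        idx = np.argsort(dists)[:knn]
        Bi = B[idx]                               # (knn, d)

        # Step 2 (low coding error): least squares with codes summing to 1.
        z = Bi - x                                # shift bases to the descriptor
        C = z @ z.T                               # local covariance (knn, knn)
        C += reg * np.trace(C) * np.eye(knn)      # regularize for numerical stability
        w = np.linalg.solve(C, np.ones(knn))
        w /= w.sum()                              # enforce the sum-to-one constraint

        code = np.zeros(B.shape[0])
        code[idx] = w                             # nonzero only on the knn selected bases
        return code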

29
Competitive in accuracy, cheap in computation
[Figure: classification results; LLC is comparable with, or significantly better than, sparse coding]
This is one of the two major algorithms applied by the NEC-UIUC team to achieve the No. 1 position in the ImageNet challenge 2010!
Wang et al., CVPR 2010
30
Application of the LCC theory
  • Fast implementation with a large dictionary
  • A simple geometric way to improve BoW

31
Interpreting BoW + linear classifier
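The interpretation this slide illustrates can be written as follows (a sketch in the spirit of Zhou et al., ECCV 2010): let gamma(x) be the 1-of-K VQ code of patch x. A linear classifier on the BoW histogram acts, per patch, as

    f(x) \;\approx\; \sum_{k=1}^{K} \gamma_k(x)\, w_k \;=\; w^{\top} \gamma(x),

i.e. a piecewise-constant approximation of f that takes the value w_k on the whole Voronoi cell of codeword v_k. Super-vector coding, next, improves this to a piecewise-linear approximation.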
32
Super-vector coding: a simple geometric way to improve BoW (VQ)
Zhou et al., ECCV 2010
33
Super-vector coding: a simple geometric way to improve BoW (VQ)
If f(x) is beta-Lipschitz smooth, the function approximation error is bounded in terms of the quantization error (see the sketch below).
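A sketch of the bound this slide refers to (recalled from Zhou et al., ECCV 2010; the exact constant may differ): if f is beta-Lipschitz smooth and v(x) is the codeword x is quantized to, the local linear expansion g(x) = f(v(x)) + \nabla f(v(x))^{\top}(x - v(x)) satisfies

    \underbrace{\big| f(x) - g(x) \big|}_{\text{function approximation error}}
    \;\le\;
    \frac{\beta}{2}\, \underbrace{\big\| x - v(x) \big\|^2}_{\text{(squared) quantization error}}

so a good codebook keeps the quantization error, and hence the approximation error, small.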
34
Super-vector coding: learning a nonlinear function via a global linear model
Let gamma(x) be the VQ coding of x (the 1-of-K indicator of its nearest codeword).
This is one of the two major algorithms applied by the NEC-UIUC team to achieve the No. 1 position in PASCAL VOC 2009!
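A minimal sketch of how one descriptor is encoded under this scheme (an assumed form following Zhou et al., ECCV 2010; the constant s and the hard VQ assignment are simplifications):

    import numpy as np

    def super_vector(x, V, s=1.0):
        """Super-vector code of one descriptor x (d,) with codebook V (K, d).
        The nearest codeword's block gets [s, x - V[k]]; all other blocks are 0."""
        K, d = V.shape
        k = np.argmin(np.sum((V - x) ** 2, axis=1))          # hard VQ assignment
        phi = np.zeros(K * (d + 1))
        phi[k * (d + 1)] = s                                  # constant entry, carries f(v_k)
        phi[k * (d + 1) + 1:(k + 1) * (d + 1)] = x - V[k]     # carries the gradient term
        return phi

A global linear model w on phi(x) then realizes f(v_k) + g_k^T (x - v_k) within each VQ cell, i.e. exactly the piecewise-linear approximation sketched above.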
35
Summary of Geometric Coding Methods
  • All lead to higher-dimensional, sparse, and
    localized coding
  • All explore geometric structure of data
  • New coding methods are suitable for linear
    classifiers.
  • Their implementations are quite straightforward.

36
Things not covered here
  • Improved LCC using local tangents, Yu & Zhang, ICML 2010
  • Mixture of sparse coding, Yang et al., ECCV 2010
  • Deep coding network, Lin et al., NIPS 2010
  • Pooling methods
  • Max-pooling works well in practice, but appears to be ad hoc.
  • An interesting analysis of max-pooling: Boureau et al., ICML 2010
  • We are working on a linear pooling method with a similar effect to max-pooling; some preliminary results are already in the super-vector coding paper (Zhou et al., ECCV 2010).

37
Outline of Part 3
  • Why can sparse coding learn good features?
  • Intuition, topic model view, and geometric view
  • A theoretical framework: local coordinate coding
  • Two practical coding methods
  • Recent advances in sparse coding for image
    classification

38
Fast approximation of sparse coding via neural networks
Gregor & LeCun, ICML 2010
  • The method aims to improve the speed of sparse coding at coding time (not training time), potentially making sparse coding practical for video.
  • Idea: given a trained sparse coding model, use its inputs and outputs as training data to train a feed-forward model.
  • They showed a roughly 20x speedup, but did not evaluate on real video data.
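A minimal sketch of that idea (not the LISTA architecture from the paper, just the "imitate the coder with a feed-forward net" scheme; `coder` is a hypothetical object with a `.transform` method, e.g. the SparseCoder from the MNIST sketch earlier):

    from sklearn.neural_network import MLPRegressor

    def train_fast_encoder(coder, X_unlabeled):
        """Fit a feed-forward regressor that imitates a trained sparse coder."""
        A = coder.transform(X_unlabeled)                  # "teacher" sparse codes
        net = MLPRegressor(hidden_layer_sizes=(1024,), max_iter=200)
        net.fit(X_unlabeled, A)                           # feed-forward approximation
        return net                                        # net.predict(x) ~ sparse code of x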

39
Group sparse coding
Bengio et al., NIPS 2009
  • Sparse coding is applied to patches, so the image-level representation is unlikely to be sparse.
  • Idea: enforce joint sparsity via an L1/L2 norm on the sparse codes of a group of patches.
  • The resulting image representation becomes sparse, which saves memory, but classification accuracy decreases.
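In symbols (a sketch of the mixed-norm idea, not the exact objective of Bengio et al., NIPS 2009): stacking the codes of the m patches in a group (e.g. one image) as the rows of A in R^{m x K}, joint sparsity is encouraged by the L1/L2 penalty

    \Omega(A) \;=\; \sum_{k=1}^{K} \| A_{\cdot,k} \|_2 \;=\; \sum_{k=1}^{K} \sqrt{\textstyle\sum_{j=1}^{m} A_{j,k}^2},

which pushes entire dictionary columns to zero across all patches of the group, so the pooled image-level representation becomes sparse.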

40
Learning a hierarchical dictionary
Jenatton, Mairal, Obozinski, and Bach, 2010
A node can be active only if its ancestors are
active.
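A sketch of the tree-structured regularizer behind this statement (in the spirit of Jenatton, Mairal, Obozinski & Bach, 2010): each group g in G consists of a dictionary node together with all of its descendants in the tree, and the penalty is

    \Omega(a) \;=\; \sum_{g \in \mathcal{G}} w_g\, \| a_g \|_2 .

Zeroing a group switches off a whole subtree at once, so the nonzero coefficients always form a rooted subtree: a node can be active only if its ancestors are.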
41
References
  1. Image Classification using Super-Vector Coding of Local Image Descriptors. Xi Zhou, Kai Yu, Tong Zhang, and Thomas Huang. In ECCV 2010.
  2. Efficient Highly Over-Complete Sparse Coding Using a Mixture Model. Jianchao Yang, Kai Yu, and Thomas Huang. In ECCV 2010.
  3. Learning Fast Approximations of Sparse Coding. Karol Gregor and Yann LeCun. In ICML 2010.
  4. Improved Local Coordinate Coding Using Local Tangents. Kai Yu and Tong Zhang. In ICML 2010.
  5. Sparse Coding and Dictionary Learning for Image Analysis. Francis Bach, Julien Mairal, Jean Ponce, and Guillermo Sapiro. CVPR 2010 Tutorial.
  6. Supervised Translation-Invariant Sparse Coding. Jianchao Yang, Kai Yu, and Thomas Huang. In CVPR 2010.
  7. Locality-Constrained Linear Coding for Image Classification. Jinjun Wang, Jianchao Yang, Kai Yu, Fengjun Lv, Thomas Huang, and Yihong Gong. In CVPR 2010.
  8. Group Sparse Coding. Samy Bengio, Fernando Pereira, Yoram Singer, and Dennis Strelow. In NIPS 2009.
  9. Nonlinear Learning Using Local Coordinate Coding. Kai Yu, Tong Zhang, and Yihong Gong. In NIPS 2009.
  10. Linear Spatial Pyramid Matching Using Sparse Coding for Image Classification. Jianchao Yang, Kai Yu, Yihong Gong, and Thomas Huang. In CVPR 2009.
  11. Efficient Sparse Coding Algorithms. Honglak Lee, Alexis Battle, Rajat Raina, and Andrew Y. Ng. In NIPS 2007.