On Optimal Criteria of Linear Projections for Classification - PowerPoint PPT Presentation

Transcript and Presenter's Notes

1
On Optimal Criteria of Linear Projections for Classification
John Q. Gan
Department of Computer Science, University of Essex, Colchester CO4 3SQ, UK
E-mail: jqgan@essex.ac.uk
2
1. Introduction
• Optimal linear projections for feature extraction and dimensionality reduction
  – For data without labels
  – For data with labels
• Questions
  – How to select optimal criteria for classification purposes?
  – How to do feature extraction and selection simultaneously for classification?

3
An old problem:
R.A. Fisher, "The use of multiple measurements in taxonomic problems," Annals of Eugenics, vol. 7, pp. 179-188, 1936.

With the need for new solutions (new optimal criteria):
Y. Koren and L. Carmel, "Robust linear dimensionality reduction," IEEE Trans. on Visualization and Computer Graphics, vol. 10, no. 4, pp. 459-470, 2004.
X.-Y. Jing, D. Zhang, and Y.Y. Tang, "An improved LDA approach," IEEE Trans. on Systems, Man, and Cybernetics Part B, vol. 34, no. 5, pp. 1942-1951, 2004.
4
2. PCA vs. LDA
• PCA: maximum variance of the whole projected data (unsupervised)
• LDA: maximum ratio of between-class variance to within-class variance of the projected data (supervised)

Different criteria result in different projection axes.
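To make "different criteria result in different projection axes" concrete, here is a small NumPy sketch on synthetic two-class data (the data, dimensions, and seed are my own assumptions, not from the presentation): PCA picks the high-variance axis, while Fisher's two-class LDA direction, proportional to Sw^-1 (m1 - m2), picks the discriminative axis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two elongated Gaussian classes: large within-class spread along x,
# class separation along y (illustrative data only).
X1 = rng.normal([0, 0], [3.0, 0.3], size=(200, 2))
X2 = rng.normal([0, 2], [3.0, 0.3], size=(200, 2))
X = np.vstack([X1, X2])

# PCA: leading eigenvector of the total covariance (ignores labels).
evals, evecs = np.linalg.eigh(np.cov(X.T))
w_pca = evecs[:, np.argmax(evals)]

# Two-class LDA: w proportional to Sw^-1 (m1 - m2).
m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
Sw = np.cov(X1.T) * (len(X1) - 1) + np.cov(X2.T) * (len(X2) - 1)
w_lda = np.linalg.solve(Sw, m1 - m2)
w_lda /= np.linalg.norm(w_lda)

# PCA follows the high-variance x axis; LDA follows the class-separating y axis.
print(np.argmax(np.abs(w_pca)))  # 0 (x axis)
print(np.argmax(np.abs(w_lda)))  # 1 (y axis)
```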
5
3. An Interpretation of LDA as Supervised PCA
If Sw is non-singular, the LDA projection axes are the leading eigenvectors of Sw^-1 Sb.
Limitation: the number of LDA features, L_LDA, must be less than C (the number of classes).
Refer to: A.M. Martinez and A.C. Kak, "PCA versus LDA," IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 23, no. 2, pp. 228-233, 2001.
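The limitation L_LDA < C follows from the rank of the between-class scatter matrix: Sb is a sum of C rank-one terms whose weighted mean deviations sum to zero, so rank(Sb) <= C - 1. A minimal sketch (class counts, dimensions, and data are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
C, d, n = 3, 10, 50  # 3 classes, 10-dim data, 50 samples per class

# Random class means and samples (illustrative, not from the paper).
means = rng.normal(size=(C, d))
Xs = [rng.normal(m, 1.0, size=(n, d)) for m in means]

# Between-class scatter: Sb = sum_c n_c (m_c - m)(m_c - m)^T.
m_all = np.vstack(Xs).mean(axis=0)
Sb = sum(n * np.outer(x.mean(0) - m_all, x.mean(0) - m_all) for x in Xs)

# The C deviations (m_c - m) are linearly dependent (their weighted sum
# is zero), so Sb has at most C - 1 nonzero eigenvalues: LDA can extract
# at most C - 1 projection axes.
print(np.linalg.matrix_rank(Sb))  # 2 (= C - 1)
```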
6
4. Another Supervised PCA Related to LDA
Disadvantages: large memory requirement and heavy computational load when N (the number of samples) is large.
Refer to: Y. Koren and L. Carmel, "Robust linear dimensionality reduction," IEEE Trans. on Visualization and Computer Graphics, vol. 10, no. 4, pp. 459-470, 2004.
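One way to read this family of methods is as a weighted PCA over all N(N-1)/2 sample pairs, which is where the memory and computation costs come from as N grows. The sketch below is my own simplified rendering with a naive between-class weighting, not the paper's robust weighting scheme: maximising sum_ij W_ij ((x_i - x_j)^T a)^2 reduces to an eigenproblem on X^T L X with a graph-Laplacian-style matrix L.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy labelled data: 60 samples, 4 dimensions, 3 classes (illustrative only).
X = rng.normal(size=(60, 4))
y = np.repeat([0, 1, 2], 20)

# Dense N x N pairwise weights, here simply 1 for between-class pairs.
# Storing and processing this matrix is the bottleneck for large N.
W = (y[:, None] != y[None, :]).astype(float)

# sum_ij W_ij ((x_i - x_j)^T a)^2 = 2 a^T X^T L X a with L = D - W,
# so the optimal axes are the top eigenvectors of X^T L X.
L = np.diag(W.sum(axis=1)) - W
M = X.T @ L @ X
evals, evecs = np.linalg.eigh(M)
A = evecs[:, ::-1][:, :2]  # top-2 projection axes
print(A.shape)  # (4, 2)
```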
7
5. A New Supervised PCA Derived from CSP
CSP (Common Spatial Patterns) is a state-of-the-art method widely used for feature extraction in EEG signal analysis. It can be reformulated (with minor modifications) under the framework of PCA as follows:
WPCA is obtained by an ordinary PCA on the whole data set, without using class labels.
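A sketch of this CSP-as-PCA reading on toy two-class data (channel counts and variances are assumed for illustration): an ordinary PCA/whitening step on the pooled, unlabelled data yields WPCA; a second PCA on one class's covariance in the whitened space then yields filters that maximise one class's variance relative to the other's.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy two-class "EEG" data, samples x channels: class 1 strong on
# channel 0, class 2 strong on channel 3 (illustrative only).
X1 = rng.normal(size=(300, 4)) * np.array([3.0, 1.0, 1.0, 1.0])
X2 = rng.normal(size=(300, 4)) * np.array([1.0, 1.0, 1.0, 3.0])
X = np.vstack([X1, X2])

# Step 1: ordinary PCA / whitening on the pooled data (no labels used).
evals, E = np.linalg.eigh(np.cov(X.T))
Wpca = E @ np.diag(evals ** -0.5)  # whitens the pooled covariance

# Step 2: PCA on class 1's covariance in the whitened space.
C1w = Wpca.T @ np.cov(X1.T) @ Wpca
d, U = np.linalg.eigh(C1w)

# CSP-style filters: large eigenvalues mark directions where class-1
# variance dominates; small ones mark directions where class 2 dominates.
Wcsp = Wpca @ U[:, ::-1]
var1 = np.var(X1 @ Wcsp[:, 0])
var2 = np.var(X2 @ Wcsp[:, 0])
print(var1 > var2)  # True: the first filter favours class 1
```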
8
6. Applications of Supervised PCAs
[Figure: (a) mental state identification; (b) gender identification]
9
Mental state recognition performance (with linear classifiers; data description in the paper):

  Method   Recognition Rate (%)
  LDA      64.5
  SPCA1    62.3
  SPCA2    68.0

Gender identification performance (with Bayes classifiers; data description in the paper):

  Method   Recognition Rate (%)
  LDA      78.7
  SPCA1    74.9
  SPCA2    82.3
10
7. Discussion

  Linear Projection   Optimal Criterion
  PCA     Maximum variance of the whole projected data
  LDA     Maximum ratio of between-class variance to within-class variance of the projected data
  SPCA1   Maximum between-class variance of the projected data (in the simplest case)
  SPCA2   Maximum ratio of the variance of the projected data from one class to the variance of the projected data from the other classes
  ICA     Maximum non-Gaussianity, maximum likelihood, or minimum mutual information
11
How to do feature extraction and selection simultaneously for classification? SPCA2 is a good example.
What is the best linear projection for classification? From my experience, SPCA2 is the best.
Will supervised ICA be a better method, using higher-order statistics as optimal criteria? I am currently working on it.