Title: FACE RECOGNITION, EXPERIMENTS WITH RANDOM PROJECTION
1. FACE RECOGNITION, EXPERIMENTS WITH RANDOM PROJECTION
Navin Goel, Graduate Student
Advisor: Dr. George Bebis, Associate Professor
Department of Computer Science and Engineering, University of Nevada, Reno
2. Overview
- Introduction and Thesis Scope
- Principal Component Analysis
- Method of Eigenfaces
- Random Projection
- Properties of Random Projection
- Random Projection for Face Recognition
- Experimental Procedure and Data sets
- Recognition approaches and results
- Conclusion and Future work
3. Introduction
Problem Statement: Identify a person's face image from a face database.
Applications: Human-computer interfaces, static matching of photographs, video surveillance, biometric security, and image and film processing.
4. Challenges
- Variations in pose: head position (frontal view, profile view, head tilt) and facial expressions
- Illumination changes: light direction and intensity changes, cluttered background, low-quality images
- Camera parameters: resolution, color balance, etc.
- Occlusion: glasses, facial hair, and makeup
5. Thesis Scope
Investigate the application of Random Projection (RP) to face recognition. Evaluate the performance of RP for face recognition under various conditions and assumptions. Aim at proposing an algorithm that replaces the learning step of PCA with a cheaper, more efficient step.
6. Principal Component Analysis (PCA)
For a set of M N-dimensional vectors $x_1, x_2, \ldots, x_M$ (each image stored as a 1-D vector), PCA finds the eigenvalues and eigenvectors of the covariance matrix of the vectors:
$C = \frac{1}{M}\sum_{m=1}^{M}(x_m - \Psi)(x_m - \Psi)^T, \qquad C\,u_k = \lambda_k u_k$
where $\Psi = \frac{1}{M}\sum_{m=1}^{M} x_m$ is the average of the image vectors, $u_k$ are the eigenvectors, and $\lambda_k$ the eigenvalues.
Keep only the k eigenvectors corresponding to the k largest eigenvalues.
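A minimal sketch of this step in Python/NumPy, assuming the training images have already been flattened into the columns of a matrix X (the function name `pca_eigenfaces` and the snapshot remark are illustrative, not taken from the thesis):

```python
import numpy as np

def pca_eigenfaces(X, k):
    """PCA on training images.

    X : (N, M) array, each column is one flattened training image.
    k : number of eigenvectors to keep.
    Returns the mean image and the k eigenvectors with largest eigenvalues.
    """
    mean = X.mean(axis=1, keepdims=True)      # average image (Psi)
    A = X - mean                              # mean-centred data
    # Covariance matrix of the image vectors; for large N the usual
    # "snapshot" trick (eigendecomposition of A.T @ A) would be used instead.
    C = (A @ A.T) / X.shape[1]
    eigvals, eigvecs = np.linalg.eigh(C)      # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1][:k]     # indices of the k largest eigenvalues
    return mean, eigvecs[:, order]
```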
7. Method of Eigenfaces
- Apply PCA on the training dataset.
- Project the gallery set images onto the reduced-dimensional eigenspace.
- For each test set image:
  - Project the image onto the reduced-dimensional eigenspace.
  - Measure similarity by calculating the distance between the projection coefficients of the two sets.
  - The face is recognized if the closest gallery image belongs to the same person as the test image (a sketch of this matching step follows below).
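A hedged sketch of the matching step, assuming `mean` and `basis` come from a PCA step like the one above and that images are stored as columns (the names `project` and `recognize` are illustrative):

```python
import numpy as np

def project(images, mean, basis):
    """Project flattened images (columns) onto the reduced-dimensional space."""
    return basis.T @ (images - mean)

def recognize(probe, gallery, gallery_ids, mean, basis):
    """Return the gallery identity of the closest match for each probe image."""
    g = project(gallery, mean, basis)              # coefficients of the gallery set
    p = project(probe, mean, basis)                # coefficients of the test set
    # Euclidean distance between every probe and every gallery projection.
    dists = np.linalg.norm(p[:, :, None] - g[:, None, :], axis=0)
    return gallery_ids[np.argmin(dists, axis=1)]   # closest gallery image per probe
```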
8. Random Projection (RP)
The original N-dimensional data is projected onto a d-dimensional subspace ($d \ll N$) using a random matrix:
$X'_{d \times M} = R_{d \times N}\,X_{N \times M}$
where $X_{N \times M}$ is the original data and $R_{d \times N}$ is the random matrix, computed using the following steps: each entry of the matrix follows N(0, 1); the d rows of the matrix are then orthogonalized using the Gram-Schmidt algorithm and normalized to unit length.
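A sketch of how such a matrix could be generated with NumPy; QR factorization stands in here for explicit Gram-Schmidt orthogonalization, which is an implementation assumption rather than the thesis code:

```python
import numpy as np

def random_projection_matrix(d, N, seed=None):
    """Build a d x N random projection matrix R (d << N).

    Entries are drawn from N(0, 1); the d rows are then orthonormalized.
    QR factorization is used in place of explicit Gram-Schmidt.
    """
    rng = np.random.default_rng(seed)
    R = rng.standard_normal((d, N))    # i.i.d. Gaussian entries
    Q, _ = np.linalg.qr(R.T)           # orthonormal columns spanning the same subspace
    return Q.T                         # d orthonormal (unit-length) rows
```

Projecting the data is then a single matrix product, e.g. `X_reduced = random_projection_matrix(d, X.shape[0]) @ X` for data X of shape (N, M).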
9. Random Projection: Data Independence
S. Dasgupta. Experiments with Random Projection.
Uncertainty in Artificial Intelligence, 2000.
Random Projection does not depend on the data itself, which makes it applicable to diverse data: digital images, document databases, signal processing.
Two 1-separated spherical Gaussians were projected onto a random space of dimension 20. Error bars show 1 standard deviation, with 40 trials per dimension.
10. Random Projection: Eccentricity
S. Dasgupta. Experiments with Random Projection.
Uncertainty in Artificial Intelligence, 2000.
RP makes highly eccentric Gaussian clusters more spherical.
A Gaussian in a 50-dimensional subspace with eccentricity 1,000 is projected onto lower dimensions. It is conceptually easier to design algorithms for spherical clusters than for ellipsoidal ones.
11. Random Projection: Complexity
E. Bingham and H. Mannila. Random projection in dimensionality reduction: applications to image and text data. Proceedings of the 7th ACM SIGKDD
International Conference on Knowledge Discovery
and Data Mining, pp. 245-250, August 26-29, 2001.
The complexity of RP is quadratic, O(n^2), in contrast to PCA, which is cubic, O(n^3).
Number of floating-point operations needed when reducing the dimensionality of image data using RP, SRP, PCA, and DCT, on a logarithmic scale.
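As a rough, back-of-the-envelope reading of this claim (not a figure from the paper): random projection only needs to generate R and form the product R·X, on the order of d·N·M multiplications, which behaves quadratically when d is fixed and N and M are comparable, whereas PCA must build and eigendecompose the N×N covariance matrix, an O(N^3) operation.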
12. Random Projection: Lower Bound
S. Dasgupta. Experiments with Random Projection.
Uncertainty in Artificial Intelligence, 2000.
What value of d (the dimension of the lower-dimensional space) must be chosen? A 1-separated mixture of k Gaussians of dimension 100 was projected onto d on the order of ln k dimensions. PCA, in contrast, cannot be expected to reduce the dimensionality of k Gaussians below O(k).
13. Random Projection for Face Recognition
- Generate a lower-dimensional random subspace.
- Project the gallery set images onto the reduced-dimensional random space.
- For each test set image:
  - Project the image onto the reduced-dimensional random space.
  - Measure similarity by calculating the distance between the projection coefficients of the two sets.
  - The face is recognized if the closest gallery image belongs to the same person as the test image (see the sketch after this list).
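The pipeline mirrors the eigenfaces method, with the PCA basis replaced by the random matrix R. A compact, self-contained sketch under the same assumptions as before (column-image layout, illustrative names):

```python
import numpy as np

def rp_recognize(probe, gallery, gallery_ids, d, seed=0):
    """Closest-match recognition in a d-dimensional random subspace.

    probe, gallery : (N, P) and (N, G) arrays of flattened images (columns).
    gallery_ids    : subject label of each gallery image.
    """
    rng = np.random.default_rng(seed)
    R = np.linalg.qr(rng.standard_normal((gallery.shape[0], d)))[0].T  # (d, N) orthonormal rows
    g, p = R @ gallery, R @ probe                                      # projection coefficients
    dists = np.linalg.norm(p[:, :, None] - g[:, None, :], axis=0)      # probe-to-gallery distances
    return gallery_ids[np.argmin(dists, axis=1)]                       # identity of nearest gallery image
```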
14. Experimental Procedure
Main steps of the approach
15. Data Sets
Face images from ORL data set for a particular
subject.
Face images from CVL data set for a particular
subject.
Face images from AR data set for a particular
subject.
16. Closest Match Approach
Results are averaged over 5 experiments. Flowchart for calculating the recognition rate using the closest match approach.
17. Closest Match Approach: Majority Voting
Flowchart for calculating the recognition rate using the closest match approach with the majority voting technique.
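A hedged sketch of the majority-voting idea: run closest-match recognition with several independent random matrices (different seeds) and let each ensemble member cast one vote per test image. Integer subject labels are assumed so that `np.bincount` applies; names are illustrative:

```python
import numpy as np

def majority_vote_rp(probe, gallery, gallery_ids, d, seeds):
    """Closest-match recognition combined over several random subspaces by voting.

    probe, gallery : (N, P) and (N, G) arrays of flattened images (columns).
    gallery_ids    : integer subject label of each gallery image.
    """
    votes = []
    N = gallery.shape[0]
    for seed in seeds:
        rng = np.random.default_rng(seed)
        R = np.linalg.qr(rng.standard_normal((N, d)))[0].T        # fresh orthonormal random basis
        g, p = R @ gallery, R @ probe                             # projection coefficients
        dists = np.linalg.norm(p[:, :, None] - g[:, None, :], axis=0)
        votes.append(gallery_ids[np.argmin(dists, axis=1)])       # one vote per test image
    votes = np.stack(votes)                                       # (n_seeds, n_probes)
    # Majority vote: pick the subject predicted most often across the ensembles.
    return np.array([np.bincount(col).argmax() for col in votes.T])
```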
18. Closest Match Approach: Scoring
Flowchart for calculating the recognition rate using the closest match approach with the scoring technique.
19. Results for the ORL database
Experiment on the ORL database using the closest match approach with the majority voting technique, where the training set consists of the same subjects as the gallery and testing sets.
Experiment on the ORL database using the closest match approach with the majority voting technique, where the training set consists of different subjects from those in the gallery and testing sets.
20. Results for the CVL database
Experiment on the CVL database using the closest match approach with the majority voting technique, where the training set consists of the same subjects as the gallery and testing sets.
Experiment on the CVL database using the closest match approach with majority voting, where the training set consists of different subjects from those in the gallery and testing sets.
21. Results for the AR database
Experiment on the AR database using the closest match approach with majority voting, where the training set consists of random subjects and the gallery and test sets contain different combinations.
22. ORL database for Multiple Ensembles
Plot of RCA with the majority-voting technique for 5 and 30 different random seeds, where the training set consists of different subjects from those in the gallery and testing sets.
23. Results for the ORL database with Scoring Technique
Experiment on the ORL database using the closest match approach with scoring, where the training set consists of the same subjects as the gallery and testing sets.
Experiment on the ORL database using the closest match approach with scoring, where the training set consists of different subjects from those in the gallery and testing sets.
24. Results for the CVL database with Scoring Technique
Experiment on the CVL database using the closest match approach with scoring, where the training set consists of different subjects from those in the gallery and testing sets.
25. Results for the AR database with Scoring Technique
Experiment on the AR database using the closest match approach with scoring, where the training set consists of random subjects and the gallery and test sets contain different combinations.
26. Conclusion
- We were able to obtain recognition rates equivalent to PCA and, in most cases, better than it.
- The RP matrix is independent of the training data.
- The main advantage of using RP is its computational complexity: quadratic for RP versus cubic for PCA.
- RP works better when the gallery-to-test-set ratio is higher.
- RP works better than PCA when the training set images differ from the gallery and test sets.
- RP shows irregularity for single runs, but improves with multiple ensembles.
- Majority voting over the closest match further improves the performance of RP.
- For the scoring technique, the greater the number of top hits per image, the better the performance.
27. Future Work
- Combine different random ensembles, which is expected to improve efficiency and accuracy.