Title: Quantification of Facial Asymmetry for Expression-invariant Human Identification
1. Quantification of Facial Asymmetry for Expression-invariant Human Identification
Yanxi Liu (yanxi_at_cs.cmu.edu)
The Robotics Institute, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA, USA
2. Acknowledgement
- Joint work with Drs. Karen Schmidt and Jeff Cohn (Psychology, U. of Pittsburgh).
- Students who worked on the data as research projects: Sinjini Mitra, Nicoleta Serban, and Rhiannon Weaver (Statistics, CMU); Yan Karklin and Dan Bohus (Computer Science); and Marc Fasnacht (Physics).
- Helpful discussions and advice provided by Drs. T. Minka, J. Schneider, B. Eddy, A. Moore and G. Gordon.
- Partially funded by a DARPA HID grant to CMU entitled "Space Time Biometrics for Human Identification in Video".
3. Human Faces are Asymmetrical
[Figure: left-face and right-face composites]
4. Under Balanced Frontal Lighting (from the CMU PIE Database)
5. What is Facial Asymmetry?
- Intrinsic facial asymmetry in individuals is determined by biological growth, injury, age, and expression.
- Extrinsic facial asymmetry is affected by viewing orientation, illumination, shadows, and highlights.
6. Extrinsic Facial Asymmetry in an Image is Pose-variant
[Figure: original image, left-face and right-face composites]
7. Facial Asymmetry Analysis
- Many studies in psychology have addressed:
  - attractiveness vs. facial asymmetry (Thornhill & Buelthoff 1999)
  - expression vs. facial movement asymmetry
  - identification
- Humans are extremely sensitive to facial asymmetry.
- Facial attractiveness for men is inversely related to recognition accuracy (O'Toole 1998).
- Limitations: qualitative, subjective, still photos.
8. Motivations
- Facial (a)symmetry is a holistic structural feature that has not been explored quantitatively before.
- It is unknown whether intrinsic facial asymmetry is characteristic of human expressions or of human identities.
9. The Question to be Answered in This Work
- How does intrinsic facial asymmetry affect human face identification?
10. Data: Expression Videos (Cohn-Kanade AU-Coded Facial Expression Database)
[Figure: neutral and peak frames for joy, anger, and disgust]
11. Sample Facial Expression Frames
55 subjects in total; each subject has three distinct expression videos with varying numbers of frames, for a total of 3703 frames.
[Figure: sample neutral, joy, disgust, and anger frames]
12. Face Image Normalization
Affine deformation based on 3 reference points: the two inner canthi and the philtrum.
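As a concrete illustration, here is a minimal Python/OpenCV sketch of this normalization step. The canonical coordinates, output size, and the normalize_face interface are assumptions for illustration; the three landmark positions are assumed to be already located (manually or by a landmark detector).

```python
import numpy as np
import cv2

# Canonical target positions (in pixels) for the three reference points:
# left inner canthus, right inner canthus, philtrum. These coordinates
# are illustrative placeholders, not the values used in the talk.
CANONICAL = np.float32([[40, 48], [88, 48], [64, 84]])

def normalize_face(image, left_canthus, right_canthus, philtrum,
                   out_size=(128, 128)):
    """Affinely warp a face so the three marked reference points land on
    fixed canonical coordinates, making the face midline vertical."""
    src = np.float32([left_canthus, right_canthus, philtrum])
    M = cv2.getAffineTransform(src, CANONICAL)  # unique affine from 3 point pairs
    return cv2.warpAffine(image, M, out_size)
```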
13Quantification of Facial Asymmetry
1. Density Difference D-face D (x,y) I(x,y)
I(x,y) I(x,y) --- normalized face image,
I(x,y) --- bilateral reflection of I(x,y) about
face midline 2. Edge Orientation Similarity
S-face S(x,y) cos(?Ie(x,y),Ie(x,y)) where
Ie, Ie are edge images of I and I respectively,
? is the angle between the two gradient vectors
at each pair of corresponding points
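A minimal NumPy sketch of these two measures follows, assuming the face is already normalized so the midline is the central image column. The raw image gradient stands in for the edge images Ie and I′e, which is a simplification of the definition above.

```python
import numpy as np

def d_face(I):
    """Density-difference face: D(x,y) = I(x,y) - I'(x,y), where I' is
    the bilateral reflection of I about the vertical face midline."""
    I = I.astype(np.float64)
    return I - I[:, ::-1]                 # reflect columns, then subtract

def s_face(I):
    """Edge-orientation-similarity face: cosine of the angle between the
    gradient vectors of I and of its reflection I' at each pixel."""
    I = I.astype(np.float64)
    gy, gx = np.gradient(I)               # gradient of the normalized face
    gy_r, gx_r = np.gradient(I[:, ::-1])  # gradient of the reflected face
    dot = gx * gx_r + gy * gy_r
    norms = np.hypot(gx, gy) * np.hypot(gx_r, gy_r)
    return dot / (norms + 1e-12)          # cos(theta), in [-1, 1]
```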
14. Asymmetry Faces
One half of the D-face or S-face contains all the needed information. We call these half faces and their projections (Dh, Sh, Dhx, Dhy, Shx, Shy) AsymFaces. A sketch of the extraction follows.
[Figure: original face, D-face, S-face]
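The slide does not spell out how Dhx, Dhy, Shx, Shy are computed from the half faces, so the row/column averaging below is an assumed reading of the notation (Dhy does run forehead to chin on later slides, which is consistent with a per-row marginal).

```python
import numpy as np

def asym_faces(D, S):
    """Half-face asymmetry features. D is antisymmetric and S symmetric
    about the midline, so one half carries all the information. Dhx/Dhy
    and Shx/Shy are taken here as column/row marginals of the half faces
    (an assumed reading of the slide's notation)."""
    w = D.shape[1] // 2
    Dh, Sh = D[:, :w], S[:, :w]        # keep the left half only
    Dhy = np.abs(Dh).mean(axis=1)      # one value per row: forehead -> chin
    Dhx = np.abs(Dh).mean(axis=0)      # one value per column
    Shy = Sh.mean(axis=1)
    Shx = Sh.mean(axis=0)
    return Dh, Sh, Dhx, Dhy, Shx, Shy
```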
15. Facial Asymmetry as a Biometric
Spatiotemporal facial asymmetry of expression videos.
[Figure: original, D-face, and S-face from neutral to peak frames for subjects 85 and 10; expression videos from the Cohn-Kanade database]
16. Asymmetry Measure Dhy for Two Subjects, Each with 3 Distinct Expressions
[Figure: Dhy (forehead to chin) over time for joy, anger, and disgust; one panel per subject]
17-19. [Figures: spatiotemporal Dhy maps for further subjects; spatial axis runs forehead → chin, temporal axis over frames]
20. Evaluation of Discriminative Power of Each Dimension in AsymFace Dhy
[Figure: variance ratio plotted along the face from forehead to chin, with the bridge of the nose marked]
A sketch of one such variance-ratio computation follows.
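The exact variance-ratio formula is not given on the slide; one plausible reading, sketched below, is the variance of the per-subject means divided by the mean within-subject variance, computed independently for each Dhy coordinate.

```python
import numpy as np

def variance_ratio(features, labels):
    """Per-dimension discriminative power: variance of the subject means
    divided by the mean within-subject variance. Higher values mark face
    regions (along Dhy, forehead to chin) that separate identities well
    while staying stable within a subject."""
    features = np.asarray(features, dtype=np.float64)
    labels = np.asarray(labels)
    subjects = np.unique(labels)
    means = np.stack([features[labels == s].mean(axis=0) for s in subjects])
    within = np.stack([features[labels == s].var(axis=0) for s in subjects])
    return means.var(axis=0) / (within.mean(axis=0) + 1e-12)
```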
21. Most Discriminating Facial Regions Found
22. Experiment Setup
55 subjects, each with three expression video sequences (joy, anger, disgust); 3703 frames in total. Human identification tests are run as follows:
- Experiment 1: train on joy and anger, test on disgust
- Experiment 2: train on joy and disgust, test on anger
- Experiment 3: train on disgust and anger, test on joy
- Experiment 4: train on neutral expression frames, test on peak
- Experiment 5: train on peak expression frames, test on neutral
The five experiments are carried out using (1) AsymFaces, (2) Fisherfaces, and (3) AsymFaces and Fisherfaces together. A sketch of the hold-out protocol follows.
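A minimal sketch of the leave-one-expression-out protocol for Experiments 1-3, assuming frame features, subject labels, and expression labels as NumPy arrays and any fit/predict-style classifier (the variable names are hypothetical):

```python
import numpy as np

def expression_holdout(X, subject_ids, expressions, held_out, classifier):
    """Train an identifier on two expressions and test on the third,
    reporting frame-level identification accuracy."""
    test = expressions == held_out
    classifier.fit(X[~test], subject_ids[~test])
    predictions = classifier.predict(X[test])
    return float((predictions == subject_ids[test]).mean())

# Experiments 1-3: hold out each expression in turn (hypothetical usage).
# for expr in ("disgust", "anger", "joy"):
#     acc = expression_holdout(X, ids, exprs, expr, classifier)
```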
23. Sample Results: Combining Fisherfaces (FF) with AsymFaces (AF) (Liu et al. 2002)
The data set consists of 55 subjects, each with three expression videos: 1218 joy frames, 1414 anger frames, and 1071 disgust frames, for a total of 3703 frames.
24. All Combinations of FF and AF Features are Tested and Evaluated Quantitatively
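One plausible combination scheme, sketched under assumptions: precomputed Fisherface (FF) projections and AsymFace (AF) vectors are z-scored so neither feature set dominates by scale, then concatenated and fed to a 1-nearest-neighbor identifier. This is an illustration, not necessarily the exact scheme evaluated in the talk.

```python
import numpy as np

def combine_features(ff, af):
    """Z-score each feature set, then concatenate into one vector per frame."""
    z = lambda X: (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)
    return np.hstack([z(ff), z(af)])

def nearest_neighbor_id(train_X, train_y, test_X):
    """1-nearest-neighbor identification in the combined feature space."""
    d2 = ((test_X[:, None, :] - train_X[None, :, :]) ** 2).sum(axis=-1)
    return train_y[d2.argmin(axis=1)]
```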
25. Complementing a Conventional Face Classifier
On 107 pairs of face images taken from the FERET database, it is shown that the discriminating power of asymmetry signatures (1) differs from chance with a p-value << 0.001, and (2) is independent of the features used in conventional classifiers, decreasing the error rate of a PCA classifier by 38% (15% → 9.3%).
26. Quantified Facial Asymmetry Used for Pose Estimation
27. Summary
- Quantification of facial asymmetry is computationally feasible.
- The intrinsic facial asymmetry of specific regions captures individual differences that are robust to variations in facial expression.
- AsymFaces provide discriminating information that is complementary to conventional face identification methods (Fisherfaces).
28. Future Work
- (1) Construct multiple, more robust facial asymmetry measures that can capture intrinsic facial asymmetry under illumination and pose variations, using PIE as well as other publicly available facial data.
- (2) Develop computational models for studying how recognition rates are affected by facial asymmetry under gender, race, attractiveness, and hyperspectral variations.
- (3) Study pose estimation using a combination of facial asymmetry and skewed symmetry.