Quantification of Facial Asymmetry for Expression-invariant Human Identification - PowerPoint PPT Presentation

1
Quantification of Facial Asymmetry for
Expression-invariant Human Identification
Yanxi Liu (yanxi@cs.cmu.edu)
The Robotics Institute, School of Computer Science
Carnegie Mellon University, Pittsburgh, PA, USA
2
Acknowledgement
  • Joint work with Drs. Karen Schmidt and Jeff
    Cohn (Psychology, University of Pittsburgh).
  • Students who worked on the data as research
    projects: Sinjini Mitra, Nicoleta Serban, and
    Rhiannon Weaver (statistics, CMU), Yan Karklin,
    Dan Bohus (computer science), and Marc Fasnacht
    (physics).
  • Helpful discussions and advice provided by
    Drs. T. Minka, J. Schneider, B. Eddy, A. Moore,
    and G. Gordon.
  • Partially funded by the DARPA HID grant to CMU
    entitled "Space Time Biometrics for Human
    Identification in Video".

3
Human Faces are Asymmetrical
Left Face
Right Face
4
Under Balanced Frontal Lighting (from CMU PIE
Database)
5
What is Facial Asymmetry?
  • Intrinsic facial asymmetry in individuals is
    determined by biological growth, injury, age,
    and expression
  • Extrinsic facial asymmetry is affected by viewing
    orientation, illumination, shadows, and highlights

6
Extrinsic Facial Asymmetry in an Image is
Pose-variant
Original Image
Left Face
Right Face
7
Facial Asymmetry Analysis
  • Many studies in psychology have been done on
    the topics of
  • attractiveness vs. facial asymmetry (Thornhill &
    Buelthoff 1999)
  • expression vs. facial movement asymmetry
  • identification
  • Humans are extremely sensitive to facial
    asymmetry
  • Facial attractiveness for men is inversely
    related to recognition accuracy (O'Toole 1998)

Limitations: qualitative, subjective, still
photos
8
Motivations
  • Facial (a)symmetry is a holistic structural
    feature that has not been explored quantitatively
    before
  • It is unknown whether intrinsic facial asymmetry
    is characteristic of human expressions or of
    human identities

9
The question to be answered in this work
  • How does intrinsic facial asymmetry affect human
    face identification?

10
DATA: Expression Videos from the Cohn-Kanade
AU-Coded Facial Expression Database
[Figure: neutral and peak frames for joy, anger,
and disgust]
11
Sample Facial Expression Frames
Total: 55 subjects. Each subject has three
distinct expression videos with a varying number
of frames; 3703 frames in total.
[Figure: sample frames (neutral, joy, disgust, anger)]
12
Face Image Normalization
Affine deformation based on 3 reference points:
the two inner canthi and the philtrum
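The normalization step can be sketched as solving for the affine map that sends the three detected reference points to fixed canonical positions. The landmark and canonical coordinates below are illustrative assumptions, not values from the talk:

```python
import numpy as np

def affine_from_points(src, dst):
    """Solve for the 2x3 affine matrix A mapping three source points to
    three destination points: dst_i = A @ [x_i, y_i, 1]^T."""
    src = np.asarray(src, dtype=float)             # shape (3, 2)
    dst = np.asarray(dst, dtype=float)             # shape (3, 2)
    X = np.hstack([src, np.ones((3, 1))])          # one row [x, y, 1] per point
    return np.linalg.solve(X, dst).T               # (2, 3) affine matrix

# Detected landmarks (pixels): left inner canthus, right inner canthus, philtrum.
detected = [(52.0, 40.0), (76.0, 41.0), (64.5, 78.0)]
# Canonical positions in the normalized image (illustrative values).
canonical = [(44.0, 48.0), (84.0, 48.0), (64.0, 88.0)]

A = affine_from_points(detected, canonical)
# Each detected point now maps exactly onto its canonical target.
mapped = (A @ np.hstack([np.asarray(detected), np.ones((3, 1))]).T).T
print(np.allclose(mapped, canonical))  # True
```

Three non-collinear points determine the affine map uniquely, so the system solves exactly; the resulting matrix would then be used to warp the whole image.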
13
Quantification of Facial Asymmetry
1. Density Difference (D-face):
   D(x, y) = I(x, y) - I'(x, y)
   where I(x, y) is the normalized face image and
   I'(x, y) is the bilateral reflection of I(x, y)
   about the face midline.
2. Edge Orientation Similarity (S-face):
   S(x, y) = cos(φ(Ie(x, y), I'e(x, y)))
   where Ie, I'e are edge images of I and I'
   respectively, and φ is the angle between the two
   gradient vectors at each pair of corresponding
   points.
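The two measures on slide 13 can be sketched in NumPy. As a simplification, this sketch takes gradients of the images directly rather than of explicit edge images:

```python
import numpy as np

def d_face(I):
    """Density difference: D(x, y) = I(x, y) - I'(x, y), where I' is the
    bilateral reflection of the normalized image about the vertical midline."""
    I = np.asarray(I, dtype=float)
    return I - I[:, ::-1]              # column flip = reflection about midline

def s_face(I, eps=1e-12):
    """Edge orientation similarity: cosine of the angle between the gradient
    vectors of I and of its reflection I' at corresponding pixels.
    (Sketch: gradients of the raw images stand in for edge images.)"""
    I = np.asarray(I, dtype=float)
    Ir = I[:, ::-1]
    gy, gx = np.gradient(I)            # gradients of I
    ry, rx = np.gradient(Ir)           # gradients of the reflection
    dot = gx * rx + gy * ry
    norm = np.hypot(gx, gy) * np.hypot(rx, ry)
    return dot / (norm + eps)          # per-pixel cosine in [-1, 1]

rng = np.random.default_rng(0)
face = rng.random((64, 64))            # stand-in for a normalized face image
D, S = d_face(face), s_face(face)
print(D.shape, S.shape)                # (64, 64) (64, 64)
```

A perfectly symmetric face gives D ≡ 0 everywhere and S ≈ 1 wherever the gradient is nonzero, which is what makes both maps asymmetry measures.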
14
Asymmetry Faces
One half of a D-face or S-face contains all the
needed information. We call these half faces (Dh,
Sh, Dhx, Dhy, Shx, Shy) AsymFaces.
[Figure: Original, D-face, S-face]
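A minimal sketch of extracting an AsymFace: keep one half of the D-face, and collapse it to a forehead-to-chin profile. The `dhy` reduction below (mean absolute value per row) is a hypothetical reading of the Dhy measure, not necessarily the talk's exact definition:

```python
import numpy as np

def asym_half(A):
    """Keep the left half of a D-face or S-face; by the (anti)symmetry of
    these maps, the other half carries no additional information."""
    return A[:, : A.shape[1] // 2]

def dhy(D):
    """Hypothetical Dhy: one asymmetry value per row of the half D-face,
    giving a forehead-to-chin profile (sketch only)."""
    return np.abs(asym_half(D)).mean(axis=1)

rng = np.random.default_rng(1)
I = rng.random((64, 64))               # stand-in normalized face
D = I - I[:, ::-1]                     # D-face as defined on slide 13
profile = dhy(D)
print(profile.shape)                   # (64,)
```

The profile has one entry per vertical position, matching the forehead-to-chin axis used in the Dhy plots on later slides.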
15
Facial Asymmetry as a Biometric
Spatiotemporal facial asymmetry of expression
videos
[Figure: original, D-face, and S-face image
sequences from neutral to peak for Subj 85 and
Subj 10; expression videos from the Cohn-Kanade
Database]
16
Asymmetry Measure Dhy for Two Subjects, Each with
3 Distinct Expressions
[Figure: Dhy profiles from forehead to chin for
joy, anger, and disgust; one panel per subject]
17
[Figure: spatial and temporal asymmetry measures,
forehead to chin]
18
[Figure: spatial and temporal asymmetry measures,
forehead to chin]
19
[Figure: spatial asymmetry measures, forehead to
chin]
20
Evaluation of Discriminative Power of Each
Dimension in AsymFace Dhy
[Figure: variance ratio of each Dhy dimension from
forehead to chin, with the bridge of the nose
marked]
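A per-dimension discriminative score of this kind can be computed as the variance of the per-subject means divided by the average within-subject variance. This is one common form of a variance ratio; the exact weighting used in the talk may differ:

```python
import numpy as np

def variance_ratio(X, labels):
    """Per-dimension discriminability: between-subject variance of the
    class means over the mean within-subject variance."""
    X = np.asarray(X, dtype=float)
    classes = np.unique(labels)
    means = np.stack([X[labels == c].mean(axis=0) for c in classes])
    within = np.stack([X[labels == c].var(axis=0) for c in classes]).mean(axis=0)
    between = means.var(axis=0)
    return between / (within + 1e-12)

# Toy data: 3 subjects x 20 frames, 5 Dhy dimensions; only dimension 0
# actually separates the subjects, the rest are noise.
rng = np.random.default_rng(2)
labels = np.repeat(np.arange(3), 20)
X = rng.normal(size=(60, 5))
X[:, 0] += labels * 5.0                # strong between-subject shift on dim 0
vr = variance_ratio(X, labels)
print(vr.argmax())                     # 0
```

Dimensions where subjects differ a lot but each subject is stable score high, which is exactly the property wanted for identification features.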
21
Most Discriminating Facial Regions Found
22
Experiment Setup
55 subjects, each with three expression video
sequences (joy, anger, disgust); 3703 frames in
total. Human identification tests:
Experiment 1: train on joy and anger, test on disgust
Experiment 2: train on joy and disgust, test on anger
Experiment 3: train on disgust and anger, test on joy
Experiment 4: train on neutral expression frames, test on peak
Experiment 5: train on peak expression frames, test on neutral
The above five experiments are carried out using
(1) AsymFaces, (2) FisherFaces, and (3) AsymFaces
and FisherFaces together.
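The held-out-expression protocol can be sketched on toy data with a 1-nearest-neighbour classifier. The classifier, feature dimensions, and cluster parameters are stand-ins for illustration, not the talk's actual setup:

```python
import numpy as np

def nn_identify(train_X, train_ids, test_X):
    """1-nearest-neighbour identification: assign each test frame the
    subject label of its closest training frame (Euclidean distance)."""
    d = ((test_X[:, None, :] - train_X[None, :, :]) ** 2).sum(axis=2)
    return train_ids[d.argmin(axis=1)]

# Toy setup: 4 subjects x 3 expressions x 10 frames of a 6-dim feature,
# clustered per subject. Experiment 1: train on joy+anger, test on disgust.
rng = np.random.default_rng(3)
subjects, expressions, frames, dim = 4, 3, 10, 6
centers = rng.normal(scale=4.0, size=(subjects, dim))
X = np.concatenate([centers[s] + rng.normal(scale=0.5, size=(frames, dim))
                    for s in range(subjects) for e in range(expressions)])
ids = np.repeat(np.arange(subjects), expressions * frames)
expr = np.tile(np.repeat(np.arange(expressions), frames), subjects)

train, test = expr != 2, expr == 2     # hold out expression 2 ("disgust")
pred = nn_identify(X[train], ids[train], X[test])
print((pred == ids[test]).mean())      # high accuracy on well-separated toy data
```

Holding out an entire expression (rather than random frames) is what makes the test expression-invariant: the classifier never sees the test expression during training.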
23
Sample Results Combining FisherFaces (FF) with
AsymFaces (AF) (Liu et al. 2002)
The data set is composed of 55 subjects, each with
three expression videos: 1218 joy frames, 1414
anger frames, and 1071 disgust frames; 3703 frames
in total.
24
All combinations of FF and AF features are tested
and evaluated quantitatively
25
Complementing a Conventional Face Classifier
107 pairs of face images taken from the FERET
database. The discriminating power of the
asymmetry signatures is shown to
(1) differ from chance with a p-value << 0.001,
(2) be independent of the features used in
conventional classifiers, and decrease the error
rate of a PCA classifier by 38% (15% → 9.3%).
26
Quantified Facial Asymmetry Used for Pose
Estimation
27
Summary
  • Quantification of facial asymmetry is
    computationally feasible.
  • The intrinsic facial asymmetry of specific
    regions captures individual differences that are
    robust to variations in facial expression.
  • AsymFaces provide discriminating information
    that is complementary to conventional face
    identification methods (FisherFaces).

28
Future Work
  • (1) Construct multiple, more robust facial
    asymmetry measures that can capture intrinsic
    facial asymmetry under illumination and pose
    variations, using PIE as well as publicly
    available facial data.
  • (2) Develop computational models for studying
    how recognition rates are affected by facial
    asymmetry under gender, race, attractiveness,
    and hyperspectral variations.
  • (3) Study pose estimation using a combination
    of facial asymmetry and skewed symmetry.