Title: From Facial Features to Facial Expressions
1. From Facial Features to Facial Expressions
A. Raouzaiou, K. Karpouzis and S. Kollias
Image, Video and Multimedia Systems Laboratory, National Technical University of Athens
2. Outline
- The concept of archetypal expressions
- FAPs-based description and estimation of FAPs
- Expression synthesis using profiles
- Synthesis of intermediate emotions
3. Archetypal Expressions
Also termed universal expressions, because they are recognized across cultures
Source: F. Parke and K. Waters, Computer Facial Animation, A K Peters
4. Archetypal Expressions (cont.)
Description of the archetypal expressions through muscle actions: Action Units (AUs) - FACS
Translation of facial muscle movements into FAPs
e.g. sadness: close_t_l_eyelid, close_t_r_eyelid, close_b_l_eyelid, close_b_r_eyelid, raise_l_i_eyebrow, raise_r_i_eyebrow, raise_l_m_eyebrow, raise_r_m_eyebrow, raise_l_o_eyebrow, raise_r_o_eyebrow
Creation of a FAP vocabulary for every archetypal expression
5. FAPs-based description
- Discrete features offer a neat, symbolic representation of expressions
- Not constrained to a specific face model
- Suitable for face cloning applications
- MPEG-4 compatible
- Based on feature points, not complete features
6. FAPs-based description (cont.)
Two issues should be addressed:
- choice of the FAPs involved in profile formation
- definition of FAP intensities
7. Expression synthesis
- Choice of FAPs is based on psychological data
- Intensities are derived from expression database images
8. Estimation of FAPs
- There is no clear quantitative definition of FAPs
- It is possible to model FAPs through the movement of FDP feature points, using distances s(x, y) between them
e.g. close_t_r_eyelid (F20) - close_b_r_eyelid (F22): D13 = s(3.2, 3.4), f13 = D13 - D13_NEUTRAL
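The distance-based estimation above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation: it assumes 2D image coordinates for FDP feature points 3.2 and 3.4 (the right eyelid points used in the slide's example) and computes f13 as the change in their distance relative to the neutral face.

```python
import math

def s(p, q):
    """Euclidean distance between two feature points given as (x, y)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def estimate_f13(fp_3_2, fp_3_4, fp_3_2_neutral, fp_3_4_neutral):
    """Estimate the eyelid FAP value f13 as the change in
    D13 = s(3.2, 3.4) with respect to the neutral face."""
    d13 = s(fp_3_2, fp_3_4)
    d13_neutral = s(fp_3_2_neutral, fp_3_4_neutral)
    return d13 - d13_neutral
```

A negative value means the eyelids are closer together than in the neutral expression, as expected for closing eyelids.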
9. Sample FAP vocabulary
Sadness: close_t_l_eyelid (F19), close_t_r_eyelid (F20), close_b_l_eyelid (F21), close_b_r_eyelid (F22), raise_l_i_eyebrow (F31), raise_r_i_eyebrow (F32), raise_l_m_eyebrow (F33), raise_r_m_eyebrow (F34), raise_l_o_eyebrow (F35), raise_r_o_eyebrow (F36)
10. Archetypal Expression Profiles
A profile is a set of FAPs accompanied by the corresponding range of variation of each FAP
11. Sample Profiles of Anger
A1: F4 ∈ [22, 124], F31 ∈ [-131, -25], F32 ∈ [-136, -34], F33 ∈ [-189, -109], F34 ∈ [-183, -105], F35 ∈ [-101, -31], F36 ∈ [-108, -32], F37 ∈ [29, 85], F38 ∈ [27, 89]
A2: F19 ∈ [-330, -200], F20 ∈ [-335, -205], F21 ∈ [200, 330], F22 ∈ [205, 335], F31 ∈ [-200, -80], F32 ∈ [-194, -74], F33 ∈ [-190, -70], F34 ∈ [-190, -70]
A3: F19 ∈ [-330, -200], F20 ∈ [-335, -205], F21 ∈ [200, 330], F22 ∈ [205, 335], F31 ∈ [-200, -80], F32 ∈ [-194, -74], F33 ∈ [70, 190], F34 ∈ [70, 190]
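A profile with its ranges of variation can be turned into a concrete, animatable expression by choosing one intensity inside each range. The sketch below is an assumption about how such sampling could be done, using the A2 anger profile from this slide as data.

```python
import random

# Anger profile A2 from the slide: each FAP mapped to its (min, max)
# range of variation.
ANGER_A2 = {
    "F19": (-330, -200), "F20": (-335, -205),
    "F21": (200, 330),   "F22": (205, 335),
    "F31": (-200, -80),  "F32": (-194, -74),
    "F33": (-190, -70),  "F34": (-190, -70),
}

def instantiate(profile, rng=random):
    """Pick one concrete intensity inside each FAP's range,
    producing a single expression instance for animation."""
    return {fap: rng.uniform(lo, hi) for fap, (lo, hi) in profile.items()}
```

Different draws give visibly different but still plausible instances of the same archetypal expression.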
12. Emotion representation
Emotions can be approached as points on a plane defined by activation and evaluation
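The activation-evaluation plane can be represented very simply in code. The coordinates below are purely illustrative placements, not measured values from the study; they only show how an emotion maps to a point and a quadrant of the plane.

```python
# Illustrative (not measured) coordinates on the activation-evaluation
# plane: activation in [-1, 1] (passive..active), evaluation in [-1, 1]
# (negative..positive).
EMOTION_PLANE = {
    "terror":  (0.9, -0.8),
    "fear":    (0.6, -0.6),
    "worry":   (0.3, -0.4),
    "sadness": (-0.5, -0.6),
    "joy":     (0.5, 0.8),
}

def quadrant(emotion):
    """Return the (activation-sign, evaluation-sign) quadrant."""
    a, e = EMOTION_PLANE[emotion]
    return ("+" if a >= 0 else "-", "+" if e >= 0 else "-")
```

Emotions in the same quadrant (e.g. worry, fear, terror) differ mainly in how far from the origin they lie, which is what the intermediate-expression profiles exploit.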
13. Intermediate Expression Profiles
- Same universal emotion category
- Animation of the same FAPs using different intensities
- Absence of expert knowledge for the (, ) quadrant
worry < fear < terror
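Within one category, an intermediate expression animates the same FAPs at scaled intensities. The sketch below assumes this is a simple per-FAP multiplication; the base FAP values and the scale factors are hypothetical illustrations, not values from the study.

```python
# Hypothetical archetypal fear intensities (FAP name -> value).
FEAR_FAPS = {"F19": -265, "F20": -270, "F31": -140}

# Hypothetical relative intensities ordering worry < fear < terror.
INTENSITY_SCALE = {"worry": 0.4, "fear": 1.0, "terror": 1.4}

def intermediate_same_category(base_faps, emotion):
    """Scale every FAP of the archetypal expression by the
    emotion's relative intensity within the category."""
    k = INTENSITY_SCALE[emotion]
    return {fap: k * v for fap, v in base_faps.items()}
```

Worry then reuses fear's FAP vocabulary with weaker intensities, and terror with stronger ones.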
14. Intermediate Expression Profiles (cont.)
- Different universal emotion categories
- In the same evaluation half-plane
- Averaging of the FAPs used in the universal emotions
15. Intermediate Expression Profiles (cont.)
- Different universal emotion categories
e.g. afraid + sad → depressed
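The averaging step across categories can be sketched as follows. The single-value profiles for "afraid" and "sad" are hypothetical numbers chosen only to illustrate the mechanism of combining two universal emotions into an intermediate one such as "depressed".

```python
def average_profiles(p1, p2):
    """Average the FAP intensities of two archetypal expressions;
    a FAP present in only one profile keeps its single value."""
    out = {}
    for fap in set(p1) | set(p2):
        vals = [p[fap] for p in (p1, p2) if fap in p]
        out[fap] = sum(vals) / len(vals)
    return out

# Hypothetical single-value profiles for illustration:
AFRAID = {"F19": -265, "F31": -140}
SAD    = {"F19": -100, "F31": 120}
DEPRESSED = average_profiles(AFRAID, SAD)
```

Keeping unshared FAPs at their single value is one possible design choice; dropping them instead would restrict the intermediate expression to the FAPs both source emotions share.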
16. Conclusions
- FAPs provide a compact and established means of emotion representation
- Necessary input from psychological and physiological studies
- Universal emotions can be used to synthesize intermediate ones
- Useful for low-bitrate MPEG-4 applications
17. Extensions
- Verification / Evaluation
- Initial results
- Acceptable performance for expression grading
- Intermediate expressions: better results for the negative evaluation half-plane
- Lack of linguistic rules for the (, -) quadrant
18. Extensions (cont.)
- Personalized ECAs
- Detected facial feature points can be used to adapt a generic ECA head (FDP FPs)
- Intermediate emotions based on processing real data (FAP extraction)
- Processing real data → temporal aspect of FAPs