Title: EMOTION ANALYSIS IN MAN-MACHINE INTERACTION SYSTEMS
1. EMOTION ANALYSIS IN MAN-MACHINE INTERACTION SYSTEMS

T. Balomenos, A. Raouzaiou, S. Ioannou, A. Drosopoulos, K. Karpouzis and S. Kollias
Image, Video and Multimedia Systems Laboratory, National Technical University of Athens
2. Outline

- Facial Expression Estimation
- Face Detection
- Facial Feature Extraction
- Anatomical Constraints - Anthropometry
- FP Localization
- FAP Calculation
- Expression Profiles
- Expression Confidence Enforcement
- Gesture Analysis
3. Face Detection
4. Multiple-Cue Facial Feature Boundary Extraction (eyes, mouth, eyebrows, nose)

- Edge-based mask
- Intensity-based mask
- NN-based mask (Y, Cr, Cb and DCT coefficients of the neighborhood)
- Each mask is validated independently (see the fusion sketch below)
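The slides do not specify how the independently validated masks are combined. As a minimal sketch, assuming a simple majority-voting fusion rule (the function name fuse_masks is hypothetical):

    import numpy as np

    def fuse_masks(edge_mask: np.ndarray,
                   intensity_mask: np.ndarray,
                   nn_mask: np.ndarray) -> np.ndarray:
        """Keep a feature-boundary pixel where at least two of the
        three binary cue masks agree (assumed voting rule)."""
        votes = (edge_mask.astype(int) + intensity_mask.astype(int)
                 + nn_mask.astype(int))
        return votes >= 2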
5. Multiple-Cue Feature Extraction: An Example
6. Final Mask Validation through Anthropometry

Facial distances, with male/female separation, measured by the US Army over a 30-year period.
The measured distances are normalized by division by Distance 7, i.e. the distance between the inner corners of the left and right eye, two points a human cannot move.
7. Anthropometry-Based Confidence

DA5n and DA10n are the distances shown in the figures, normalized by division by distance DA7:
DA5n = DA5 / DA7, DA10n = DA10 / DA7
DAewn is the normalized eye width, calculated from DA5 and DA7:
DAewn = ((DA5 - DA7) / 2) / DA7

Measured value(s)                   Anthropometric range
D5n = 2.129                         DA5n  in [2.517, 3.349]
D10n = 0.919                        DA10n in [1.031, 1.515]
Dew_ln = 0.677, Dew_rn = 0.452      DAewn in [0.840, 1.077]
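As an illustration, a minimal sketch of turning these range checks into a confidence value; the normalization formulas are the ones above, but the linear fall-off outside each range and the example measurements are assumptions:

    def range_confidence(value: float, lo: float, hi: float) -> float:
        """1.0 inside [lo, hi], decaying linearly to 0.0 outside
        (assumed scoring rule; the slides only give the ranges)."""
        if lo <= value <= hi:
            return 1.0
        dist = (lo - value) if value < lo else (value - hi)
        return max(0.0, 1.0 - dist / (hi - lo))

    DA5, DA7, DA10 = 7.5, 3.1, 3.3     # example pixel measurements (made up)
    DA5n = DA5 / DA7                   # DA5n  = DA5 / DA7
    DA10n = DA10 / DA7                 # DA10n = DA10 / DA7
    DAewn = ((DA5 - DA7) / 2) / DA7    # normalized eye width

    confidence = min(range_confidence(DA5n, 2.517, 3.349),
                     range_confidence(DA10n, 1.031, 1.515),
                     range_confidence(DAewn, 0.840, 1.077))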
8. Detected Feature Points (FPs)
9. FAP-Based Description (Facial Animation Parameters)

- Discrete features offer a neat, symbolic representation of expressions
- Not constrained to a specific face model
- Suitable for face cloning applications
- MPEG-4 compatible: unified treatment of the analysis and synthesis parts in MMI environments
10. FAPs Estimation

- Absence of a clear quantitative definition of FAPs
- It is nevertheless possible to model FAPs through the movement of FDP feature points, using distances s(x, y); e.g., for close_t_r_eyelid (F20) - close_b_r_eyelid (F22):
  D13 = s(3.2, 3.4)  ->  f13 = D13 - D13_NEUTRAL
  (see the sketch below)
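This rule translates directly into code. A minimal sketch (helper names are hypothetical) computing f13 from the coordinates of FDP points 3.2 and 3.4, the top and bottom of the right eyelid, in the current and neutral frames:

    import math

    def s(p, q):
        """Euclidean distance between two feature points (x, y)."""
        return math.hypot(p[0] - q[0], p[1] - q[1])

    def f13(fdp_current: dict, fdp_neutral: dict) -> float:
        """close_t_r_eyelid (F20) - close_b_r_eyelid (F22):
        D13 = s(3.2, 3.4), f13 = D13 - D13_NEUTRAL."""
        d13 = s(fdp_current["3.2"], fdp_current["3.4"])
        d13_neutral = s(fdp_neutral["3.2"], fdp_neutral["3.4"])
        return d13 - d13_neutral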
11. Sample Profiles of Anger

A1: F4 in [22, 124], F31 in [-131, -25], F32 in [-136, -34], F33 in [-189, -109], F34 in [-183, -105], F35 in [-101, -31], F36 in [-108, -32], F37 in [29, 85], F38 in [27, 89]
A2: F19 in [-330, -200], F20 in [-335, -205], F21 in [200, 330], F22 in [205, 335], F31 in [-200, -80], F32 in [-194, -74], F33 in [-190, -70], F34 in [-190, -70]
A3: F19 in [-330, -200], F20 in [-335, -205], F21 in [200, 330], F22 in [205, 335], F31 in [-200, -80], F32 in [-194, -74], F33 in [70, 190], F34 in [70, 190]
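As an illustration, a minimal sketch of checking an observed FAP vector against one of these profiles; the all-ranges-must-hold matching rule is an assumption (the actual system evaluates profiles through fuzzy rules):

    # Profile A2 from above: FAP name -> (min, max) admissible range.
    ANGER_A2 = {
        "F19": (-330, -200), "F20": (-335, -205),
        "F21": (200, 330),   "F22": (205, 335),
        "F31": (-200, -80),  "F32": (-194, -74),
        "F33": (-190, -70),  "F34": (-190, -70),
    }

    def matches(fap_values: dict, profile: dict) -> bool:
        """True when every FAP listed in the profile is present and
        falls inside its admissible range (assumed crisp rule)."""
        return all(name in fap_values and lo <= fap_values[name] <= hi
                   for name, (lo, hi) in profile.items())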
12. Gesture Analysis

- Gestures are too ambiguous to indicate emotion on their own
- Gestures are therefore used to support the confidence of the outcome of facial expression analysis

Emotion     Gesture class
Joy         hand clapping - high frequency
Sadness     hands over the head - posture
Anger       lift of the hand - high speed
Anger       italianate gestures
Fear        hands over the head - gesture
Fear        italianate gestures
Disgust     lift of the hand - low speed
Disgust     hand clapping - low frequency
Surprise    hands over the head - gesture

Gesture-class probabilities from an HMM are mapped to emotional states through a transformation table.
Hand detection is based on the Cr/Cb chrominance components (see the sketch below).
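A minimal sketch of Cr/Cb-based hand detection with OpenCV; the chrominance thresholds below are common skin-segmentation bounds from the literature, not values given on the slides:

    import cv2
    import numpy as np

    def hand_mask(bgr_frame: np.ndarray) -> np.ndarray:
        """Binary mask of skin-colored (candidate hand) pixels,
        thresholded on the Cr/Cb chrominance components."""
        ycrcb = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2YCrCb)
        lower = np.array([0, 133, 77], dtype=np.uint8)     # Y, Cr, Cb
        upper = np.array([255, 173, 127], dtype=np.uint8)  # assumed bounds
        mask = cv2.inRange(ycrcb, lower, upper)
        # Morphological opening suppresses small non-hand blobs.
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
        return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)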
13. Emotion Analysis System Overview

G: the value of a corresponding FAP
f: values derived from the calculated distances
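As an illustration of the fuzzy step implied by this legend, a sketch assuming triangular membership functions over a normalized distance-derived value f (the actual memberships and rule base are not given on the slides):

    def triangular(x: float, a: float, b: float, c: float) -> float:
        """Triangular membership function peaking at b on [a, c]."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def fap_memberships(f: float) -> dict:
        """Degrees to which f is 'low', 'medium' or 'high'
        (hypothetical linguistic terms and breakpoints)."""
        return {"low":    triangular(f, -0.5, 0.0, 0.5),
                "medium": triangular(f,  0.0, 0.5, 1.0),
                "high":   triangular(f,  0.5, 1.0, 1.5)}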
14. System Interface
15. Conclusions

- Estimation of a user's emotional state based on a fuzzy-rules architecture
- MPEG-4: a compact and established means for HCI
- Evaluation approach based on anthropometric models and measurements
- Work validating the described developments is carried out in the framework of the IST ERMIS project and HUMAINE