WP 6: Emotion in Interaction

1
  • WP 6: Emotion in Interaction
  • Catherine Pelachaud, U Paris 8

Plenary, 4-6 June 2007, Paris
2
WP 6: The area
  • Research theme: the role of emotion in
    interaction.
  • Three domains of study:
  • Perception domain: how certain aspects related to
    cognition may influence agents' actions
  • Interaction domain: how to create relations
    between users and agents, and how the agent can
    provide feedback
  • Generation domain: how to show expressive
    behaviours consistently and naturally across
    modalities

3
WP 6: The main teams
  • University of Paris 8
  • DIST - University of Genova
  • EPFL - Lausanne
  • France Telecom
  • ICCS - Athens
  • LIMSI-CNRS
  • OFAI - Vienna
  • T-Systems - Berlin
  • KTH - Stockholm
  • MIRALab - Geneva
  • DFKI - Saarbrücken
  • University of Augsburg
  • University of Hertfordshire
  • University of Sheffield
  • Twente University
  • INESC-ID - Lisbon
  • TCD - Dublin
  • University of Bari
  • ISTC-CNR - Rome

4
PHIPS
  • Definition of an Affective Interactive Embodied
    Conversational Agent that encompasses the
    following capabilities:
  • Cognitive Influences on Action
  • Creating Affective Awareness
  • Backchannel properties and architecture
  • Coordination of signs in multi modalities
  • Expressive behaviour and speech

5
Element 1: Cognitive influence on actions
  • Agent's perceptual attention (UP8)
  • Agents with real-time synthetic vision, attention
    and memory capabilities
  • Model of attention and emotion: aspects related
    to facial expression and the novelty relation
    (WP3 / WP6)
  • Evaluation study of the visual perception model
  • GPU-based visual attention speed-up for the
    real-time perception model (WP6 / WP7); see the
    sketch below
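
The slides do not detail the attention model itself. As a minimal illustration of the kind of bottom-up, saliency-based attention pass such a perception model typically builds on (an Itti & Koch style centre-surround scheme; the function names and scales here are assumptions, not the project's API):

    import numpy as np
    from scipy.ndimage import uniform_filter

    def saliency_map(frame: np.ndarray) -> np.ndarray:
        """Centre-surround saliency on a grayscale frame: small-window
        mean minus large-window mean, accumulated over a few scales."""
        frame = frame.astype(float)
        sal = np.zeros_like(frame)
        for centre, surround in [(3, 15), (5, 25), (9, 45)]:
            sal += np.abs(uniform_filter(frame, size=centre)
                          - uniform_filter(frame, size=surround))
        return sal / (sal.max() or 1.0)

    def focus_of_attention(sal: np.ndarray) -> tuple[int, int]:
        """Winner-take-all: the (row, col) the agent would attend to."""
        return np.unravel_index(np.argmax(sal), sal.shape)

Per-pixel filtering of this kind parallelises trivially, which is presumably what the GPU-based WP6 / WP7 speed-up exploits.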

6
Element 1: Cognitive influence on actions - DEMO
(UA)
  • Reaction to the agent's expressions (UA)
  • Integration of a tangible input device, speech
    recognition, and emotional behavior control
  • Analysis of users' gaze behavior

7
Element 1: Visual attention in affective agents
  • Cooperation between UA and NII, Tokyo
  • Investigation of the relationship between visual
    attention and affect by combining biosensors with
    eye-tracking technology
  • An empirical study conducted under the leadership
    of Helmut Prendinger to investigate the potential
    benefits of attentive presentation agents

Prendinger, Bee, Nischt, 2006
8
Element 2: Creating Affective Awareness
  • Investigation of the level of users' engagement
    with one another and with an ECA in an
    emotionally rich context (UA, HU, UP8, DIST,
    ICCS, KTH)
  • creating affective relationships with other
    humans / objects
  • study of users' engagement
  • when initiating, maintaining, and ending an
    interaction
  • through music, emotion and movement
  • detection and imitation: the ability to replicate
    an emotional state

9
Element 2: Creating Affective Awareness
  • Expressive control of music and visual media by
    full-body movement
  • Collaboration between InfoMus Lab-DIST
    (University of Genova) and KTH (Royal Institute
    of Technology, Stockholm)
  • Development of a system that allows users to
    express themselves through their full-body
    movement and to control the real-time generation
    of audio-visual feedback
  • System based on the integration of two software
    platforms: EyesWeb (for movement analysis and
    visual feedback generation) and pDM (to
    synthesize expressive music performances in real
    time)

10
Element 2: Creating Affective Awareness - DEMO
  • The real-time audio-visual feedback consists of:
  • (i) the rendering of a music performance with
    different emotional characterisations, obtained
    by manipulating acoustic parameters
  • → the dynamic variations of the motor cues
    control the dynamics of acoustic cues such as
    tempo, sound level, and articulation (mapping
    sketch below)
  • (ii) the rendering of the user's silhouette
    on a big screen in front of them, coloured
    depending on the expressivity of their movement
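
As a rough sketch of the mapping such a demo needs: motion cues of the sort EyesWeb extracts could drive pDM-style performance parameters as below. All names and scale factors are illustrative assumptions, not the actual EyesWeb or pDM interfaces.

    from dataclasses import dataclass

    @dataclass
    class MotionCues:
        """Per-frame movement features (illustrative names)."""
        quantity_of_motion: float  # overall body activity, 0..1
        smoothness: float          # jerk-based fluidity estimate, 0..1

    @dataclass
    class PerformanceParams:
        """Expressive performance controls of the kind pDM exposes."""
        tempo_scale: float     # 1.0 = nominal tempo
        sound_level_db: float  # gain relative to nominal
        articulation: float    # 0 = staccato, 1 = legato

    def map_motion_to_music(cues: MotionCues) -> PerformanceParams:
        # More activity -> faster and louder; smoother motion -> more legato.
        return PerformanceParams(
            tempo_scale=0.8 + 0.4 * cues.quantity_of_motion,
            sound_level_db=-12.0 + 18.0 * cues.quantity_of_motion,
            articulation=cues.smoothness)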

11
Element 2: Creating Affective Awareness
  • Development of real-time continuous emotion
    recognition from the speech signal (UA); see the
    sketch below
  • Implementation of a system to mirror the user's
    affective state by using
  • the Greta agent (UA)
  • an empathic anthropomorphic robot (collaboration
    between UA and Bielefeld University)
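
"Continuous" here suggests dimensional output (e.g. activation and valence) rather than discrete categories. A toy sketch of that interface, with a crude prosodic feature vector and a hand-tuned stand-in for what would really be a trained regressor (everything below is an illustrative assumption):

    import numpy as np

    def frame_features(buf: np.ndarray, sr: int = 16000) -> np.ndarray:
        """Log energy plus a crude autocorrelation F0 estimate for one
        audio buffer (a few hundred samples of mono speech)."""
        buf = buf.astype(float)
        energy = np.log(np.mean(buf ** 2) + 1e-10)
        ac = np.correlate(buf, buf, mode='full')[len(buf) - 1:]
        lo, hi = sr // 400, sr // 60          # search 60-400 Hz
        f0 = sr / (lo + int(np.argmax(ac[lo:hi])))
        return np.array([energy, f0])

    def activation_valence(feats: np.ndarray) -> tuple[float, float]:
        """Stand-in for a trained model: a hand-tuned linear map that
        just shows the continuous (activation, valence) output."""
        energy, f0 = feats
        activation = float(np.tanh(0.2 * (energy + 8.0)
                                   + 0.005 * (f0 - 150.0)))
        valence = float(np.tanh(0.003 * (f0 - 150.0)))  # toy rule only
        return activation, valence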

12
Element 3: Backchannels
  • Three communication levels:
  • establishing and maintaining engagement (contact,
    perception, attention) (WP4)
  • comprehension (understanding, interest)
  • reaction (believability, attitude, agreement)
  • Three dimensions to characterise backchannel
    signals:
  • cognitive/reactive (signals produced with/without
    explicit planning)
  • sincere/deceptive (sincerity vs. the goal of
    disguising one's reaction)
  • imitation/dictionary (alignment signals vs.
    conventional positive/negative signals)
  • Backchannel forms: verbal and nonverbal signals
    (data-structure sketch below)
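
The three dimensions translate naturally into a small data structure; a minimal sketch (type and field names are mine, not the project's):

    from dataclasses import dataclass
    from enum import Enum

    class Planning(Enum):        # cognitive/reactive dimension
        COGNITIVE = "cognitive"  # produced with explicit planning
        REACTIVE = "reactive"    # produced without explicit planning

    class Sincerity(Enum):       # sincere/deceptive dimension
        SINCERE = "sincere"
        DECEPTIVE = "deceptive"  # goal: disguise one's reaction

    class Source(Enum):           # imitation/dictionary dimension
        IMITATION = "imitation"   # alignment: mirrors the speaker
        DICTIONARY = "dictionary" # conventional positive/negative signal

    @dataclass
    class Backchannel:
        """One backchannel signal placed on the three dimensions above,
        plus its surface form (verbal and/or nonverbal)."""
        planning: Planning
        sincerity: Sincerity
        source: Source
        verbal: str | None = None     # e.g. "mm-hmm"
        nonverbal: str | None = None  # e.g. "head nod"

    # An unplanned, sincere, conventional agreement signal:
    nod = Backchannel(Planning.REACTIVE, Sincerity.SINCERE,
                      Source.DICTIONARY, verbal="mm-hmm",
                      nonverbal="head nod")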

13
Element 3: Backchannels
  • Insight
  • Data collection and analysis
  • Theory and models
  • Perceptual tests of affective bursts and facial
    expressions
  • Modeling and Implementation
  • Recognition
  • Decision
  • Generation
  • Testing and evaluation
  • DFKI, UTwente, URoma, UParis8, ISTC-CNR

14
Element 3: Backchannels - Dialogue Management
  • Integration of the various components of a
    dialogue system capable of non-verbal
    expressivity:
  • a visual renderer (Greta),
  • an audio renderer (MARY), and
  • a dummy dialogue system capable of generating
    non-verbal behaviour (Conversational Dialogue
    Engine / DFKI)
  • Using OpenAIR for message routing (sketch below)
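
OpenAIR is a publish/subscribe message-routing specification, so the glue between Greta, MARY, and the dialogue engine is message traffic of roughly the following shape. The tag names, message type, and the plain-TCP stand-in below are assumptions for illustration, not the actual OpenAIR schema or API.

    import socket

    # Hypothetical payload the dialogue engine might publish for the
    # renderers (tag names are illustrative, not the OpenAIR schema).
    BACKCHANNEL_MSG = """\
    <message type="dialogue.backchannel">
      <content>
        <signal modality="head" value="nod"/>
        <signal modality="voice" value="mm-hmm"/>
        <timing start="0.0" duration="0.6"/>
      </content>
    </message>
    """

    def publish(msg: str, host: str = "localhost",
                port: int = 10000) -> None:
        """Send one message to the message server; a plain-TCP stand-in
        for an OpenAIR plug (host and port are assumptions)."""
        with socket.create_connection((host, port)) as sock:
            sock.sendall(msg.encode("utf-8"))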

15
Element 3: Backchannels
16
Element 4: Coordination of signs in multiple
modalities
  • Models of coordination between modalities built
    from:
  • Automatic analysis of instructed/acted behaviors
    (ICCS)
  • Manual annotation of spontaneous behaviors
    (CNRS-LIMSI, UP8)
  • Perceptual studies
  • Comparison of the original video with 4
    animations:
  • basic emotion 1 (e.g. anger)
  • basic emotion 2 (e.g. despair)
  • multiple-levels replay
  • facial blending replay (UP8)

17
Element 4: Coordination of signs in multiple
modalities
  • Models of individual expressive behaviors in each
    modality: example of reaction movements (EPFL)
  • Semantic representations: find concepts and the
    relationships among them
  • Morphological descriptors: height, gender, age,
    etc.
  • Individuality: personality, emotional state,
    cultural background, etc.
  • Body: geometry, skeletal structure
  • Behavior controllers: the inputs an algorithm
    requires and the output it produces (interface
    sketch below)
  • Reaction behavior
  • Inverse kinematics
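
A behavior controller in this decomposition is essentially an interface contract between the character description and the animation layer. A minimal sketch of that contract (all type and field names here are mine, not EPFL's):

    from dataclasses import dataclass, field
    from typing import Protocol

    @dataclass
    class Morphology:
        """Morphological descriptors of the character."""
        height_m: float
        gender: str
        age: int

    @dataclass
    class Individuality:
        """Traits that individualise the character's reactions."""
        personality: dict[str, float] = field(default_factory=dict)
        emotional_state: str = "neutral"
        cultural_background: str = "unspecified"

    class BehaviorController(Protocol):
        """Declares what the controller needs (stimulus + character
        description) and what it produces (targets for IK)."""
        def react(self, stimulus: str, morph: Morphology,
                  indiv: Individuality
                  ) -> dict[str, tuple[float, float, float]]:
            """Return end-effector targets, e.g.
            {'right_hand': (x, y, z)}, to be resolved by inverse
            kinematics on the character's skeleton."""
            ...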

18
Element 5: Expressivity
  • Expressive behaviour: analysis/synthesis of
    expressive behaviours (DIST, OFAI, UP8, ICCS)
  • Expressive speech synthesis: blending of
    emotions, control of voice quality in speech
    synthesis, copy synthesis of emotional speech
    (DFKI, FT, T-S)
  • Model of complex emotions (UP8)




[Figure: example facial expressions from the complex-emotion model:
reliable features of sadness, fake joy, sadness masked by joy,
neutral expression (EmoTV)]
19
Element 5: Expressivity
  • Effects of expressivity parameters on head,
    facial expression and gesture, over different
    time spans: gesture phase, whole gesture, whole
    sequence (parameter sketch below)
  • Behavior mimicry (ICCS-UP8)
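
The slide does not list the parameters; in the published Greta work on expressivity they are usually the six below. A plausible sketch of the parameter set, plus a per-time-span weighting helper (the helper and its weighting scheme are illustrative assumptions):

    from dataclasses import dataclass, fields

    @dataclass
    class Expressivity:
        """Expressivity parameters in [-1, 1], following the set used
        in the Greta literature (Hartmann, Mancini & Pelachaud)."""
        overall_activation: float = 0.0  # how much behaviour is shown
        spatial_extent: float = 0.0      # amplitude of movement
        temporal_extent: float = 0.0     # speed of movement
        fluidity: float = 0.0            # smoothness, continuity
        power: float = 0.0               # acceleration, tension
        repetition: float = 0.0          # tendency to repeat strokes

    def scale_for_span(base: Expressivity,
                       weight: float) -> Expressivity:
        """Weight the whole parameter set for one time span (gesture
        phase, whole gesture, or whole sequence); illustrative only."""
        return Expressivity(*(weight * getattr(base, f.name)
                              for f in fields(base)))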
20
Element 5: Expressivity
  • GEMEP: corpus of acted emotional performances
    created by WP3 / Geneva
  • Feature extraction from the audio channel (OFAI)
  • Phonetic segmentation into
  • Phonemes
  • Syllables
  • Pitch extraction (sketch after this list)
  • Features from the video channel (OFAI, DIST, UP8)
  • Face detection
  • Silhouettes / bounding boxes
  • Hand tracking
  • Manual annotation and replay (UP8)
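
For the audio side, a modern equivalent of the pitch-extraction step, using librosa's pYIN implementation as a stand-in for whatever tooling OFAI used at the time (the sample rate and frequency bounds are assumptions):

    import librosa

    def extract_pitch(path: str):
        """Return the voiced-frame F0 track (Hz) for one audio clip."""
        y, sr = librosa.load(path, sr=16000)    # mono, resampled
        f0, voiced, _ = librosa.pyin(
            y,
            fmin=librosa.note_to_hz('C2'),      # ~65 Hz
            fmax=librosa.note_to_hz('C6'),      # ~1047 Hz
            sr=sr)
        return f0[voiced]                       # drop unvoiced frames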

21
Element 5: Expressivity
  • Restitution of salient information in
    human-machine interactions (FT)
  • Prosodic copy (F0, duration) of the focused part
    (words) from a dedicated corpus onto neutral
    synthesized utterances
  • TD-PSOLA copy synthesis on the focused part
  • Use of the focused part as the target in unit
    selection
  • Synergies with the national project PAVOQUE on
    the parameterisation of prosody and voice quality
    for expressivity in speech synthesis (DFKI)
  • Spectral interpolation using line spectral
    frequencies (LSF)
  • Voice adaptation with HMM synthesis
  • Emofilt: emotional speech synthesis by prosody
    transformation (T-S); see the sketch below
  • Interface to DFKI's MARY TTS
  • Available in 34 languages
  • Meant as a pragmatic tool
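
Emofilt works by rewriting the phoneme/duration/pitch description that MBROLA-style synthesizers consume, so prosody transformation largely reduces to scaling the numbers in a .pho file. A minimal sketch in that spirit (the scale factors are illustrative, not Emofilt's actual per-emotion rules):

    def transform_pho(lines: list[str], f0_scale: float = 1.2,
                      dur_scale: float = 0.9) -> list[str]:
        """Scale pitch targets and phoneme durations in MBROLA .pho
        input. Each line is 'phoneme duration [pos% F0Hz]*'; raising
        F0 and shortening durations pushes the rendition towards a
        higher-arousal (e.g. angry-sounding) delivery."""
        out = []
        for line in lines:
            parts = line.split()
            if len(parts) < 2 or line.startswith(';'):  # comments, blanks
                out.append(line)
                continue
            phoneme, dur, targets = parts[0], parts[1], parts[2:]
            new = [phoneme, str(round(int(dur) * dur_scale))]
            # pitch targets come in (position-%, F0-Hz) pairs
            for pos, hz in zip(targets[::2], targets[1::2]):
                new += [pos, str(round(float(hz) * f0_scale))]
            out.append(' '.join(new))
        return out

    # e.g. a neutral /a/ with two pitch targets, plus a pause:
    angry_ish = transform_pho(['a 120 0 110 100 130', '_ 50'])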

[Screenshot of the Emofilt GUI, showing "sad" and "anger" settings]
22
Conclusion
  • Creation of an affective ECA able to:
  • Perceive, adapt, and respond affectively to
    events, objects, and people in the real/virtual
    world
  • Create affective bonds
  • Provide affective feedback
  • Be multimodal and expressive