1
What your avatar can reveal about your handwriting
  • Avatar gesture from pen gestures

Francesca A. Barrientos
Computer Science Division
23 May 2002, Ph.D. Dissertation Seminar
2
Interacting in avatar worlds
Laurel's herb farm in AlphaWorld (Damer 98)
A wedding, May 8, 1996 (Damer 98)
3
Avatar nonverbal communication
  • In the physical world, language is embedded in a
    matrix of sounds and visuals
  • The avatar, as a virtual body, can send nonverbal
    communication

4
Problem statement
  • We want an interaction technique for controlling
    avatar gesture
  • Controls expressive movement
  • Seamless with verbal communication

5
Avatar gesture from pen gesture
  • Body gesture has symbolic and qualitative aspects
  • Pen gesture carries symbolic and continuous data
  • Pen gesture simultaneously selects avatar gesture
    and modulates multiple expressive qualities

6
Control using handwritten letter
  • Writing pen gesture triggers animation
  • Body gesture is sweep to side
  • Symbol is the letter "s"
  • Quality being varied is size

7
Overview
  • Background on nonverbal communication
  • Why it's important
  • Why previous control techniques are inadequate
  • Description of interaction technique
  • Description of implementation
  • Conclusions

8
Kinds of nonverbal cues
  • Vocalics
  • Appearance
  • Proxemics
  • Haptics
  • Affective display
  • Kinesics
  • Facial expression
  • Gaze
  • Posture
  • Gesture

9
Gesture types
  • Deictic
  • Iconic
  • Metaphoric
  • Beat
  • Adaptors
  • Regulators

10
Importance of gesture
  • Encodes ideas shared with speech
  • Clarifies meaning when speech is ambiguous
  • Useful cue when outside noise interferes with
    speech
  • Aids utterance generation
  • Smooths over social intercourse
  • Communicates mood/emotion

11
Interaction with speech
  • Coverbal gesture shares meaning units with spoken
    language
  • Language-like gestures fill in for a word or
    phrase in a sentence
  • Emblems have standard forms and are well understood
    within a group

12
Observed virtual nonverbal displays
  • Presence
  • Appearance
  • Proxemics
  • Conversation group formation
  • Personal space

WorldsAway from Fujitsu
Avabar by Zuidema (Damer)
13
Designed behaviors
  • Static expressions
  • Facial
  • Posture
  • Gesture
  • Animated motions
  • Gesture
  • Entertaining dances
  • Custom animations

WorldsAway from Fujitsu
14
Related work
  • Commercial worlds
  • Blend of text-based virtual communities and computer
    graphics
  • Worlds Chat, the first 3D world, in 1995
  • Multi-user virtual environment research
  • Vlnet (Guye-Vuillème et al 98)
  • ComicChat (Kurlander, Skelly & Salesin 96)
  • Autonomous avatars (Vilhjálmsson & Cassell 98,
    Cassell et al 94)
  • Acting in virtual reality (Slater et al 00)
  • Synthetic characters
  • Improv (Perlin & Goldberg 96)
  • Alive (Blumberg & Galyean 95, Maes et al 97)
  • Jack (Badler 97)
  • Computer mediated conversation visualization
  • Collaboration-at-a-glance (Donath 95)
  • Chat Circles (Viegas & Donath 99)

15
Vlnet gesture/mimics panel
(Guye-Vuillème et al)
  • Button for each expression or gesture
  • Modulate speed with slider

16
ComicChat
  • Text inference
  • Emoticons
  • Sentence Structure
  • Keywords
  • Emotion wheel

(Kurlander et al)
17
BodyChat avatar agents
  • Automatic conversation regulation behaviors
  • Salutations
  • Envelope feedback
  • Facial expression
  • User specifies high-level intentions
  • Avatar expressions driven by chat text
  • Avatar software manages gestures and gaze behavior

(Vilhjálmsson & Cassell)
18
Limitations of current techniques
  • Mainly speech-independent nonverbal displays
  • Emotional facial display and posture
  • Emblematic gesture
  • Not general
  • Cumbersome interface
  • Graphical interface requires hunt and select
  • Cannot scale
  • Lack control over multiple expressive features of
    movement

19
Why pen gestures
  • Natural
  • People doodle while talking and listening
  • Expressive
  • Reflects emotional state
  • Very personal
  • Can be intentionally manipulated
  • Dual nature
  • Analog and digital
  • Symbolic and qualitative
  • Discrete and continuous
  • Information and emotion

20
Continuous interaction
  • "Computers fragment our thinking by substituting
    discrete events for continuous actions."
  • - Malcolm McCullough
  • (Abstracting Craft, p. 53)
  • "Gestures can...enhance the experience of agency
    through kinesthetic involvement and the feeling
    of directness."
  • - Brenda Laurel
  • (Computers as Theatre, p. 158)

21
Interaction technique
  • Library: generated offline
  • Input: user writes letter in GUI
  • Generation: selection and synthesis
  • Animation: gesture performed
(Diagram: library, GUI, gesture generator, avatar animator)
22
Interaction schematic
23
Design issues
  • Pen gesture input set
  • Using letters of the alphabet
  • Mapping from pen gesture features to avatar
    movement parameters
  • Handwriting features to extract
  • Gesture movement parameterization
  • Avatar gesture animation synthesis method

24
Handwriting features
  • Desired properties
  • Controllable
  • Computable
  • Most important feature types (according to
    handwriting analysis)
  • Size
  • Speed
  • Pressure
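
A minimal sketch, assuming pen samples arrive as (x, y, t) tuples, of how the
size and speed features might be computed; this is illustrative only, not the
dissertation's code (pressure would come directly from the tablet hardware).

    from math import hypot

    def pen_features(samples):
        """samples: list of (x, y, t) points for one handwritten letter."""
        xs = [x for x, _, _ in samples]
        ys = [y for _, y, _ in samples]
        # Size: diagonal of the letter's bounding box
        size = hypot(max(xs) - min(xs), max(ys) - min(ys))
        # Speed: total pen-path length divided by writing duration
        path = sum(hypot(x1 - x0, y1 - y0)
                   for (x0, y0, _), (x1, y1, _) in zip(samples, samples[1:]))
        duration = samples[-1][2] - samples[0][2]
        speed = path / duration if duration > 0 else 0.0
        return {"size": size, "speed": speed}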

25
Motion parameterization
  • Formal systems
  • Shape-Effort (Labanotation)
  • Formal sign language systems
  • Physical
  • Size
  • Speed
  • Sustain
  • Emotional
  • Emphatic
  • Listless
  • Tentative

26
Physical mapping
Handwriting feature    Movement parameter
Size                   Size
Speed                  Duration
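
A hedged sketch of this mapping: each handwriting feature is normalized into
a [0, 1] style parameter that the synthesis step can use. The names and
calibration ranges here are assumptions for illustration, not values from the
dissertation.

    def map_features(features, ranges):
        """features: e.g. {"size": ..., "speed": ...} from the pen input.
        ranges: per-feature (lo, hi) calibration, e.g. measured from a
        user's own writing samples."""
        def norm(value, lo, hi):
            return min(max((value - lo) / (hi - lo), 0.0), 1.0)

        u = norm(features["size"], *ranges["size"])    # pen size  -> gesture size
        v = norm(features["speed"], *ranges["speed"])  # pen speed -> gesture duration
        return u, v
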
27
Gesture synthesis
  • Input: gesture and movement parameter values
  • Multi-linear interpolation from set of sample
    gestures
  • Each avatar gesture comprises a set of sample
    motions
  • Each motion sample has different expression
  • Each sample is annotated with its movement parameter
    values

28
Modulation through multilinear interpolation
  • Joint trajectory I
  • Rotation angles over time
  • Gesture type β
  • Semantic category
  • Set of 2^n prototypes, G
  • n style parameters
  • Prototypes represent extremal trajectories
  • Gesture instance I_β(u, v)
  • Vector of joint trajectories
  • Multilinear interpolation on a type's prototypes
    produces an instance (see the sketch below)
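
A sketch of the blend for n = 2 style parameters (size u, speed v), so a
gesture type has 2^2 = 4 extremal prototypes. Array shapes and names are
assumptions, and the prototypes are taken to be already resampled to a common
number of frames (the time-dilation step on the next slide).

    import numpy as np

    def blend(prototypes, u, v):
        """prototypes[i][j]: joint-angle array (frames x joints) recorded at
        size extreme i and speed extreme j, i, j in {0, 1}; u, v in [0, 1]."""
        g00, g01 = prototypes[0][0], prototypes[0][1]
        g10, g11 = prototypes[1][0], prototypes[1][1]
        return ((1 - u) * (1 - v) * g00 + (1 - u) * v * g01 +
                u * (1 - v) * g10 + u * v * g11)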

29
Interpolating speed
  • Interpolation along curves of different durations
  • Time dilation step
  • Determine duration for interpolated curve
  • Choose sample rate on interpolated curve
  • Compress slower curve - sample at proportionately
    slower rate
  • Sample faster trajectory at proportionately
    faster rate

(Plot: joint angle θ over time t for trajectories of different durations)
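
A sketch of the resampling idea, under the same assumptions as above: curves
of different durations are linearly resampled onto a common number of frames
so they can be blended, and the output duration is interpolated separately.

    import numpy as np

    def resample(curve, n_frames):
        """curve: (frames x joints) joint-angle array; linear resampling."""
        t_old = np.linspace(0.0, 1.0, len(curve))
        t_new = np.linspace(0.0, 1.0, n_frames)
        return np.stack([np.interp(t_new, t_old, curve[:, j])
                         for j in range(curve.shape[1])], axis=1)

    def blended_duration(d_slow, d_fast, v):
        # Duration of the interpolated gesture; v in [0, 1], 0 = slow prototype.
        return (1 - v) * d_slow + v * d_fast
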
30
Framework for applying pen gesture to avatar
gestures
  • Subproblems
  • Expression map design
  • Gesture synthesis technique
  • Can explore other mappings and synthesis
    techniques
  • Labanotation parameterization of movement
  • Emotion parameterization
  • Customize toward particular domain

31
Implementation
  • Cursive
  • Application for interactively controlling VRML
    avatars over the internet
  • Use to test animations
  • Architecture permits independence from specific
    avatar world software
  • Viewer can see animations without installing new
    software
  • Facilitate testing of shared object behaviors in
    virtual worlds

32
Avatar gesture samples
  • Built motion capture system
  • Magnetic position/orientation sensors
  • (Flock of Birds from Ascension Corp.)
  • Recorded gestures
  • Vary size and speed
  • Can create very personalized gesture

33
Cursive screenshot
34
Architecture
35
Communication
Cursive communicates with any copies of the user's
avatar
(Diagram: driver host, viewer host)
  • Driver logs into Vworld server
  • Other viewers receive notification
  • Other viewers request and download avatar copy
  • Avatar opens socket connection to Cursive
  • Cursive sends gesture commands via socket
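
A hedged sketch of the last two steps on the Cursive side; the message format
and port handling are invented for illustration, since the slide does not
specify the wire protocol.

    import socket

    def serve_avatar_copies(port, commands):
        """Accept one avatar-copy connection and stream gesture commands to it.
        commands: iterable of (symbol, u, v) tuples produced by the GUI."""
        with socket.create_server(("", port)) as server:
            conn, _addr = server.accept()      # avatar copy connects to Cursive
            with conn:
                for symbol, u, v in commands:
                    line = f"GESTURE {symbol} {u:.3f} {v:.3f}\n"
                    conn.sendall(line.encode("ascii"))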

36
Evaluation
  • Very simple to control one parameter at a time
  • More complicated to control size and speed
    simultaneously
  • Effective usage requires practice
  • Requires further investigation into mapping
    handwriting features to movement parameters
  • Viable technique for controlling avatar gesture

37
Expert use
  • "...the art of finding and executing an effective
    gesture is learned through the more indirect
    means of observation, experimentation,
    performance, and evaluation, and it is a skill
    that continues to grow over time."
  • - Brenda Laurel
  • (Computers as Theatre, p. 155)

38
Summary
  • Want to control richly expressive, spontaneous
    gesture in avatar worlds
  • Solution is an interaction technique employing
    pen gesture input
  • Cursive, an implementation of this interaction
    technique

39
Contributions
  • Novel interaction technique that augments the
    potential repertoire of avatar nonverbal
    communication
  • Possible to control spontaneous gesture
  • Control over expressive characteristics of
    gestural movement
  • Framework for applying pen gesture to avatar
    gesture
  • Mapping handwriting features to movement
    parameters
  • Synthesis of expressive gesture
  • The algorithms and methods used to implement the
    technique
  • Cursive, a working application that applies the
    interaction technique

40
Future work
  • Explore other mappings
  • emotional expression in handwriting and gesture
  • Shape-Effort parameterization of movement
  • Explore other gesture synthesis methods
  • Reduce cost of obtaining gesture motion samples
  • Develop a framework for determining avatar
    gesture vocabulary
  • How many gestures
  • What types of gestures

41
Conclusion
  • Transmitting bodily nonverbal communication
    through the internet is an exciting idea.
  • Think of the computer as a medium for personal
    expression
  • Continuous/rich interaction
  • Playful behavior
  • Sense of engagement
  • Handwriting has a place in affective computing

42
Acknowledgements
  • A few of the people who helped with my research
    and with this talk.
  • My dissertation committee: Prof. John Canny,
    Prof. James Landay, Prof. John McWhorter.
  • My gesture model: Erin Dare.
  • My dissertation writing group: Blanca Gordo,
    Jeffrey Ow, Dr. Ellen Sacco-Fernandez, Lynne
    Horiuchi.
  • Other talk critics: Miriam Walker, Scott
    Klemmer, Dan Glaser, Jeremy Risner, James Lin,
    Jason Hong, Dr. Eric Paulos.

43
Partial bibliography
  1. Damer, B., Avatars! Exploring and Building
    Virtual Worlds on the Internet. Berkeley, CA:
    Peachpit Press, 1998.
  2. Guye-Vuillème, A., T. Capin, I. Pandzic, N.
    Magnenat-Thalmann, and D. Thalmann, "Non-Verbal
    Communication Interface for Collaborative Virtual
    Environments," in Proc. CVE 98, June 1998,
    Manchester, 1998.
  3. Vilhjálmsson, H. H. and J. Cassell, "BodyChat:
    Autonomous Communicative Behaviors in Avatars,"
    in Proceedings of the Second International
    Conference on Autonomous Agents, May 9-13, 1998,
    Minneapolis: ACM, 1998, pp. 269-276.
  4. Cassell, J., C. Pelachaud, N. Badler, M.
    Steedman, B. Achorn, T. Becket, B. Douville, S.
    Prevost, and M. Stone, "Animated Conversation:
    Rule-Based Generation of Facial Expression,
    Gesture and Spoken Intonation for Multiple
    Conversational Agents," in Proceedings of
    SIGGRAPH '94, 1994.
  5. Slater, M., J. Howell, A. Steed, D. P. Pertaub,
    M. Gaurau, and S. Springel, "Acting in Virtual
    Reality," in Collaborative Virtual Environments
    2000, ACM, 2000, pp. 103-110.
  6. Perlin, K. and A. Goldberg, "Improv: A System for
    Scripting Interactive Actors in Virtual Worlds,"
    in Proc. SIGGRAPH 96, H. Rushmeier, Ed., New
    York: ACM Press, 1996, pp. 205-216.
  7. Blumberg, B. M. and T. A. Galyean, "Multi-level
    direction of autonomous creatures for real-time
    virtual environments," presented at SIGGRAPH, Los
    Angeles, CA, 1995.
  8. Maes, P., T. Darrell, B. Blumberg, and A.
    Pentland, "The ALIVE system: wireless, full-body
    interaction with autonomous agents," Multimedia
    Systems, 5, no. 2, 1997, pp. 105-112.
  9. Badler, N., "Virtual Humans for Animation,
    Ergonomics, and Simulation," in IEEE Workshop on
    Non-Rigid and Articulated Motion, June 1997,
    Puerto Rico, 1997.
  10. Donath, J. S., "The illustrated conversation,"
    Multimedia Tools and Applications, 1, no. 1,
    1995, pp. 79-88.
  11. Viegas, F. B. and J. S. Donath, "Chat Circles,"
    in CHI 99, ACM, 1999, pp. 6-19.
  12. Laurel, B., Computers as Theatre. Reading,
    Massachusetts: Addison-Wesley, 1993.
  13. McCullough, M., Abstracting Craft: The Practiced
    Digital Hand. Cambridge, Massachusetts: The MIT
    Press, 1996.

44
Network communication
(Diagram: driver host, viewer host, world browser; animation commands
sent via socket connection)
45
Communicative power of gesture
  • Rich, continuous variation in time and space
  • Same gesture can have different expression
  • Can be combined with speech
  • Can be generated and varied spontaneously
  • Bodily outlet for personal expression

46
Advantages
  • Single action selects and modulates
  • Transparent (no hunting)
  • Expressive modulation of multiple parameters
  • Compact
  • Natural (writing is previously learned skill)

47
Extraction
(Diagram: pen input from the GUI goes to a character recognizer, which
selects a gesture type (w-, s-, r-) from the type library, and to a
feature extractor, which produces style parameters; both inputs
modulate the gesture)
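
Illustrative only: the type library can be thought of as a lookup from a
recognized letter to an avatar gesture type. Only the "s" entry (the
sweep-to-side gesture, from slide 6) comes from this talk; the other names
are invented.

    TYPE_LIBRARY = {
        "s": "sweep",   # writing "s" selects the sweep-to-side gesture
        "w": "wave",    # assumed example
        "r": "reach",   # assumed example
    }

    def select_gesture(letter):
        # Returns None when the recognized letter has no gesture type.
        return TYPE_LIBRARY.get(letter)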