Transcript and Presenter's Notes

Title: Probabilistic State Machines to describe emotions


1
Probabilistic State Machines to describe emotions
Human speaks to robot / robot speaks to human (robot responses will be in italics on the next slides).
States: Happy state, Unhappy state, Ironic state.
Transitions, labeled "human input / robot response" with a probability:
  • "you are beautiful" / "Thanks for a compliment", P = 1
  • "you are blonde!" / "I am not an idiot", P = 0.3 (to the Unhappy state)
  • "you are blonde!" / "Do you suggest I am an idiot?", P = 0.7 (to the Ironic state)
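A minimal Python sketch (my own illustration, not the authors' code) of such a probabilistic Mealy machine, with the transition table taken from the diagram above:

```python
import random

# (response, next_state, probability) for each (state, human input) pair,
# as read from the diagram above.
TRANSITIONS = {
    ("Happy", "you are beautiful"): [("Thanks for a compliment", "Happy", 1.0)],
    ("Happy", "you are blonde!"): [
        ("I am not an idiot", "Unhappy", 0.3),
        ("Do you suggest I am an idiot?", "Ironic", 0.7),
    ],
}

def step(state, utterance):
    """One move of the probabilistic state machine: pick a (response, next state)
    pair at random according to the transition probabilities."""
    choices = TRANSITIONS.get((state, utterance), [("...", state, 1.0)])  # fallback: stay put
    responses, states, probs = zip(*choices)
    i = random.choices(range(len(choices)), weights=probs)[0]
    return responses[i], states[i]

print(step("Happy", "you are blonde!"))   # e.g. ('Do you suggest I am an idiot?', 'Ironic')
```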
2
Facial Behaviors of Maria
Maria asks: "Do I look younger than twenty-three?"
Responses from a human (in word-spotting mode, human responses are short):
  • no
  • no
  • yes
Depending on the answer, the diagram branches with probabilities 0.7 and 0.3 to "Maria smiles" and "Maria frowns".
3
Probabilistic Grammars for performances
A performance is generated by a probabilistic grammar over the question nonterminals Who?, Where? and What?; each production pairs a spoken phrase with a gesture, and the arrows of the diagram carry probabilities such as P = 0.1, P = 0.3 and P = 0.5:
  • Who?: Speak "Professor Perky"; Speak "Professor Perky", blink eyes twice; Speak "Doctor Lee"
  • Where?: Speak "In the classroom", shake head; Speak "in some location", smile broadly
  • What?: Speak "Was singing and dancing"; Speak "Was drinking wine"
A complete behavior of a robot is described as a probabilistic (quantum entangled) machine.
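A sketch of sampling one performance from such a probabilistic grammar (my illustration; the weights are illustrative where the diagram's probabilities are ambiguous):

```python
import random

# Weighted productions: each nonterminal expands to a spoken phrase plus a gesture.
# The phrases are taken from the slide; the weights are illustrative.
GRAMMAR = {
    "Who?": [
        ('Speak "Professor Perky"', 0.5),
        ('Speak "Professor Perky", blink eyes twice', 0.1),
        ('Speak "Doctor Lee"', 0.5),
    ],
    "Where?": [
        ('Speak "In the classroom", shake head', 0.7),
        ('Speak "in some location", smile broadly', 0.3),
    ],
    "What?": [
        ('Speak "Was singing and dancing"', 0.5),
        ('Speak "Was drinking wine"', 0.5),
    ],
}

def perform():
    """Sample one complete behavior: expand Who?, then Where?, then What?."""
    parts = []
    for nonterminal in ("Who?", "Where?", "What?"):
        phrases, weights = zip(*GRAMMAR[nonterminal])
        parts.append(random.choices(phrases, weights=weights)[0])
    return "; ".join(parts)

print(perform())
```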
4
Human-controlled modes of dialog/interaction
A state machine of human-controlled interaction modes:
States: Robot asks, Human teaches, Human asks, Human commands, Robot performs.
Control phrases on the transitions: "Hello Maria" (initialization), "Thanks, I have a lesson", "Lesson finished", "Thanks, I have a question", "Question", "Questioning finished", "Thanks, I have a command", "Command finished", "Stop performance".
This slide shows only the human speech controls, not the robot responses that indicate its state.
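A minimal sketch of this human-controlled mode switching (my reading of the diagram, not the project's code), driven by word-spotted control phrases:

```python
# Dialog-mode state machine; the control phrases are the ones listed on the slide.
START = "initialization"
MODE_SWITCH = {
    ("initialization", "hello maria"): "Robot asks",
    ("Robot asks", "thanks, i have a lesson"): "Human teaches",
    ("Robot asks", "thanks, i have a question"): "Human asks",
    ("Robot asks", "question"): "Human asks",
    ("Robot asks", "thanks, i have a command"): "Human commands",
    ("Human teaches", "lesson finished"): "Robot asks",
    ("Human asks", "questioning finished"): "Robot asks",
    ("Human commands", "command finished"): "Robot asks",
    ("Robot performs", "stop performance"): "Robot asks",
}

def next_mode(mode, utterance):
    """Switch mode when a control phrase is spotted; otherwise stay in the current mode."""
    return MODE_SWITCH.get((mode, utterance.strip().lower()), mode)

mode = START
for said in ["Hello Maria", "Thanks, I have a lesson", "Lesson finished", "Question"]:
    mode = next_mode(mode, said)
    print(f"{said!r:35} -> {mode}")
```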
5
The next slides will illustrate the following
robot behaviors
  • Robot-Initiated Conversation.
  • Human-Initiated Conversation.
  • Human asking.
  • Human teaching.
  • Human verifying what the robot learned.

Other robot behaviors
  • Robot Performing (text, music, movements).
  • Game Playing.

6
Robot-Receptionist Initiated Conversation
Human
Robot
What can I do for you?
Robot asks
This represents the operation mode
7
Robot-Receptionist Initiated Conversation
Human
Robot
What can I do for you?
I would like to order a table for two
Robot asks
8
Robot-Receptionist Initiated Conversation
Human
Robot
Smoking or non-smoking?
Robot asks
9
Robot-Receptionist Initiated Conversation
Human
Robot
Smoking or non-smoking?
I do not understand
Robot asks
10
Robot-Receptionist Initiated Conversation
Human
Robot
Do you want a table in a smoking or non-smoking
section of the restaurant? Non-smoking section
is near the terrace.
Robot asks
11
Robot-Receptionist Initiated Conversation
Human
Robot
Do you want a table in a smoking or non-smoking
section of the restaurant? Non-smoking section
is near the terrace.
A table near the terrace, please
Robot asks
12
Human-Initiated Conversation
Human
Robot
Hello Maria
initialization
Robot asks
13
Human-Initiated Conversation
Human
Robot
Hello Maria
What can I do for you?
Robot asks
14
Human-Asking
Human
Robot
Question
Question
Robot asks
Human asks
15
Human-Asking
Human
Robot
Question
Yes, you ask a question.
Human asks
16
Human-Asking
Human
Robot
What book wrote Lee?
Yes, you ask a question.
Human asks
17
Human-Asking
Human
Robot
What book wrote Lee?
I have no sure information.
Human asks
18
Human-Asking
Human
Robot
Try to guess.
I have no sure information.
Human asks
19
Human-Asking
Human
Robot
Try to guess.
Lee wrote book Flowers.
Human asks
20
Human-Asking
Human
Robot
This is not true.
Lee wrote book Flowers.
Human asks
21
Human-Teaching
Human ends questioning
Human
Robot
Questioning finished
Human asks
Human teaches
Questioning finished
Robot asks
Thanks, I have a lesson
22
Human-Teaching
Robot enters asking mode
Human
Robot
Questioning finished
What can I do for you?
Human asks
Human teaches
Questioning finished
Robot asks
Thanks, I have a lesson
23
Human-Teaching
Human starts teaching
Human
Robot
Thanks, I have a lesson
What can I do for you?
Human asks
Human teaches
Questioning finished
Robot asks
Thanks, I have a lesson
24
Human-Teaching
Human
Robot
Thanks, I have a lesson
Yes
Human teaches
25
Human-Teaching
Human
Robot
I give you question-answer pattern
Yes
Human teaches
26
Human-Teaching
Human
Robot
Question pattern What book Smith wrote?
Yes
Human teaches
27
Human-Teaching
Human
Robot
Answer pattern Smith wrote book Automata
Theory
Yes
Human teaches
28
Human-Teaching
Human
Robot
Checking question What book wrote Smith?
Yes
Human teaches
29
Human-Teaching
Human
Robot
Checking question What book wrote Smith?
Smith wrote book Automata Theory
Human teaches
30
Human-Teaching
Human
Robot
I give you question-answer pattern
Yes
Human teaches
31
Human-Teaching
Human
Robot
Question pattern Where is room of Lee?
Yes
Human teaches
32
Human-Teaching
Human
Robot
Answer pattern Lee is in room 332
Yes
Human teaches
33
Human-Checking what robot learned
Human
Robot
Lesson finished
Human teaches
Lesson finished
Question
Robot asks
Human asks
34
Human-Checking what robot learned
Human
Robot
Lesson finished
What can I do for you?
Human teaches
Lesson finished
Question
Robot asks
Human asks
35
Human-Checking what robot learned
Human
Robot
Question
What can I do for you?
Human teaches
Lesson finished
Question
Robot asks
Human asks
36
Human-Asking
Human
Robot
Question
Yes, you ask a question.
Human teaches
Lesson finished
Question
Robot asks
Human asks
37
Human-Asking
Human
Robot
What book wrote Lee?
Yes, you ask a question.
Human asks
38
Human-Asking
Human
Robot
What book wrote Lee?
I have no sure information.
Human asks
39
Human-Asking
Human
Robot
Try to guess.
I have no sure information.
Human asks
40
Human-Asking
Human
Robot
Try to guess.
Lee wrote book Automata Theory
Observe that the robot found a similarity between Smith and Lee and generalized (incorrectly)
Human asks
41
How we linked Behavior, Dialog and Learning
  • The dialog/behavior has the following components:
  • (1) Eliza-like natural language dialogs based on pattern matching and limited parsing.
  • Commercial products like Memoni, Dog.Com, Heart, Alice, and Doctor all use this technology very successfully; for instance, the Alice program won the 2001 Loebner Prize (Turing test) competition.
  • This is the conversational part of the robot brain, based on pattern-matching, parsing and blackboard principles (a minimal pattern-matching sketch follows this list).
  • It is also a kind of operating system of the robot, which supervises the other subroutines.
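A minimal pattern-matching sketch in Python (my illustration; the patterns and the stored fact come from the dialog slides above, with a regex standing in for the system's own pattern language):

```python
import re

# Question patterns with a wildcard slot, plus the reply template they trigger.
PATTERNS = [
    (r"what book wrote (\w+)\??", "{0} wrote book {answer}"),
]
FACTS = {"smith": "Automata Theory"}   # learned from an "Answer pattern ..." lesson

def reply(utterance):
    """Eliza-like reply: match the utterance against stored patterns and fill in the template."""
    text = utterance.strip().lower()
    for pattern, template in PATTERNS:
        m = re.fullmatch(pattern, text)
        if m:
            name = m.group(1)
            answer = FACTS.get(name)
            if answer:
                return template.format(name.capitalize(), answer=answer)
            return "I have no sure information."
    return "What can I do for you?"

print(reply("What book wrote Smith?"))  # Smith wrote book Automata Theory
print(reply("What book wrote Lee?"))    # I have no sure information.
```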

42
Behavior, Dialog and Learning
  • (2) Subroutines with a logical database and natural language parsing (CHAT).
  • This is the logical part of the brain, used to find connections between places, timings and all kinds of logical and relational reasoning, such as answering questions about Japanese geography.
  • (3) Use of generalization and analogy in dialog on many levels.
  • Random and intentional linking of spoken language, sound effects and facial gestures.
  • Use of the Constructive Induction approach to help generalization, analogy reasoning and probabilistic generation in verbal and non-verbal dialog, such as learning when to smile or when to turn the head away from the partner.

43
Behavior, Dialog and Learning
  • (4) A model of the robot, a model of the user, the scenario of the situation and the history of the dialog are all used in the conversation.
  • (5) Use of word spotting in speech recognition rather than single-word or continuous speech recognition (see the sketch after this list).
  • (6) Avoidance of "I do not know" and "I do not understand" answers from the robot.
  • Our robot will always have something to say; in the worst case the answer is over-generalized, based on invalid analogies, or even nonsensical and random.
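A word-spotting sketch (my illustration, with an assumed keyword set): only a small set of keywords is searched for in the recognized utterance, and everything else is ignored:

```python
# Keywords the dialog manager reacts to; all other recognized words are discarded.
KEYWORDS = {"yes", "no", "question", "lesson", "finished", "guess"}

def spot_words(recognized_text):
    """Return only the spotted keywords; the rest of the (noisy) recognition is ignored."""
    return [w for w in recognized_text.lower().replace("?", "").split() if w in KEYWORDS]

print(spot_words("uh well yes I think so"))           # ['yes']
print(spot_words("the questioning is finished now"))  # ['finished']
```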

44
(No Transcript)
45
(No Transcript)
46
Generalization of the Ashenhurst-Curtis
decomposition model
47
Tables of this kind are known from Rough Sets, Decision Trees and other Data Mining methods
48
Decomposition is hierarchical
At every step many decompositions exist
49
Constructive Induction Technical Details
  • U. Wong and M. Perkowski, "A New Approach to Robots Imitation of Behaviors by Decomposition of Multiple-Valued Relations," Proc. 5th Intern. Workshop on Boolean Problems, Freiberg, Germany, Sept. 19-20, 2002, pp. 265-270.
  • A. Mishchenko, B. Steinbach and M. Perkowski, "An Algorithm for Bi-Decomposition of Logic Functions," Proc. DAC 2001, June 18-22, Las Vegas, pp. 103-108.
  • A. Mishchenko, B. Steinbach and M. Perkowski, "Bi-Decomposition of Multi-Valued Relations," Proc. 10th IWLS, pp. 35-40, Granlibakken, CA, June 12-15, 2001. IEEE Computer Society and ACM SIGDA.

50
Constructive Induction
  • Decision Trees, Ashenhurst/Curtis hierarchical decomposition and Bi-Decomposition algorithms are used in our software.
  • These methods form our subset of the MVSIS system developed under Prof. Robert Brayton at the University of California at Berkeley [2].
  • The entire MVSIS system can also be used.
  • The system generates robot behaviors (C program code) from examples given by the users.
  • This method is used for embedded system design, but we use it specifically for robot interaction.

51
Additional Slides with Background
52
Ashenhurst Functional Decomposition
Evaluates the data function and attempts to decompose it into simpler functions:
F(X) = H(G(B), A), where X = A ∪ B
B is the bound set, A is the free set.
If A ∩ B = ∅, it is a disjoint decomposition; if A ∩ B ≠ ∅, it is a non-disjoint decomposition.
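A minimal sketch of the core computation behind this decomposition (my illustration; binary, completely specified functions only): count the distinct columns of the map for a chosen bound set. If the column multiplicity mu is small, the block G in F(X) = H(G(B), A) needs only about log2(mu) output wires.

```python
from itertools import product

def column_multiplicity(f, free_vars, bound_vars):
    """Count distinct columns of the map of f when the bound-set variables
    index the columns and the free-set variables index the rows."""
    columns = set()
    for b in product((0, 1), repeat=len(bound_vars)):        # one column per bound-set assignment
        assignment = dict(zip(bound_vars, b))
        column = []
        for a in product((0, 1), repeat=len(free_vars)):     # one row per free-set assignment
            assignment.update(zip(free_vars, a))
            column.append(f(assignment))
        columns.add(tuple(column))
    return len(columns)

# Example: f = (a AND b) XOR c, bound set {a, b}, free set {c}
f = lambda v: (v['a'] & v['b']) ^ v['c']
print(column_multiplicity(f, free_vars=['c'], bound_vars=['a', 'b']))   # prints 2
```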
53
A Standard Map of function z
Explains the concept of generalized don't cares.
Karnaugh map of function z over the variables a, b, c (map header "a b \ c"), with the variables split into a bound set and a free set.
Columns 0 and 1, and columns 0 and 2, are compatible: column compatibility index = 2.
54
NEW Decomposition of Multi-Valued Relations
F(X) = H(G(B), A), where X = A ∪ B
The input X is split into the sets A and B; the blocks G and H are now relations rather than functions.
If A ∩ B = ∅, it is a disjoint decomposition; if A ∩ B ≠ ∅, it is a non-disjoint decomposition.
55
Forming a CCG from a K-Map
Columns 0 and 1, and columns 0 and 2, are compatible: column compatibility index = 2.
(Column Compatibility Graph of the columns of the map of z.)
56
Forming a CIG from a K-Map
Columns 1 and 2 are incompatible: chromatic number = 2.
(Column Incompatibility Graph.)
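A sketch of forming and coloring the incompatibility graph (my illustration, using a hypothetical 3-column map with don't cares chosen to reproduce the compatibilities stated on these slides):

```python
def compatible(col1, col2):
    """Columns are compatible if they agree wherever both are specified ('-' marks a don't care)."""
    return all(x == y or x == '-' or y == '-' for x, y in zip(col1, col2))

def incompatibility_graph(columns):
    """Edges of the Column Incompatibility Graph (CIG)."""
    n = len(columns)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if not compatible(columns[i], columns[j])]

def greedy_coloring(n, edges):
    """Greedy coloring; the number of colors used bounds the chromatic number from above."""
    adj = {i: set() for i in range(n)}
    for i, j in edges:
        adj[i].add(j); adj[j].add(i)
    color = {}
    for v in range(n):
        used = {color[u] for u in adj[v] if u in color}
        color[v] = next(c for c in range(n) if c not in used)
    return color

# Columns 0 and 1 and columns 0 and 2 compatible, columns 1 and 2 incompatible.
cols = ['0-', '01', '00']
edges = incompatibility_graph(cols)                 # [(1, 2)]
print(edges, greedy_coloring(len(cols), edges))     # two colors suffice
```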
57
Constructive Induction
  • A unified internal language is used to describe behaviors; in it, text generation and facial gestures are combined.
  • This language is used for learned behaviors.
  • Expressions (programs) in this language are either created by humans or induced automatically from examples given by trainers.

58
The integrated approach to robot vision and speech-based dialogs
59
OpenCV image processing software from Intel
60
Hidden Markov Model Based Face Recognition
61
Hidden Markov Model Based Face Recognition
62
Braitenberg Vehicles
  • Two sensors, two motors.
  • Many behaviors arise from simple rules (see the sketch below).
  • Control can be a combinational mapping or an automaton.
  • The combinational logic can be binary, multiple-valued, fuzzy or quantum.
  • The automaton can be binary, multiple-valued, fuzzy, probabilistic, non-deterministic or quantum (entangled).
  • This is a new concept in Machine Learning and Robotics.
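A minimal Braitenberg-vehicle sketch (my illustration, not code from the project), with the control as a simple combinational mapping from two sensors to two motors:

```python
def vehicle_step(left_sensor, right_sensor, crossed=True):
    """Crossed excitatory wiring ("aggression"): the vehicle turns toward the stimulus.
    Uncrossed wiring ("fear"): it turns away. Sensor readings and motor speeds are in [0, 1]."""
    if crossed:
        left_motor, right_motor = right_sensor, left_sensor
    else:
        left_motor, right_motor = left_sensor, right_sensor
    return left_motor, right_motor

# Light on the vehicle's left: the right motor runs faster, so the crossed
# ("aggression") vehicle turns toward the light.
print(vehicle_step(0.9, 0.1, crossed=True))   # (0.1, 0.9)
```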

63
AND QUANTUM BRAITENBERG FACES
64
Hadamard gate
In standard computers, probabilistic components are expensive and suffer from aliasing. In quantum circuits these are the cheapest gates, and they are ideal random number generators.
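For example, applying a Hadamard gate to |0> and measuring gives an ideal fair coin (a small numpy check, my illustration):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
state = H @ np.array([1, 0])                   # apply to |0>
probs = np.abs(state) ** 2                     # Born rule: measurement probabilities
print(probs)                                   # [0.5 0.5] -> an ideal fair coin
```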
65
Square Root of NOT
Two square-root-of-NOT gates composed in series form an inverter.
Deterministic behaviors can be composed from probability waves; this does not happen outside the quantum world.
Controlled square-root-of-NOT gate and its matrix.
Controlled square-root-of-NOT Hermitian (adjoint) gate and its matrix.
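A small numpy check (my illustration) that two square-root-of-NOT gates compose to a deterministic inverter, even though a single one behaves like a fair coin:

```python
import numpy as np

NOT = np.array([[0, 1], [1, 0]])
V = 0.5 * np.array([[1 + 1j, 1 - 1j],
                    [1 - 1j, 1 + 1j]])          # square root of NOT
print(np.allclose(V @ V, NOT))                  # True: two "random" gates make a deterministic inverter
probs = np.abs(V @ np.array([1, 0])) ** 2
print(probs)                                    # [0.5 0.5]: a single V acts like a fair coin
```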
66
Analysis of a Quantum Circuit. Matrices in
Hilbert Space
67
Kronecker Product for Quantum Circuit Analysis
Parallel connection of gates requires the Kronecker product. Serial connection of gates (previous slides) requires standard matrix multiplication.
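For example (my illustration), analyzing the standard two-qubit circuit of a Hadamard on the first qubit followed by a CNOT: the parallel stage is a Kronecker product, the serial composition is a matrix product, and the result is an entangled Bell state:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],     # flips the second qubit when the first qubit is 1
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

stage1 = np.kron(H, I)             # parallel gates: Kronecker product
circuit = CNOT @ stage1            # serial connection: matrix multiplication
state = circuit @ np.array([1, 0, 0, 0])   # start from |00>
print(state)                       # [0.707 0 0 0.707]: the entangled Bell state (|00> + |11>)/sqrt(2)
```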
68
Analysis of a Quantum Circuit
69
Analysis of a Quantum Circuit
70
Conclusion. What did we learn
  • (1) The more degrees of freedom, the better the animation realism.
  • (2) Synchronization of spoken text and head (especially jaw) movements is important but difficult.
  • (3) Gestures and speech intonation of the head should be slightly exaggerated.

71
Conclusion. What did we learn(cont)
  • (4) The sound should be loud, to cover the noise coming from motors and gears and for a better theatrical effect.
  • (5) The noise of servos can also be reduced by appropriate animation and synchronization.
  • (6) The best available ATR and TTS packages should be applied, especially those that use word spotting.
  • (7) Use puppet theatre experiences.

72
Conclusion. What did we learn(cont)
  • (8) Because learning is too slow, improved parameterized learning methods, also based on constructive induction, will be developed.
  • (9) Open question: funny versus beautiful.
  • (10) Either high-quality voice recognition from a headset or low-quality recognition in a noisy room; YOU CANNOT HAVE BOTH WITH CURRENT ATR TOOLS.
  • The bi-decomposer of relations and other useful software used in this project can be downloaded from http://www-cad.eecs.berkeley.edu/mvsis/.

73
Conclusion
Monday, May 10, Intelligent Robotics Laboratory and Industrial Robotics Laboratory. Demo 10am-2pm.
Thursday, May 13, Convention Center. Demo 10am-2pm.
Sunday, June 6, PSU Ballroom, Smith Center. Competition and Demo 10am-2pm.
Help needed. If you can program, interface PCs, know about networks, want to help with the WWW page, build robots, learn advanced theories, ... you are welcome in the lab.