Caring Computers - PowerPoint PPT Presentation

1
Caring Computers: Emotional Engines
  • To what extent can technology have emotions,
    express emotions and elicit emotions?
  • Stephen Appleton & Becky Wright

2
Aims
  • Is it possible to create emotions in artificial
    systems?
  • What might it mean for a computer to possess
    emotions?
  • What are the potential problems and limitations?
  • Can computers express emotions?
  • Can people perceive technology as capable of
    feelings?

3
Media Perception
  • Suggests emotion is possible in artificial
    lifeforms
  • Can have both negative and positive results
  • e.g. HAL 9000 from 2001: A Space Odyssey
  • Child android from AI
  • Data from Star Trek
  • Computers = rational, logical
  • Emotions = illogical
  • Emotions are what separate humans from
    machines!

4
Definitions
Biological
  • Neurological underpinning
  • Neurotransmitters, e.g. dopamine, endorphins
  • Hormones, e.g. cortisol, adrenaline
  • Also physiological states, e.g. fear response

Cognitive (AI focus; Boden, 1996)
  • Functionalist viewpoint: the mental operations of
    emotions

Conscious
  • Awareness of emotions
  • Feelings, qualia
  • Breadth & variety of affective experiences
    (Rolls, 2005)

5
Function of emotions: Evolutionary context
(Sloman, 1990; Cañamero, 2002)
[Diagram: an autonomous organism, with physical
limits and cognitive limits (attention, memory,
info-processing), pursues multiple goals in a
complex, unpredictable environment and must pick
out the relevant data.]
6
Function of emotions continued
  • "An emotion is usually caused by a person
    consciously or unconsciously evaluating an event
    as relevant to a concern (a goal) that is
    important; the emotion is felt as positive when a
    concern is advanced and negative when a concern
    is impeded."
  • (Oatley & Jenkins, 1996, p.96)
  • "What emotions are about is action (or motivation
    for action) and action control"
  • (Frijda, 1995, p.506)
  • Emotions as cognitive appraisers
  • Alert us to goal-relevant events/stimuli
  • Do this in parallel for all concerns
    (Frijda, 1995)
  • Emotions as somatic markers
  • "Gut feelings"
  • Facilitate the decision-making process
    (Damasio, 1994)
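The appraisal account quoted above lends itself to a direct sketch. The Python fragment below is an illustrative toy (not any published model; the concern names and weights are assumptions): an event is evaluated against all concerns in parallel, an impact that advances a concern is appraised as positive, one that impedes it as negative, scaled by the concern's importance.

```python
def appraise(event_effects, concerns):
    """event_effects: concern -> signed impact; concerns: concern -> importance.

    Returns, for each concern the event touches, a (valence, weight)
    pair, so the most important advanced/impeded concern can drive action.
    """
    appraisals = {}
    for concern, importance in concerns.items():  # in parallel over all concerns
        impact = event_effects.get(concern, 0.0)
        if impact == 0.0:
            continue                              # event irrelevant to this concern
        valence = "positive" if impact > 0 else "negative"
        appraisals[concern] = (valence, abs(impact) * importance)
    return appraisals

concerns = {"safety": 0.9, "social": 0.5}   # assumed concern set
event = {"safety": -0.8}                    # e.g. a looming object impedes safety
print(appraise(event, concerns))            # safety appraised as negative
```

The irrelevant "social" concern yields no emotion at all, matching the quote's condition that the event must be relevant to a concern.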

7
Goals & Beliefs
  • Humans
  • 4 Fs
  • (Feed, Flight, Fight, Reproduction)
  • Natural Selection
  • No purpose, intent behind design
  • Blind Watchmaker (Dawkins, 1986)
  • Machines
  • Task-oriented goals
  • Human programmer
  • Specific purpose, intent
  • (Rolls, 2005)
  • Chinese Room problem

"If a computer has no goals or concerns of its
own, can it really be said to possess emotions
about such things?" (Sloman, 1990)
8
Computers & qualia
  • Must a computer be aware of its own emotive
    intentions & feelings?
  • Some primary emotions can occur prior to
    conscious awareness (Picard, 1997)
  • Animal & infant emotions: lack of
    awareness/intent (Sloman & Croucher, 1981)
  • Clinical examples (e.g. repressors)
    (Derakshan & Eysenck, 1999)
  • "Consciousness is a prerequisite only to the
    emotional experience and not to the generation of
    the emotion."
  • (Picard, 1997, p.74)
  • "We, all of us, overanthropomorphize humans, who
    are after all mere machines"
  • (Brooks, cited in Blackmore, 2003, p.215)
  • General Problem Solvers (Newell & Simon,
    1958, cited in Weizenbaum, 1976)

9
Physiological components of emotions
  • Does a lack of biological simulation prevent the
    development of artificial emotions?
  • COG: synthetic hormones (Brooks, 2005)
  • James-Lange theory of emotions (James, 1884;
    Lange, 1885)

But:
  • Paraplegics
  • "Cold" emotions (Sloman, 1990)

10
ELIZA the Psychotherapist
  • Girl: My boyfriend made me come here.
  • ELIZA: Your boyfriend made you come here?
  • Girl: He says I'm depressed much of the time.
  • ELIZA: I am sorry to hear you are depressed.
  • Girl: It's true. I am unhappy.
  • ELIZA: Do you think coming here will help you not
    to be unhappy?
  • Girl: I need some help, that much seems
    certain.
  • ELIZA: What would it mean to you if you got some
    help?
  • Girl: Perhaps I could learn to get along
    with my mother.
  • ELIZA: Tell me more about your family.
  • (Weizenbaum, 1976, p.3-4)
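Exchanges like the one above can be approximated with a handful of keyword-and-transformation rules. The Python sketch below is an illustrative toy in the spirit of ELIZA, not Weizenbaum's actual DOCTOR script; the rules and the word list in reflect() are invented for the example.

```python
import re

# Invented illustrative rules: (pattern, response template).
RULES = [
    (r"my (.+) made me (.+)", "Your {0} made you {1}?"),
    (r"i am (.+)", "I am sorry to hear you are {0}."),
    (r"i need (.+)", "What would it mean to you if you got {0}?"),
    (r"my (.+)", "Tell me more about your {0}."),
]

def reflect(fragment):
    """Swap first/second person so echoed text reads naturally."""
    swaps = {"i": "you", "me": "you", "my": "your", "am": "are"}
    return " ".join(swaps.get(word, word) for word in fragment.split())

def respond(utterance):
    text = utterance.lower().strip(" .!?")
    for pattern, template in RULES:        # first matching keyword rule wins
        match = re.match(pattern, text)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Please go on."                 # content-free default, ELIZA-style

print(respond("My boyfriend made me come here."))
```

The point Weizenbaum himself stressed is visible in the code: the program echoes surface patterns with no model of what a boyfriend or depression is, yet the output reads as attentive.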

11
  • Emotional impact of ELIZA
  • "I can imagine the development of a network of
    computer psychotherapeutic terminals, something
    like arrays of large telephone booths, in which,
    for a few dollars a session, we would be able to
    talk with an attentive, tested, and largely
    nondirective psychotherapist."
  • (Dr. C. Sagan, 1975, cited in Weizenbaum, 1976, p.5)

12
The Media Equation
  • Human tendency to treat computers as if they were
    real-life, social beings (Reeves & Nass, 2002)
  • Natural, often subconscious
  • Evolutionary backdrop
  • Emergent emotions (Picard, 1997)
  • Human examples, e.g. actors, service staff
  • "Even no communication is a type of
    communication!" (Bartneck et al., 2004)
  • Blank face analogy
  • Ontrack Data Recovery survey of user responses:
  • Sweet-talking the computer
  • Physical abuse of the computer
  • (Sullivan, 2005)

13
Natural emotions
  • Baldi, a computer-generated head: facial
    expressions as easy to label as their human
    equivalents (Massaro et al., 2000)
  • Synthetic speech: recognition of emotional
    content
  • (Cahn, 1990, cited in Brave et al., 2005)

However, the natural look can be taken too far
(Hara, 2000, cited in Menzel & D'Aluisio, 2000)
  • The Uncanny Valley (Mori, 1970s)
  • To be convincing, an agent's embodiment needs to
    match its skills (Bartneck, 2001)
  • Disturbing if it fails to meet expectations
    (Picard & Klein, 2002)

14
Kismet
  • Anthropomorphic robotic head, specialised for
    face-to-face interaction
  • Infant-like emotional behaviour
  • Endowed with a basic emotive system
  • Goals: interact with humans; play with
    toys; rest
  • Emotional expressions
  • Speech: babble, with emotive intent displayed
    through prosody
  • Empathy skills
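Kismet's three goals (interact, play, rest) suggest a simple drive-based control loop. The sketch below is a loose illustration of that idea only; the drive names, growth rate, and behaviour table are assumptions for the example, not Breazeal's actual architecture. Each unmet drive drifts away from its satiated set point over time, and the most pressing drive selects the behaviour and the accompanying expression.

```python
# Illustrative toy of a drive-based emotive system (assumed names/numbers).
BEHAVIOURS = {
    "social": ("engage a human face-to-face", "lonely -> sorrowful expression"),
    "stimulation": ("play with a toy", "bored -> interested expression"),
    "fatigue": ("rest", "tired -> calm expression"),
}

def step(drives, growth=0.1, satisfied=None):
    """One control tick: unmet drives intensify; a satisfied drive resets."""
    for name in drives:
        drives[name] = 0.0 if name == satisfied else drives[name] + growth
    return drives

def select(drives):
    """Behaviour selection: the drive furthest from its set point wins."""
    winner = max(drives, key=drives.get)
    return winner, BEHAVIOURS[winner]

drives = {"social": 0.0, "stimulation": 0.0, "fatigue": 0.0}  # 0.0 = satiated
for _ in range(3):
    step(drives)                      # three ticks with nothing happening
step(drives, satisfied="social")      # then a human engages the robot
print(select(drives))                 # an unmet drive now wins selection
```

Even this toy shows the evolutionary framing from the earlier slides: limited resources plus multiple competing goals force a selection mechanism, and the outward expression advertises the winning internal state.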

15
Conclusion
  • Q. Can computers be programmed with an emotive
    function?
  • A. Depends on your definition of emotions
  • Emotions are a program within a human "robot"
    (e.g. Brooks; Newell & Simon); therefore it is just
    a matter of adding the program to artificial robots.
  • Emotions occur at more than just an operational
    level: they are a conscious experience, not a
    simulated behaviour. Computers can never have
    emotions as they can never be conscious.

Are computer emotions real, or "as if"
simulations?...
16
References
  • Arbib, M.A. & Fellous, J. (2004) Emotions: From
    Brain to Robot. Trends in Cognitive Sciences, 8
    (12), p.554-561
  • Bartneck, C. (2001) How Convincing is Mr. Data's
    Smile: Affective Expressions of Machines. User
    Modeling and User-Adapted Interaction, 11,
    p.279-295
  • Bartneck, C., Reichenbach, J. & Breeman, A.V.
    (2004) In Your Face, Robot! The Influence of a
    Character's Embodiment on How Users Perceive Its
    Emotional Expressions, web page:
    http://www.bartneck.de/work/bartneckDE2004.pdf
  • Blackmore, S. Consciousness: An Introduction
    (2003), Hodder & Stoughton, London
  • Boden, M.A. (ed) The Philosophy of Artificial
    Life (1996), Oxford University Press, Oxford
  • Brave, S., Nass, C. & Hutchinson, K. (2005)
    Computers That Care: Investigating The Effects
    of Orientation of Emotion Exhibited By An
    Embodied Computer Agent. Int. J. Human-Computer
    Studies, 62, p.161-178
  • Breazeal, C. (2003) Emotion and Sociable
    Humanoid Robots. Int. J. Human-Computer Studies,
    59, p.119-155
  • Brooks, R. (2005) COG, web page:
    http://groups.csail.mit.edu/lbr/humanoid-robotics-
    group/cog/
  • Cañamero, L.D. (2002) Designing Emotions for
    Activity Selection in Autonomous Agents, from
    Emotions In Humans and Artifacts, Trappl, R.,
    Petta, P. & Payr, S. (eds), MIT Press,
    Cambridge, US, p.115-148
  • Damasio, A.R. (2004) William James and The
    Modern Neurobiology of Emotion, from Emotion,
    Evolution, and Rationality, Evans, D. & Pierre,
    C. (eds), Oxford University Press, Oxford, p.3-14
  • Damasio, A.R. Descartes' Error: Emotion, Reason
    and The Human Brain (1994), Putnam, New York
  • Dawkins, R. The Blind Watchmaker (2000),
    Penguin, London
  • Derakshan, N. & Eysenck, M.W. (1999) Are
    repressors self-deceivers or other-deceivers?
    Cognition & Emotion, 13, p.1-17
  • Frijda, N.H. (1995) Emotions in Robots, from
    Comparative Approaches to Cognitive Science,
    Roitblat, H.L. & Meyer, J. (eds), MIT Press,
    Cambridge, US

17
References
  • Martínez-Miranda, J. & Aldea, A. (2005) Emotions
    in Human and Artificial Intelligence. Computers
    in Human Behavior, 21, p.323-341
  • Menzel, P. & D'Aluisio, F. Robo Sapiens:
    Evolution of a New Species (2000), MIT Press,
    Cambridge, US
  • Mori, M. (1970s), web page: http://www.everything2
    .com/index.pl?node_id=1687559
  • Oatley, K. & Jenkins, J.M. Understanding
    Emotions (1996), Blackwell Press, Oxford
  • Picard, R.W. Affective Computing (1997), MIT
    Press, Cambridge, US
  • Picard, R.W. & Klein, J. (2002) Computers That
    Recognise and Respond to User Emotion:
    Theoretical and Practical Implications.
    Interacting With Computers, 14, p.141-169
  • Pinker, S. How The Mind Works (1997), Penguin,
    London
  • Reeves, B. & Nass, C. The Media Equation (2002),
    CSLI Publications, Stanford, US
  • Rolls, E.T. Emotion Explained (2005), Oxford
    University Press, UK
  • Sloman, A. (1990) Motives, Mechanisms and
    Emotions, from The Philosophy of Artificial
    Intelligence, Boden, M.A. (ed), Oxford University
    Press, Oxford, p.231-247
  • Sloman, A. & Croucher, M. (1981) You Don't Need
    A Soft Skin To Have A Warm Heart: Towards A
    Computational Analysis of Motives and Emotions,
    web page: http://www.cs.bham.ac.uk/research/cogaff
    /sloman-croucher-warm-heart.pdf
  • Stern, A. (2002) Creating Emotional
    Relationships with Virtual Characters, from
    Emotions In Humans and Artifacts, Trappl, R.,
    Petta, P. & Payr, S. (eds), MIT Press,
    Cambridge, US, p.333-362
  • Sullivan, B. (2005) Drop The Mouse and Step Away
    From The PC, web page: http://www.msnbc.msn.com/i
    d/7329279
  • Weizenbaum, J. Computer Power and Human Reason:
    From Judgment to Calculation (1976), W.H.
    Freeman and Company, New York

18
Discussion Questions
  • Is Kismet's basic emotional system an example of
    "real" emotions or "as if" emotions? Could Kismet
    be said to have a partial subset of emotions, or
    is it just a simulation model?
  • If a machine can ever be said to have its own
    emotions, it arguably needs to be able to develop
    its own goals & beliefs. Would we be able to
    recognise & empathise with a computer's own
    drives & emotions, or would we end up displaying
    some form of artificial autism?
  • Although the media equation is powerful, it is not
    infallible: people can override their natural
    empathic tendencies, as demonstrated by Bartneck
    in his robot replication of Milgram's
    electric-shock experiment. If machines were ever
    capable of emotions, should this change how we
    treat them?