Transcript and Presenter's Notes

Title: Automated Detection of Deception and Intent


1
Automated Detection of Deception and Intent
  • Judee Burgoon, Ed.D.
  • Center for the Management of Information
  • University of Arizona

19MAR04
2
Collaborative Partners
  • DETECTING DECEPTION IN THE MILITARY INFOSPHERE
    Center for the Management of Information, University of Arizona
    Funded by Department of Defense
  • AUTOMATED INTENT DETECTION
    Center for Computational Bioengineering, Imaging and Modeling, Rutgers University
    Funded by Department of Homeland Security
3
Deception and Intent Defined
  • Deception is a message knowingly transmitted with
    the intent to foster false beliefs or
    conclusions.
  • Hostile intent refers to plans to conduct
    criminal or terrorist activity.
  • Intent is inferred from:
  • suspicious behavior
  • overt hostility
  • deception

4
Relationship of Deception to Intent
5
Many Ways To Deceive
  • Lies
  • Fabrications
  • Concealments
  • Omissions
  • Misdirection
  • Bluffs
  • Fakery
  • Mimicry
  • Tall tales
  • White lies
  • Deflections
  • Evasions
  • Equivocation
  • Exaggerations
  • Camouflage
  • Strategic ambiguity
  • Hoaxes
  • Charades
  • Imposters

6
America under Attack!
7
Statement of the Problem
  • Humans have very poor ability to detect deceit
    and hostile intent.
  • True of experts as well as untrained individuals
  • Accuracy rates of 40-60 percent, about the same as
    flipping a coin
  • Reliance on new communication technologies (text,
    audio, video) may make us more vulnerable to
    deceit.

8
In the Headlines
9
Questions
  • Are there reliable indicators of
  • deceit?
  • intent to engage in hostile actions?
  • Can detection be automated to augment human
    abilities?
  • Does mode of communication make a difference?

10
The Objective: Reducing False Alarms and Misses
11
Sample Deception Indicators
  • Arousal: higher pitch, faster tempo
  • Emotion: absence of emotional language, false smiles
  • Cognitive effort: delays in responding, nonfluent speech
  • Memory: fewer details, briefer messages
  • Strategic communication: controlled movement, increasing involvement

12
Our Experiments
16 experiments, 2,136 subjects, in 2.5 years
13
Typical Experiment: Mock Theft
  • Task
  • half of participants steal wallet from
    classroom, other half are innocents
  • all are interviewed by trained and/or untrained
    interviewers
  • Mode of interaction
  • face-to-face, text, audio, video
  • Outcomes
  • accuracy in detecting truth and deception
  • judged credibility
  • coding of verbal and nonverbal behavior

14
Sample Results
  • Deceivers create longer messages under text than
    face-to-face (FtF).

15
Implications
  • Text-based deception allows for planning,
    rehearsal, editing.
  • Deceivers can use text messages to their
    advantage.

16
Questions
  • Are there reliable text-based indicators of
    deceit or hostile intent?
  • Can these be automated to overcome deceivers'
    advantages?

17
Sample Text-based Cues We Analyzed
18
Sample Results from Automated Analysis
  • Deceivers use different language than truth
    tellers.
  • Deceivers: more
  • quantity
  • uncertainty
  • references to others
  • informality
  • Truth tellers: more
  • diversity
  • complexity
  • positive affect
  • references to self

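A minimal sketch of how a few of these text-based cues might be computed from a message transcript, assuming simplified word-list definitions; the cue formulas below are illustrative and are not the actual feature definitions behind the presenters' analysis.

```python
# Illustrative sketch only: simplified versions of a few text-based cues
# (quantity, diversity, uncertainty, self/other references). Not the
# presenters' actual feature definitions.
import re

FIRST_PERSON = {"i", "me", "my", "mine", "myself"}
OTHER_REFS = {"he", "she", "him", "her", "his", "they", "them", "their"}
HEDGES = {"maybe", "perhaps", "possibly", "might", "guess", "probably"}

def text_cues(message: str) -> dict:
    words = re.findall(r"[a-z']+", message.lower())
    n = len(words) or 1
    return {
        "quantity": len(words),                      # message length in words
        "diversity": len(set(words)) / n,            # type-token ratio
        "uncertainty": sum(w in HEDGES for w in words) / n,
        "self_refs": sum(w in FIRST_PERSON for w in words) / n,
        "other_refs": sum(w in OTHER_REFS for w in words) / n,
    }

print(text_cues("I guess maybe they took it; I really don't know."))
```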
19
Automating Analysis: Agent99 Parser
  • Find cues in text
  • Submit to data mining tool

20
Decision Tree Analysis
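As an illustration of the "find cues, then submit to a data mining tool" pipeline, the sketch below fits a decision tree to cue feature vectors with scikit-learn; the feature values and truthful/deceptive labels are invented for the example, and the resulting tree is not the one reported in these slides.

```python
# Hypothetical example: fitting a decision tree to cue feature vectors.
# The data and labels are invented; this is not the reported model.
from sklearn.tree import DecisionTreeClassifier, export_text

FEATURES = ["quantity", "diversity", "uncertainty", "self_refs", "other_refs"]

# One row of cue values per message; label 1 = deceptive, 0 = truthful.
X = [
    [180, 0.45, 0.04, 0.01, 0.06],
    [ 60, 0.70, 0.01, 0.08, 0.02],
    [210, 0.40, 0.05, 0.02, 0.07],
    [ 75, 0.65, 0.00, 0.09, 0.01],
]
y = [1, 0, 1, 0]

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=FEATURES))
```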
21
Accuracy in Detecting Deceit
Note: Preliminary findings from Mock Theft, from
transcribed face-to-face sessions.
22
Implications
  • Linguistic and content features together can
    reliably identify deceptive or suspicious
    messages.
  • Text analysis can be successfully automated.

23
Questions
  • Can hostile intent be mapped to behavior?
  • Are there reliable video-based indicators of
    deceit and intent?
  • Are the indicators open to automation?

24
The Mapping Problem
25
Approach to Analysis
  • Four data sets
  • Pre-polygraph interviews from actual
    investigations
  • Mock theft experiment
  • Two states: innocent (truthful), deceptive
    (guilty)
  • Actors in airport/screening location scenarios
  • Three states: relaxed, agitated (nervous),
    overcontrolled
  • Actors showing normal behavior to train neural
    networks

26
Intent Recognition from Video
  • Track and estimate human movement including
  • Head
  • Facial/Head Features
  • Hands
  • Body
  • Legs
  • Tracking techniques
  • Physics-based tracking of face and hands
  • Statistical model-based motion estimation

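The presenters' system uses physics-based tracking and statistical model-based motion estimation; as a much simpler stand-in, the sketch below derives a per-frame speed feature from tracked (x, y) positions by finite differences, the kind of position/velocity feature shown in the pattern plots later in the talk.

```python
# Simplified stand-in only: per-frame speed from tracked (x, y) positions
# via finite differences. The presenters' system uses physics-based and
# statistical model-based motion estimation instead.
import numpy as np

def speed_from_track(positions, fps=30.0):
    """positions: (frames, 2) array of per-frame (x, y) coordinates.
    Returns speed (pixels/second) between consecutive frames."""
    positions = np.asarray(positions, dtype=float)
    displacement = np.diff(positions, axis=0) * fps   # pixels per second
    return np.linalg.norm(displacement, axis=1)

# Example: a head track that barely moves (an overcontrolled-looking pattern).
print(speed_from_track([(100, 50), (100, 50), (101, 50), (101, 51)]))
```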
27
Skin Color Tracker: Face and Hands
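A rough sketch of one common way to build a skin-color tracker with OpenCV, segmenting skin-toned regions in HSV space and returning candidate face/hand boxes; the threshold values are generic rules of thumb and this is not the presenters' tracker.

```python
# Rough sketch, not the presenters' tracker: HSV skin-color segmentation
# with OpenCV to find candidate face/hand regions in a video frame.
import cv2
import numpy as np

def skin_regions(frame_bgr, min_area=500):
    """Return bounding boxes (x, y, w, h) of skin-colored blobs."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([0, 40, 60]), np.array([25, 180, 255]))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]
```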
28
Sample Results from Human Coders
  • Thieves use fewer head movements and gestures,
    more self-touching than innocents.

29
Sample Patterns: Actors
[Figure: head, left-hand, and right-hand position and velocity traces for the controlled, relaxed, and nervous states]
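Slide 25 mentions training neural networks on actor footage; the sketch below is a hypothetical, self-contained illustration of classifying head/hand position-and-velocity feature windows into the relaxed, agitated, and overcontrolled states with a small scikit-learn network, using synthetic data in place of real tracks.

```python
# Hypothetical illustration with synthetic data: classifying head/hand
# position-and-velocity feature windows into the three behavioral states.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
STATES = ["relaxed", "agitated", "overcontrolled"]

def synthetic_windows(state_idx, n=50):
    # 12 features per window: x/y position and velocity for the head,
    # left hand, and right hand. Motion scale loosely mimics each state.
    scale = [1.0, 3.0, 0.2][state_idx]   # agitated moves most, overcontrolled least
    return rng.normal(0.0, scale, size=(n, 12))

X = np.vstack([synthetic_windows(i) for i in range(len(STATES))])
y = np.repeat(np.arange(len(STATES)), 50)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X, y)
print(STATES[clf.predict(X[:1])[0]])
```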
30
Sample Patterns: Mock Thieves
[Figure: movement patterns for nervous (lying) vs. relaxed (not lying) mock thieves]
31
Sample Results: Scores differ among relaxed,
agitated, and overcontrolled suspects
32
Summary
  • Humans are fallible in detecting deception and
    hostile intent
  • Automated detection tools to augment human
    judgment can greatly increase detection accuracy
  • Verbal and nonverbal behaviors have been
    identified that
  • Can be automated
  • Together significantly improve detection accuracy
  • More research in a variety of contexts will
    determine which indicators and systems are the
    most reliable