Title: Automated Detection of Deception and Intent
1. Automated Detection of Deception and Intent
- Judee Burgoon, Ed.D.
- Center for the Management of Information
- University of Arizona
March 19, 2004
2. Collaborative Partners
- DETECTING DECEPTION IN THE MILITARY INFOSPHERE
  - Center for the Management of Information, University of Arizona
  - Funded by Department of Defense
- AUTOMATED INTENT DETECTION
  - Center for Computational Bioengineering, Imaging and Modeling, Rutgers University
  - Funded by Department of Homeland Security
3. Deception and Intent Defined
- Deception is a message knowingly transmitted with the intent to foster false beliefs or conclusions.
- Hostile intent refers to plans to conduct criminal or terrorist activity.
- Intent is inferred from
  - suspicious behavior
  - overt hostility
  - deception
4. Relationship of Deception to Intent
5. Many Ways To Deceive
- Lies
- Fabrications
- Concealments
- Omissions
- Misdirection
- Bluffs
- Fakery
- Mimicry
- Tall tales
- White lies
- Deflections
- Evasions
- Equivocation
- Exaggerations
- Camouflage
- Strategic ambiguity
- Hoaxes
- Charades
- Imposters
6. America under Attack!
7. Statement of the Problem
- Humans have very poor ability to detect deceit and hostile intent.
  - True of experts as well as untrained individuals
  - Accuracy rates of 40-60%, about the same as flipping a coin
- Reliance on new communication technologies (text, audio, video) may make us more vulnerable to deceit.
8. In the Headlines
9. Questions
- Are there reliable indicators of
  - deceit?
  - intent to engage in hostile actions?
- Can detection be automated to augment human abilities?
- Does mode of communication make a difference?
10. The Objective: Reducing False Alarms and Misses
11. Sample Deception Indicators
- Arousal: higher pitch, faster tempo
- Emotion: absence of emotional language, false smiles
- Cognitive effort: delays in responding, nonfluent speech
- Memory: fewer details, briefer messages
- Strategic communication: controlled movement, increasing involvement
12. Our Experiments
- 16 experiments, 2,136 subjects, in 2.5 years
13. Typical Experiment: Mock Theft
- Task
  - half of participants steal a wallet from a classroom; the other half are innocent
  - all are interviewed by trained and/or untrained interviewers
- Mode of interaction
  - face-to-face, text, audio, video
- Outcomes
  - accuracy in detecting truth and deception
  - judged credibility
  - coding of verbal and nonverbal behavior
14. Sample Results
- Deceivers create longer messages under text than face-to-face (FtF).
15. Implications
- Text-based deception allows for planning, rehearsal, and editing.
- Deceivers can use text messages to their advantage.
16. Questions
- Are there reliable text-based indicators of deceit or hostile intent?
- Can these be automated to overcome deceivers' advantages?
17. Sample Text-based Cues We Analyzed
18. Sample Results from Automated Analysis
- Deceivers use different language than truth tellers.
- Deceivers show more
  - quantity
  - uncertainty
  - references to others
  - informality
- Truth tellers show more
  - diversity
  - complexity
  - positive affect
  - references to self
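Cue categories like these can be approximated with simple lexical measures. The sketch below is a minimal illustration only: the word lists for "uncertainty" and "self-references" are hypothetical stand-ins, not the cue definitions actually used in the study.

```python
import re

def text_cues(message: str) -> dict:
    """Compute a few illustrative text-based cues: quantity (word count),
    diversity (type-token ratio), uncertainty terms, and self-references.

    The term lists below are assumptions for the sketch, not the
    project's actual lexicons.
    """
    words = re.findall(r"[a-z']+", message.lower())
    uncertainty_terms = {"maybe", "perhaps", "possibly", "might", "guess"}
    self_refs = {"i", "me", "my", "mine", "myself"}
    n = len(words)
    return {
        "quantity": n,                                   # total word count
        "diversity": len(set(words)) / n if n else 0.0,  # type-token ratio
        "uncertainty": sum(w in uncertainty_terms for w in words),
        "self_references": sum(w in self_refs for w in words),
    }
```

A hedged answer ("Maybe I left it somewhere, I guess") would score high on uncertainty while a terse denial would score low on quantity; real cue extraction would use validated lexicons and parsers rather than these toy lists.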
19. Automating Analysis: Agent99 Parser
- Find cues in text
- Submit them to a data mining tool
20. Decision Tree Analysis
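A decision tree classifies a message by greedily splitting the cue feature space. The slides name the technique but not the tool's internals, so the sketch below shows only the core step, choosing the feature/threshold split that minimizes weighted Gini impurity, in plain Python.

```python
def gini(labels):
    """Gini impurity of a list of 0/1 class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2 * p * (1 - p)

def best_split(X, y):
    """Return (feature_index, threshold) minimizing the weighted Gini
    impurity of the two resulting partitions. X is a list of feature
    vectors, y a parallel list of 0/1 labels (e.g. 1 = deceptive)."""
    best = None
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            left = [y[i] for i, row in enumerate(X) if row[f] <= t]
            right = [y[i] for i, row in enumerate(X) if row[f] > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if best is None or score < best[0]:
                best = (score, f, t)
    return best[1], best[2]
```

With toy cue vectors of [quantity, diversity] where deceivers write more but less diversely, the first split lands on quantity; a full tree would apply this step recursively to each partition.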
21. Accuracy in Detecting Deceit
Note: Preliminary findings from the Mock Theft experiment, from transcribed face-to-face sessions.
22. Implications
- Linguistic and content features together can reliably identify deceptive or suspicious messages.
- Text analysis can be successfully automated.
23. Questions
- Can hostile intent be mapped to behavior?
- Are there reliable video-based indicators of deceit and intent?
- Are the indicators open to automation?
24. The Mapping Problem
25. Approach to Analysis
- Four data sets
  - Pre-polygraph interviews from actual investigations
  - Mock theft experiment
    - Two states: innocent (truthful), deceptive (guilty)
  - Actors in airport/screening location scenarios
    - Three states: relaxed, agitated (nervous), overcontrolled
  - Actors showing normal behavior, used to train neural networks
26. Intent Recognition from Video
- Track and estimate human movement, including
  - Head
  - Facial features
  - Hands
  - Body
  - Legs
- Tracking techniques
  - Physics-based tracking of face and hands
  - Statistical model-based motion estimation
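Once a body part is tracked, motion features like the position and velocity traces shown in the later pattern slides can be derived by frame-to-frame differencing of the tracked (x, y) coordinates. A minimal sketch, assuming a fixed frame rate (the slides do not specify one):

```python
def velocities(positions, fps=30.0):
    """Estimate per-frame (vx, vy) velocity of a tracked point
    (e.g. head or a hand) by finite differences of consecutive
    (x, y) positions. fps (frames per second) is an assumed rate."""
    dt = 1.0 / fps
    return [
        ((x1 - x0) / dt, (y1 - y0) / dt)
        for (x0, y0), (x1, y1) in zip(positions, positions[1:])
    ]
```

Overcontrolled subjects would show near-zero velocities on such traces, while agitated subjects would show large, frequent swings, which is what makes velocity a usable discriminating feature.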
27. Skin Color Tracker: Face and Hands
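The slides do not give the tracker's actual color model, so as a stand-in the sketch below uses a widely cited RGB skin-color rule of thumb (Peer et al.) and reduces the detected skin pixels to a bounding box, the kind of region a face/hand tracker would then follow frame to frame.

```python
def is_skin(r, g, b):
    """Classify one RGB pixel as skin using a common heuristic rule;
    this is an assumed stand-in, not the study's actual model."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15 and r > g and r > b)

def skin_bounding_box(pixels):
    """Return (min_x, min_y, max_x, max_y) of skin pixels in an
    iterable of (x, y, r, g, b) tuples, or None if none are skin."""
    hits = [(x, y) for x, y, r, g, b in pixels if is_skin(r, g, b)]
    if not hits:
        return None
    return (min(x for x, _ in hits), min(y for _, y in hits),
            max(x for x, _ in hits), max(y for _, y in hits))
```

Production trackers typically work in a chrominance space (e.g. HSV or YCbCr) and smooth the region across frames; the fixed RGB thresholds here are only for illustration.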
28. Sample Results from Human Coders
- Thieves use fewer head movements and gestures, and more self-touching, than innocents.
29. Sample Patterns: Actors
[Figure: head and left/right hand position and velocity traces for the controlled, relaxed, and nervous states]
30. Sample Patterns: Mock Thieves
[Figure: movement patterns for nervous (lying) vs. relaxed (not lying) mock thieves]
31. Sample Results
- Scores differ among relaxed, agitated, and overcontrolled suspects.
32. Summary
- Humans are fallible in detecting deception and hostile intent.
- Automated detection tools that augment human judgment can greatly increase detection accuracy.
- Verbal and nonverbal behaviors have been identified that
  - can be automated
  - together significantly improve detection accuracy
- More research under a variety of contexts will determine which indicators and systems are the most reliable.