Understanding Naturally Conveyed Explanations of Device Behavior

Transcript and Presenter's Notes

1
Understanding Naturally Conveyed Explanations of
Device Behavior
  • Michael Oltmans and Randall Davis
  • MIT Artificial Intelligence Lab

2
Roadmap
  • The problem
  • Our approach
  • Implementation
  • System architecture
  • How ASSISTANCE interprets descriptions
  • Demonstrating understanding
  • Evaluation and contributions
  • Related and future work

3
Sketches → Models
  • We have a sketch of a device
  • A simulation model can be generated from the sketch
  • Life is good... or is it?

4
(No Transcript)
5
The Problem
  • No representation of intended behavior
  • People talk and sketch, but the computer doesn't understand

6
Task
  • Understand descriptions of device behavior
  • Given
  • A model of the device's structure
  • A natural explanation of the behavior
  • Generate a causal model of behavior

7
(No Transcript)
8
Roadmap
  • The problem
  • Our approach
  • Implementation
  • System architecture
  • How ASSISTANCE interprets descriptions
  • Demonstrating understanding
  • Evaluation and contributions
  • Related and future work

9
Naturally Conveyed Explanations
  • Natural input modalities
  • Sketched devices
  • Sketched gestures
  • Speech
  • Natural content of descriptions
  • Causal
  • Behavioral

10
Example: Describing the Behavior of a Spring
11
Example: Describing the Behavior of a Spring
12
Example: Describing the Behavior of a Spring
13
Example: Describing the Behavior of a Spring
14
Sources of power
  • Conventions in explanations aid interpretation
  • Description order suggests causal order
  • Constrained vocabulary
  • Overlapping descriptions provide constraints on
    interpretations

15
Roadmap
  • The problem
  • Our approach
  • Implementation
  • System architecture
  • How ASSISTANCE interprets descriptions
  • Demonstrating understanding
  • Evaluation and contributions
  • Related and future work

16
Sketch and Speech (inputs)
  • ASSIST
  • Recognize sketch
  • ViaVoice
  • Recognize speech
  • Parse
  • ASSISTANCE
  • Interpret explanation
  • LTRE
  • Truth Maintenance
  • Rule System

Causal Model and Simulation (output)
17
Outputs
  • Consistent causal model
  • Tree
  • Nodes are events
  • Links indicate causal relationships
  • Demonstration of understanding
  • Natural language descriptions of causality
  • Parameter constraints
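
A minimal Lisp sketch of such a causal tree (the struct and slot names are illustrative assumptions, not the actual ASSISTANCE representation): each node is an event, each link points to the events it causes, and walking the tree yields simple statements of causality.

(defstruct event
  name     ; e.g. "motion of body 3"
  causes)  ; list of EVENT structs this event directly causes

(defun describe-causality (event &optional (stream t))
  "Print one sentence per causal link, depth first."
  (dolist (effect (event-causes event))
    (format stream "The ~a causes the ~a.~%"
            (event-name event) (event-name effect))
    (describe-causality effect stream)))

;; (describe-causality
;;  (make-event :name "motion of body 3"
;;              :causes (list (make-event :name "motion of body 2"
;;                                        :causes (list (make-event :name "motion of body 5"))))))
;; prints: The motion of body 3 causes the motion of body 2.
;;         The motion of body 2 causes the motion of body 5.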

18
The Representation of Utterances
  • Input comes from ViaVoice
  • Grammar constructed from observed explanations
  • Tagged with parts of speech and semantic
    categories

19
Representing the parse tree
"body 1 pushes body 2"
SENTENCE SIMPLE_SENTENCE ("body 1 pushes body 2" (S0) t1)
  SUBJECT NOUN NOUN-PHRASE ("body 1" (S0 t1) t2)
  VERB_PHRASE ("pushes body 2" (S0 t1) t3)
    DIRECT_OBJECT NOUN NOUN-PHRASE ("body 2" (S0 t1 t3) t5)
    PROPELS VERB ("pushes" (S0 t1 t3) t4)
20
Steps In Interpreting Explanations
  • Infer motions from annotations and build event
    representations
  • Find causal connections
  • Search for consistent causal structures
  • Pick best causal structure

21
Step 1: Inferring Motions from Annotations
  • Inputs
  • Arrows
  • Utterances
  • "moves", "pushes", "the spring releases"
  • Outputs
  • (moves body-1 moves-body-1-394)
  • (describes arrow-2 moves-body-1-394)

22
Inferring Motion From Arrows
  • Rule triggers
  • Arrow
  • Arrow referent (i.e. a body)
  • The body is mobile
  • Rule body records that
  • The body moves
  • The arrow describes the path

23
Inferring Motion From Arrows
(rule ((TRUE (arrow ?arrow) VAR ?f1)
       (TRUE (arrow-referent ?arrow ?body) VAR ?f2)
       (TRUE (can-move ?body) VAR ?f3)
       (TRUE (name ?name ?body)))
  (rlet ((?id (new-id Moves ?name)))
    (rassert! (implies (AND ?f1 ?f2 ?f3)
                       (AND (moves ?body ?id)
                            (describes ?arrow ?id)))
              ARROW-IS-MOTION)))
24
Multi-Modal References
  • Match a sentence whose subject is "this" and a pointing gesture
  • Assert the referent as the subject of the sentence
  • Limitations
  • User must point at the referent before the utterance
  • Allow one "this" per utterance
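
A minimal sketch of this resolution step (the structs, slot names, and time-stamped representation are assumptions for illustration, not the ASSISTANCE internals): the most recent pointing gesture made before the utterance supplies the referent for its "this".

(defstruct pointing-gesture time referent)   ; referent = the sketched body pointed at
(defstruct utterance time subject text)

(defun resolve-this (utterance gestures)
  "Return the referent of the latest pointing gesture made before the
utterance, provided the utterance's subject is the word \"this\"."
  (when (string-equal (utterance-subject utterance) "this")
    (let ((earlier (remove-if-not
                    (lambda (g) (< (pointing-gesture-time g)
                                   (utterance-time utterance)))
                    gestures)))
      (when earlier
        (pointing-gesture-referent
         (first (sort (copy-list earlier) #'>
                      :key #'pointing-gesture-time)))))))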

25
Redundant Events
  • Redundant explanations lead to multiple move
    statements for some events
  • Merge them into a unique event statement

Body 1 falls
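
A minimal sketch of the merge (the pair representation is an assumption for illustration): each move statement is recorded as a (body . event-id) pair, and repeated statements about the same body's motion collapse into one canonical event.

(defun merge-move-statements (moves)
  "MOVES: list of (body . event-id) pairs, possibly several per body.
Keep one canonical event per moving body."
  (remove-duplicates moves :key #'car :from-end t))

;; (merge-move-statements '((body-1 . moves-body-1-394)
;;                          (body-1 . moves-body-1-517)))
;; => ((BODY-1 . MOVES-BODY-1-394))
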
26
Step 2: Find Causal Connections
  • Plausible causes
  • Arrow indicating motion near another object
  • Exogenous forces
  • Definite causes
  • "When ... then" utterances
  • "Body 1 pushes body 2"
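
A sketch of how the two kinds of evidence could be recorded (the plist representation and the :definite/:plausible tags are assumptions for illustration):

;; Each candidate causal link is tagged by the strength of its evidence.
;; "Definite" causes come from explicit utterances such as
;; "body 1 pushes body 2" or "when ... then" sentences; "plausible"
;; causes come from weaker evidence such as an arrow drawn near another
;; object or an assumed exogenous force.
(defparameter *candidate-causes*
  '((:cause moves-body-1 :effect moves-body-2
     :strength :definite  :evidence "body 1 pushes body 2")
    (:cause exogenous-force-1 :effect moves-body-1
     :strength :plausible :evidence "arrow drawn near body 1")))

(defun definite-causes (candidates)
  "Keep only the causal links supported by explicit statements."
  (remove-if-not (lambda (c) (eq (getf c :strength) :definite))
                 candidates))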

27
Step 3: Search for Consistent Causal Structures
  • Some events have several possible causes
  • Find consistent causal chains
  • Search
  • Forward looking depth-first-search
  • Avoids repeating bad choices by recording bad
    combinations of assumptions
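
A minimal sketch of such a search (the choice representation and the consistency test are assumptions for illustration; ASSISTANCE does this on top of the LTRE's truth maintenance): pick one cause per event, test consistency, and record any bad combination so it is never re-tested.

(defun search-causal-structures (events candidate-causes consistent-p)
  "EVENTS: list of event ids.  CANDIDATE-CAUSES: alist mapping each event
to its possible causes.  CONSISTENT-P: predicate over a set of
(event . cause) choices.  Returns all consistent complete assignments."
  (let ((nogoods '()) (results '()))
    (labels ((known-bad-p (chosen)
               ;; Prune any choice set that contains a recorded bad combination.
               (some (lambda (bad) (subsetp bad chosen :test #'equal)) nogoods))
             (extend (remaining chosen)
               (cond ((known-bad-p chosen))                  ; skip known-bad combinations
                     ((not (funcall consistent-p chosen))
                      (push chosen nogoods))                 ; record the bad combination
                     ((null remaining)
                      (push chosen results))                 ; complete and consistent
                     (t (dolist (cause (cdr (assoc (first remaining) candidate-causes)))
                          (extend (rest remaining)
                                  (cons (cons (first remaining) cause) chosen)))))))
      (extend events '())
      results)))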

28
Step 4: Find the Best Interpretation
  • Filter out interpretations that have unnecessary
    exogenous causes
  • Pick the interpretation that most closely matches
    the explanation order
  • While there are multiple valid interpretations
  • Choose one event with multiple possible causes
  • Assume the causal relation whose cause has the
    earliest description time
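
The tie-breaking rule in the last bullets might be sketched like this (the alist of description times is an assumption for illustration):

(defun earliest-described-cause (candidates description-times)
  "CANDIDATES: the possible causes of one event.  DESCRIPTION-TIMES: alist
from each cause to the time it was mentioned in the explanation.
Prefer the cause the designer described first."
  (first (sort (copy-list candidates) #'<
               :key (lambda (c) (cdr (assoc c description-times))))))

;; (earliest-described-cause '(moves-body-3 exogenous-force-1)
;;                           '((moves-body-3 . 2) (exogenous-force-1 . 7)))
;; => MOVES-BODY-3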

29
Answer Queries and Adjust Parameters
  • Queries
  • Designer: "What is body 2 involved in?"
  • ASSISTANCE: "The motion of body 3 causes the motion of body 2, which causes the motion of body 5"
  • Parameter Adjustment
  • Set spring length
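
A minimal sketch of answering such a query (the flat list-of-event-names representation is an assumption for illustration; the chain itself would come from the causal model described earlier):

(defun describe-involvement (body chain)
  "CHAIN: ordered list of event names, earliest cause first.  If BODY is
mentioned in the chain, restate the chain as a sentence about causality."
  (if (find body chain :test (lambda (b ev) (search b ev)))
      (format nil "The ~{~a~^ causes the ~}." chain)
      (format nil "~a is not involved in any described event." body)))

;; (describe-involvement "body 2"
;;   '("motion of body 3" "motion of body 2" "motion of body 5"))
;; => "The motion of body 3 causes the motion of body 2 causes the motion of body 5."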

30
Roadmap
  • The problem
  • Our approach
  • Implementation
  • System architecture
  • How ASSISTANCE interprets descriptions
  • Demonstrating understanding
  • Evaluation and contributions
  • Related and future work

31
Limitations of the Implementation
  • Scope of applicability restricted
  • State transitions are one step deep
  • Cannot handle conjunctions of causes
  • Limited knowledge about common device patterns
  • Latches, linkages, etc.
  • Supports and prevents
  • Natural language limitations
  • Use a full-featured NL system like START
  • Formally determine the grammar

32
Evaluation of the Approach
  • Advantages
  • Focus on behavior in accordance with survey
    results
  • Move away from rigidity of WIMP interfaces
  • Similar to person-to-person interaction
  • Alternatives
  • More dialog and feedback
  • Natural vs. efficient
  • Open claim that the domain is adequately
    constrained

33
Contributions
  • Understanding naturally conveyed descriptions of
    behavior
  • Generating representations of device behavior
  • Match the designer's explanation
  • Generate simple explanations of causality
  • Allow the calculation of simulation parameters

34
Related Work
  • Understanding device sketches
  • Alvarado 2000
  • Multimodal interfaces
  • Oviatt and Cohen
  • Causality
  • C. Rieger and M. Grinberg 1977

35
Future Work
  • Direct manipulation
  • Dialog
  • Expand natural language capabilities
  • Smart design tools