1
Automated Interpretation of Agent Behavior
  • Dung N. Lam and K. Suzanne Barber
  • Laboratory for Intelligent Processes and Systems, The University of Texas at Austin
  • http://www.lips.utexas.edu

2
Motivation
  • Agent software is sophisticated:
    • autonomous decision-making,
    • dynamic intentions,
    • many decision-influencing factors,
    • interaction among agents.
  • Software comprehension is crucial for software
    • development,
    • maintenance (e.g., debugging, testing, and improving),
    • redesign,
    • adoption by the end-user.

[Figure: UAV agent and its target]
3
Software Comprehension Problem
[Diagram: the user (designer, developer, or end-user) manually forms comprehension K of expected agent behavior: K = update_manual(K, D, I, Os)]
  • Is my comprehension representative of the
    implementation?
  • Is the implementation representative of expected
    behavior (or the design)?

4
Reverse Engineering
  • Traditional Reverse Engineering (RE) tools
    • gather static and run-time data (e.g., SoftSpec),
    • enable efficient code browsing (e.g., Doxygen, Rigi),
    • generate visualizations (interpretations) (e.g., GUPRO).

[Diagram: is the implementation (source code) representative of expected agent behavior (design models and documentation)? Executing the implementation in a scenario yields data such as execution threads, stack traces, and method invocations: Os = observe(execute(I, s)), Ns = interpret_RE(Os)]
  • Issues:
    • Overloading the user with detailed data.
    • The user must digest and reason about the software's behavior.

5
Objectives
  • High-level feedback to ease comprehension for all
    users.
  • Automated data-processing to off-load reasoning
    by the human user.

6
Tracer Solution Requirements
  • To automate reasoning, we need
    • formalized background knowledge K about how the agent is expected to behave, and
    • observations (at a high level of abstraction) of what the agent is actually doing.

[Diagram: (1) expected agent behavior must be formalized from design models and documentation; (2) observations must be abstracted from executing the implementation (source code) in a scenario]
7
Tracer Solution
[Diagram: background knowledge is derived from design models and documentation; the implementation (source code) is executed in a scenario]
  • Tracer provides to the user:
    • interpretations of actual agent behavior,
    • suggestions for updating background knowledge (K), and
    • explanations for observed agent behavior.

8
Background Knowledge
[Diagram: background knowledge is derived from design models and documentation; the implementation (source code) is executed in a scenario]
9
Background Knowledge Representation
  • Agent Concepts:
    • high-level agent constructs used in agent designs and models,
    • familiar to the designer, developer, and end-user,
    • used to explain and comprehend agent behavior.
  • (Causal) Relations connecting agent concepts (see the sketch below).
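As an illustration only, the representation above could be captured in a small data structure. The Python sketch below is an assumption about shape (the AgentConcept and BackgroundKnowledge names, and the example message/belief relation, are hypothetical), not the representation actually used by Tracer.

```python
# Hypothetical sketch: background knowledge K as agent concepts
# connected by causal relations.
from dataclasses import dataclass, field


@dataclass(frozen=True)
class AgentConcept:
    kind: str   # e.g., "goal", "belief", "action", "event", "message"
    name: str   # e.g., "flyToTarget"


@dataclass
class BackgroundKnowledge:
    concepts: set = field(default_factory=set)
    relations: set = field(default_factory=set)   # set of (cause, effect) pairs

    def add_relation(self, cause: AgentConcept, effect: AgentConcept) -> None:
        """Record that 'cause' is expected to lead to 'effect'."""
        self.concepts.update({cause, effect})
        self.relations.add((cause, effect))


# Example: a received message is expected to cause a belief update.
K = BackgroundKnowledge()
K.add_relation(AgentConcept("message", "targetLocation"),
               AgentConcept("belief", "targetLocation"))
```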

10
Formalizing Background Knowledge
[Diagram: K, formalized from design models and documentation, represents expected agent behavior]
11
Abstracting Observations
  • Employ a Reverse Engineering approach
    • to capture execution traces of the software (e.g., SCED);
    • issue: the user is overloaded with detailed data.
  • Log agent activities at the agent concept level (see the sketch below):
    • reduces the amount of data,
    • brings source code details to the design level,
    • addresses the translation gap problem.
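As a rough illustration of what agent-concept-level logging could look like (the log_concept function, the observation fields, and the uav1/T3 example values are all assumptions, not Tracer's API):

```python
# Hypothetical concept-level logging: the agent implementation is instrumented
# to record observations as agent concepts instead of raw execution traces.
import time

OBSERVATION_LOG: list = []   # collected observations, later sent to Tracer


def log_concept(agent_id: str, kind: str, name: str, **attributes) -> dict:
    """Record one observation (an agent-concept instance) with its attributes."""
    observation = {
        "time": time.time(),
        "agent": agent_id,
        "kind": kind,            # e.g., "action", "event", "belief", "message"
        "name": name,            # e.g., "flyToTarget"
        "attributes": attributes,
    }
    OBSERVATION_LOG.append(observation)
    return observation


# Example instrumentation calls inside the agent's code:
log_concept("uav1", "action", "flyToTarget", target="T3")
log_concept("uav1", "event", "uavScan", target="T3")
```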

[Diagram: Tracer relates background knowledge (from design models and documentation) to observations logged while executing the implementation (source code) in a scenario, the manifestation of actual agent behavior]
12
Representative Accuracy
[Diagram: Tracer compares background knowledge (from design models and documentation) with observations from executing the implementation (source code) in a scenario]
Enables semi-automated updates to K to account for implementation changes in agent behavior:
K = update(K, D, Ns, k), where k = suggest(Ns)
13
Solution Tracing Method
  • Initialize K
  • Os = observe(execute(I, s))
  • Ns = interpret(K, Os)
  • k = suggest(Ns)
  • K = update(K, D, Ns, k)
  • The Tracer Tool assists the user throughout this cycle (sketched below).
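A minimal sketch of this cycle, assuming the step functions (execute, observe, interpret, suggest, update) and the stopping test are supplied by the surrounding tool; the signature is hypothetical.

```python
# Hypothetical sketch of the cycle on this slide; the step functions are
# passed in as callables rather than implemented here.
from typing import Callable, Tuple


def tracing_method(K, D, I, s,
                   execute: Callable, observe: Callable, interpret: Callable,
                   suggest: Callable, update: Callable,
                   user_satisfied: Callable) -> Tuple:
    """Refine background knowledge K until the user is satisfied with it."""
    while True:
        Os = observe(execute(I, s))   # run implementation I in scenario s, log observations
        Ns = interpret(K, Os)         # impose expected relations from K onto the observations
        k = suggest(Ns)               # propose concepts/relations for unexplained observations
        K = update(K, D, Ns, k)       # user-reviewed update of K
        if user_satisfied(K, Ns):     # stop when K captures behavior at the desired detail
            return K, Ns
```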

14
Tracing Method: Initialize K
  • Initialize with concepts (by adding agent concepts and relations to K) based on collected data about system behavior; an example follows.
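A hypothetical example of such an initialization, reusing the AgentConcept/BackgroundKnowledge sketch from the Background Knowledge slide; the specific relations are illustrative and not read off the slide's diagram.

```python
# Hypothetical example of initializing an (incomplete) K with the generic
# concepts shown on this slide (goal, intention, actions, beliefs, event, message).
K = BackgroundKnowledge()
goal = AgentConcept("goal", "goal")
intention = AgentConcept("intention", "intention")
action1 = AgentConcept("action", "action 1")
event = AgentConcept("event", "event")
belief1 = AgentConcept("belief", "belief 1")
K.add_relation(goal, intention)      # illustrative: the goal gives rise to an intention
K.add_relation(intention, action1)   # illustrative: the intention leads to action 1
K.add_relation(action1, event)       # illustrative: action 1 produces an event
K.add_relation(event, belief1)       # illustrative: the event updates belief 1
# Leaving out action 2, belief 2, and the message is fine: incomplete K is okay.
```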

[Diagram: example contents of K, derived from design models and documentation: a goal, an intention, actions 1 and 2, beliefs 1 and 2, an event, and a message connected by relations]
Incomplete K is okay.
15
Tracing Method: Acquiring Observations
Os = observe(execute(I, s))
[Diagram: the implementation (source code), instrumented with logging code, is executed in a user-specified scenario]
16
Run Agent System to Get Observations
  • All logged observations are sent to the Tracer Tool's logging server, where observations are sorted and pre-processed (a sketch of this step follows the figure).

[Figure: the running agent and its target]
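A minimal sketch, under assumed data shapes, of the sorting and pre-processing step on the logging server:

```python
# Hypothetical sketch of the server-side pre-processing step: merge the logs
# received from all agents and sort the observations by timestamp.
def preprocess(logs_per_agent):
    """logs_per_agent: dict mapping an agent id to its list of logged observations."""
    merged = [obs for log in logs_per_agent.values() for obs in log]
    return sorted(merged, key=lambda obs: obs["time"])
```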
17
Tracing Method: Interpret
Ns = interpret(K, Os)
  • Interpret the observations collected from testing, using K to impose relations between observations (see the sketch below).
  • Analogous to how programmers debug, e.g., message(x) caused belief(x).
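One plausible reading of the interpret step, sketched below under assumed data shapes (K as a set of concept pairs, observations as attribute dictionaries); this is an illustration, not the published algorithm.

```python
# Hypothetical sketch of interpret(K, Os): link observations whose concepts are
# causally related in K and which share attribute values (the same x in
# "message(x) caused belief(x)").
def interpret(K_relations, observations):
    """K_relations: set of ((cause_kind, cause_name), (effect_kind, effect_name)) pairs.
    observations: time-ordered list of dicts with 'kind', 'name', 'attributes'."""
    interpreted = []
    for i, effect in enumerate(observations):
        for cause in reversed(observations[:i]):       # search backward in time
            expected = ((cause["kind"], cause["name"]),
                        (effect["kind"], effect["name"])) in K_relations
            shared = set(cause["attributes"].items()) & set(effect["attributes"].items())
            if expected and shared:
                interpreted.append((cause, effect))    # record an interpreted relation
                break
    return interpreted
```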

[Diagram: Tracer applies background knowledge (from design models and documentation) to observations from executing the implementation (source code) in a scenario]
18
UAV Agent Behavior Patterns
[Figure: observations interpreted as UAV agent behavior patterns]
19
Tracing Method: Suggestions
K = update(K, D, Ns, k), where k = suggest(Ns)
  • The cycle repeats until the user is satisfied with the level of detail represented in K and K is complete.
  • If needed, suggestions are created for updating K.

All suggestions are reviewed by the user.
[Diagram: suggested updates to background knowledge (from design models and documentation) based on observations from executing the implementation (source code) in a scenario]
20
Suggesting Relations
  • Initiated for observations without an incoming relation (resulting from incomplete K or implementation changes).
  • The Suggest-Relation Algorithm, for an observation,
    • searches backward (temporally) through the observation list,
    • uses heuristics to determine whether two observations are related by comparing observation attributes.
  • Action→Event heuristic (sketched below): if the observation is an event, then suggest a relation from the latest action with common attributes that does not already have an outgoing relation.
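A sketch of the Action→Event heuristic as stated above; the observation and relation data shapes, and the suggest_action_event name, are assumptions rather than the published implementation.

```python
# Hypothetical sketch of the Action -> Event heuristic (data shapes assumed):
# for an event without an incoming relation, suggest a relation from the latest
# preceding action that shares attributes and has no outgoing relation yet.
def suggest_action_event(event, observations, existing_relations):
    """observations: time-ordered list of dicts with 'id', 'kind', 'attributes'.
    existing_relations: set of (cause_id, effect_id) pairs already interpreted."""
    if event["kind"] != "event":
        return None
    causes_used = {cause_id for cause_id, _ in existing_relations}
    idx = observations.index(event)
    for candidate in reversed(observations[:idx]):                 # latest action first
        common = set(candidate["attributes"].items()) & set(event["attributes"].items())
        if (candidate["kind"] == "action" and common
                and candidate["id"] not in causes_used):           # no outgoing relation yet
            return (candidate["id"], event["id"])                  # suggested relation
    return None
```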

21
Suggested Relations for the UAV Domain
Incorrect relation
Action flyToTarget → Event uavScan
Event uavScan → Belief servicedTarget
22
Case Study Results
  • 2 domains: Simple (500 LOC) and UAV (20k LOC).
  • Began with essentially an empty K.
  • 7 heuristics
  • Results are promising.

  • No presumed behavioral knowledge.
  • Agent system developed by a third party.
  • Not many more concepts are needed; scalability depends on the number of agent concepts.
  • Heuristics can be added and improved to identify more correct suggestions, but at the expense of generality.
23
Advancing Comprehension
How can Tracer further help the user comprehend?
Interpretations
  • Since comprehension can be measured by how much one can explain, Tracer can generate explanations for any observation.

24
Automated Explanation Generation
[Diagram: Tracer generates explanations from background knowledge (design models and documentation) and observations gathered by executing the implementation (source code) in a scenario]
explanation = explain(m, Ns)
25
Tracer Explanation Generation
explanation = explain(m, Ns)
[Figure: an interpretation of observed behavior, explained in terms of agent concepts; a sketch follows]
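A rough, assumed sketch of explain(m, Ns): starting from an observation m, walk the interpreted relations in Ns backward to collect the causal chain that accounts for m; the single-cause assumption is for brevity.

```python
# Hypothetical sketch of explain(m, Ns): trace interpreted relations backward
# from observation m to build the causal chain that accounts for it.
def explain(m, Ns):
    """m: an observation id.  Ns: set of interpreted relations as (cause_id, effect_id) pairs.
    Assumes at most one cause per effect and an acyclic relation graph."""
    causes = {effect: cause for cause, effect in Ns}
    chain = []
    current = m
    while current in causes:
        cause = causes[current]
        chain.append((cause, current))                 # "cause explains current"
        current = cause
    return list(reversed(chain))                       # earliest cause first
```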
26
Completeness Problems
  • Between expected and actual behaviors (K vs. Ns).

27
Closing Remarks
  • Tracer automates visualizing and explaining
    observed agent behavior in terms of agent
    concepts to help the user in debugging,
    maintaining, and comprehending the agent system.
  • Limitations:
    • Not an exhaustive analysis of system state; Tracer only operates within the scenario set chosen by the user.
    • Requires structural knowledge of the source code to insert the logging code in the correct locations.

28
Thank you.
  • Questions or Comments?
  • dnlam@lips.utexas.edu
  • Demonstration of the Tracer Tool on Friday at 10:30 AM