Robot Learning From Human Demonstration

Transcript and Presenter's Notes

Title: Robot Learning From Human Demonstration


1
Robot Learning From Human Demonstration
Maja J. Mataric, Chad Jenkins, Marcelo Kallmann, Evan Drumwright, Nathan Miller, and Chi-Wei Chu
University of Southern California
Interaction Lab / Robotics Research Lab
Center for Robotics and Embedded Systems (CRES)
http://robotics.usc.edu/agents/Mars2020/mars2020.html
2
Motivation &amp; Approach
  • Goals
  • Natural human-robot interaction in various domains
  • Automated robot programming: learning by imitation
  • General Approach
  • Use an intrinsic behavior repertoire to facilitate control, human-robot interaction, and learning
  • Use a human interactive training method (past work)
  • Use human-data-driven programming/training methods

3
Recent Progress
  • Getting more, better training data
  • a light-weight, low-cost motion-capture mechanism
  • Real-world validation of the method
  • application of the method to Robonaut data
  • Application of the method I
  • synthesis of novel humanoid motion from automatically derived movement primitives
  • Application of the method II
  • movement classification, prediction, and imitation
  • The next big problem
  • humanoid motion planning around (dynamic) obstacles; validation on Robonaut

4
Getting more, better training data
IMU Motion Capture Suit
Goal: Develop a low-cost motion-capture device capable of logging high-DOF human motion in an unstructured environment.
Solution: Use filtered Inertial Measurement Units (IMUs) for 3-DOF tracking of each joint. Each sensor is developed at a cost of $300.00, resulting in a suit cost of $4,200.00 to track 14 DOF.
Advantages: 1) Motion tracking is not coupled to off-person emitters/detectors, so it can be used outdoors, anywhere; 2) Sensors are small and networked, allowing various configurations; 3) High bandwidth allows for real-time interaction with visualization and simulation tools.
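The suit's actual orientation filter (by Eric Bachmann, noted two slides ahead) is not reproduced here. As a rough illustration of how gyroscope and accelerometer readings from one sensor node can be fused into joint-angle estimates, the following is a minimal complementary-filter sketch; the function name, the alpha weight, and the array layouts are illustrative assumptions, not the suit's implementation.

```python
import numpy as np

def complementary_filter(gyro, accel, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into pitch/roll estimates.

    gyro  : (N, 2) angular rates (rad/s) about the pitch and roll axes
    accel : (N, 3) accelerometer readings (g) in the sensor frame
    dt    : sample period in seconds
    alpha : weight on the integrated gyro estimate (hypothetical value)
    """
    angles = np.zeros((len(gyro), 2))          # [pitch, roll] per sample
    est = np.zeros(2)
    for i, (w, a) in enumerate(zip(gyro, accel)):
        # Tilt inferred from the gravity direction measured by the accelerometer
        pitch_acc = np.arctan2(a[0], np.hypot(a[1], a[2]))
        roll_acc = np.arctan2(a[1], a[2])
        # Blend the drift-prone gyro integral with the noisy accelerometer tilt
        est = alpha * (est + w * dt) + (1 - alpha) * np.array([pitch_acc, roll_acc])
        angles[i] = est
    return angles
```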
5
Getting more, better training data
IMU Motion Capture Suit
Diagram: sensor network on a human body model (sensor locations, onboard computer and battery), streaming over a wireless connection
6
Getting more, better training data
Suit Details
Specifications (REV 1):
  • Atmel 8-bit microcontroller w/ 10-bit ADC, 8 MHz
  • (3) 300 deg/sec gyroscopes
  • (3) 2-G accelerometers
  • $200.00/sensor
  • 2 DOF filtered
Next revision (REV 1.5):
  • (3) Honeywell magnetometers
  • 12-bit ADC, 16 MHz CPU
  • $260.00/sensor
  • Full 3 DOF filtered
Filter by Eric Bachmann @ MOVES Institute, Naval Postgraduate School
7
Automatically Deriving Behaviors
Recap of the method
  • Input: kinematic motion time series of joint angles
  • Motion segmentation
  • Partition the input motion into conceptually indivisible motion segments
  • Grouping of behavior exemplars
  • Spatio-temporal Isomap dimension reduction and clustering
  • Generalizing behaviors into forward models
  • Interpolation of a dense sampling for each behavior
  • Meta-level exemplar grouping
  • Additional embedding iterations for higher-level behaviors (the embedding step is sketched below)

O. C. Jenkins, M. J. Mataric, "Automated Derivation of Behavior Vocabularies for Autonomous Humanoid Motion", Autonomous Agents and Multiagent Systems, Melbourne, Australia, July 14-16, 2003.
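The core of the grouping step is spatio-temporal Isomap. As a rough, hypothetical illustration of the idea (not the published algorithm), the sketch below builds a pairwise distance matrix over motion frames and shrinks the distances between temporally adjacent frames, so the subsequent Isomap embedding keeps spatially similar and temporally connected exemplars close together. The function name and the c_temporal factor are assumptions for illustration only.

```python
import numpy as np

def st_distance_matrix(frames, c_temporal=0.1):
    """Toy spatio-temporal distance matrix in the spirit of ST-Isomap.

    frames     : (N, D) joint-angle time series
    c_temporal : factor (< 1) that pulls adjacent-in-time frames closer,
                 a hypothetical stand-in for ST-Isomap's neighbor reductions
    """
    # Plain spatial (Euclidean) distances between all pairs of frames
    diff = frames[:, None, :] - frames[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    # Reduce distances between temporally adjacent frames so that the
    # embedding preserves the ordering within each motion segment
    n = len(frames)
    for i in range(n - 1):
        dist[i, i + 1] *= c_temporal
        dist[i + 1, i] *= c_temporal
    return dist  # would be fed to Isomap (shortest paths + MDS)
```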
8
Applying the Method to Robonaut
Validation of the method on Robonaut
  • Work with Alan Peters (more tomorrow)
  • 80-D data from tactile and force sensors
  • 5 tele-operated grasps of a horizontal wrench
  • 460 frames each, 2,300 total
  • Applied sequentially-continuous ST-Isomap

Figures: PCA embedding (not informative), ST-Isomap embedding, ST-Isomap distance matrix
9
Uncovering Structure in the Data
Validation of the method on Robonaut
Figure: ST-Isomap embedding
Mapping of a new grasp motion onto the derived embedding; the structure is retained
Useful for monitoring performance, data analysis, generating controllers, etc.
10
Using Derived Behaviors
  • We now have a method for deriving vocabularies of behaviors from kinematic time series of human motion
  • each primitive is a nonparametric, exemplar-based motion model
  • each primitive can be eagerly evaluated to encode nonlinear dynamics in joint-angle space
  • We can use those derived behaviors for motion synthesis, prediction, and classification
  • Our recent work applied the behaviors toward
  • individually indexing them to provide state prediction
  • providing desired setpoints ("desireds") for control
  • matching them against observed motion for classification and learning

11
Forward Model Motion Synthesis
Use of the method: generating movement
  • Controller has a set of primitive behaviors
  • Arbitrator decides which primitive to activate (e.g., based on transition probabilities)
  • The active primitive incrementally updates the robot's current pose (i.e., sets the desired setpoints)
  • Controller can generate motion indefinitely (see the sketch below)
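A minimal sketch of this synthesis loop, assuming hypothetical primitive objects with a step(pose) method and an arbitrator that samples the next primitive from transition probabilities; none of these names come from the actual controller.

```python
import random

def synthesize_motion(primitives, transitions, start_pose, start_name, n_steps=500):
    """Arbitrated primitive-based synthesis (hypothetical interfaces).

    primitives  : dict name -> object whose step(pose) returns
                  (next_desired_pose, done_flag)
    transitions : dict name -> list of (next_name, probability) pairs
    """
    pose, active = start_pose, start_name
    trajectory = [pose]
    for _ in range(n_steps):
        # The active primitive incrementally sets the next desired pose
        pose, done = primitives[active].step(pose)
        trajectory.append(pose)
        if done:
            # Arbitrator chooses the next primitive from transition probabilities
            names, probs = zip(*transitions[active])
            active = random.choices(names, weights=probs, k=1)[0]
    return trajectory
```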

12
Representation of the Behaviors
Use of the method: generating movement
  • Behavior primitives are manifold-like flow fields in joint-angle space; temporal ordering creates the flow-field gradient
  • This representation is a forward model, allowing motion to be indexed, predicted, and synthesized dynamically, in real time
  • Model can generalize the exemplars to create novel motion (a minimal flow-field sketch follows the figure caption below)

Figure: blue = exemplar trajectories; black to red = interpolated motion creating the temporal-gradient flow field; right = 3 main PCs of a primitive flow field in joint-angle space
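A minimal sketch of evaluating such a flow field, assuming the primitive is stored as interpolated exemplar samples paired with their temporal successors; a nearest-neighbor lookup stands in for whatever interpolation scheme the actual representation uses.

```python
import numpy as np

def flow_field_step(pose, exemplar_poses, exemplar_next, step=1.0):
    """One integration step through a primitive's flow field.

    exemplar_poses : (N, D) joint-angle samples along the (interpolated) exemplars
    exemplar_next  : (N, D) the temporally following sample for each row, so
                     exemplar_next[i] - exemplar_poses[i] is the local flow
    """
    # Find the exemplar sample closest to the current pose
    i = np.argmin(np.linalg.norm(exemplar_poses - pose, axis=1))
    # Follow the local temporal gradient toward its successor
    return pose + step * (exemplar_next[i] - exemplar_poses[i])

def synthesize(start_pose, exemplar_poses, exemplar_next, n_steps=100):
    """Generate a motion by repeatedly following the flow field."""
    poses = [np.asarray(start_pose, dtype=float)]
    for _ in range(n_steps):
        poses.append(flow_field_step(poses[-1], exemplar_poses, exemplar_next))
    return np.array(poses)
```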
13
Example 1: Primitive-Based Synthesis
Use of the method: generating movement
  • Three motions generated from the same primitive behavior, using different starting poses, showing flow and variation

Figures: PCA view of the primitive flow field in joint-angle space; resulting kinematic motion
14
Example 2: Primitive-Based Synthesis
Use of the method: generating movement
  • Motion generated by combining two primitives
    (wave-in and wave-out) with a high-level
    arbitrator that sequences their activation

arm waving
15
Examples 7 &amp; 8: Behavior-Based Synthesis
Use of the method: generating movement
Multi-activity (take 2): cabbage patch → twist
Multi-activity (take 1): cabbage patch → twist
Single activity: reaching (no root info)
16
Synthesis from Isolated Activities
Use of the method: generating movement
cabbage patch (20000 frames)
jab punching (5000 frames)
combined punching (3400 frames)
jab punching (view 2)
17
Behavior Classification &amp; Imitation
Use of the method: classifying movement
  • Goal: use the primitive behaviors to recognize, classify, predict, and imitate/reconstruct observed movement
  • Compare observed motion with predictions from the behavior primitives
  • Use Euclidean distance between end-effector positions as the metric
  • Use a Bayesian classifier
  • Reconstruct/imitate the observed movement
  • Concatenate best-match trajectories from the classified primitives to reconstruct/imitate (a minimal matching sketch follows)
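A minimal sketch of the matching step, assuming hypothetical arrays of observed and predicted end-effector positions over a common horizon; the mean Euclidean distance is the metric named above.

```python
import numpy as np

def classify_window(observed_ee, primitive_predictions):
    """Pick the primitive whose predicted end-effector path best matches an
    observed window of end-effector positions (Euclidean distance metric).

    observed_ee           : (T, 3) observed end-effector positions
    primitive_predictions : dict mapping primitive name -> (T, 3) predicted
                            end-effector positions over the same horizon
                            (both are hypothetical inputs for this sketch)
    """
    scores = {}
    for name, pred in primitive_predictions.items():
        # Mean Euclidean distance between observed and predicted positions
        scores[name] = np.mean(np.linalg.norm(observed_ee - pred, axis=1))
    best = min(scores, key=scores.get)
    return best, scores
```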

18
Classification &amp; Imitation Schematic
Use of the method: classifying movement
19
Example: Yo-yo Imitation
Use of the method: imitating movement
Observed yo-yo motion (from MegaMocap V2)
yo-yo reconstruction from waving
yo-yo from the twist
yo-yo from punching
yo-yo from cabbage patch
20
Details of Yo-yo Reconstruction
Use of the method: classifying movement
  • Waving vocabulary contains 2 primitives
  • dark red (wave down) and light red (wave up)
  • Predicted end-effector location is matched against the observed end-effector location
  • green (current trajectory horizon)

21
Probabilistic Behavior Classification
Use of the method: classifying movement
  • Behavior primitives are models
  • Can use the flow-field representation; in this case, radial basis functions were used
  • We can apply a Bayesian classifier: P(C|X) ∝ P(X|C) P(C)
  • C is a class (behavior)
  • X is an observation (joint angles)
  • P(X|C) can be determined from the primitives
  • Classifier operates in real time on joint-angle data
  • Applications: human avoidance, interactive tasks with human operators and collaborators and/or other robots (a minimal classifier sketch follows the citation below)

E. Drumwright, M. J. Mataric, "Generating and Recognizing Free-Space Movements in Humanoid Robots", IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS-2003), Las Vegas, Nevada, Oct 25-30, 2003.
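A minimal sketch of the Bayes rule above, with a diagonal-Gaussian likelihood standing in for the radial-basis-function models actually derived from the primitives; all names and model shapes are assumptions for illustration.

```python
import numpy as np

def gaussian_likelihood(x, mean, var):
    """Diagonal-Gaussian stand-in for P(X|C); the actual system used
    radial-basis-function models derived from the primitives."""
    return np.exp(-0.5 * np.sum((x - mean) ** 2 / var)) / np.sqrt(np.prod(2 * np.pi * var))

def classify(x, class_models, priors):
    """Bayesian classification P(C|X) ∝ P(X|C) P(C) over behavior classes.

    x            : joint-angle observation, shape (D,)
    class_models : dict name -> (mean, var) per class (hypothetical models)
    priors       : dict name -> P(C)
    """
    posteriors = {}
    for name, (mean, var) in class_models.items():
        posteriors[name] = gaussian_likelihood(x, mean, var) * priors[name]
    total = sum(posteriors.values())
    if total > 0:
        posteriors = {k: v / total for k, v in posteriors.items()}  # normalize
    return max(posteriors, key=posteriors.get), posteriors
```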
22
Bayesian Behavior Classification
Use of the method: classifying movement
  • Model is a distribution of joint angles over time (below)
  • Actual distribution is multivariate (variables: the DOFs used by the primitive behaviors)

Figure: mixture spaces between 2 exemplars of the jab primitive for (left) one shoulder DOF and (right) a second shoulder DOF
23
Bayesian Classification Results
Use of the method: classifying movement
  • Classification of novel movement is highly accurate

24
Humanoid Motion Planning
Next problem: humanoid motion planning
  • Goal
  • Synthesize real-time, collision-free humanoid motion in dynamic environments
  • Approach
  • Use demonstrated motion data to compute a meaningful representation of valid motions
  • This enables:
  • fast determination of collision-free paths
  • adaptation to new obstacles

25
Humanoid Motion Planning
Next problem: humanoid motion planning
  • Approach
  • Use pre-computed probabilistic roadmaps to represent the valid motion space of the humanoid
  • Temporarily disable the parts of the roadmap that become invalid when obstacles are perceived; if the remaining part is not enough, perform on-line planning
  • Contribution
  • Introduction of dynamic roadmaps for motion planning, joining the advantages of multi-query methods (PRMs, PRTs, VGs) and single-query methods (RRTs, Exp. Spaces, SBLs)
  • Solutions for the humanoid case, e.g., the use of demonstrated motions to construct suitable roadmaps

26
Details of the Approach (1/3)
Next problem: humanoid motion planning
  • Roadmap computation
  • In a high-dimensional configuration space comprising both arms and the torso
  • Pre-computed using PRM sampling
  • Use density limits to achieve uniform sampling of end-effector positions in the reachable workspace
  • Sample postures in the subspace covered by the demonstrated data (current work)
  • Even without considering obstacles, the roadmap is useful for deriving motions without self-collisions (see the sampling sketch below)

Figures: roadmaps with 22 DOFs and 17 DOFs
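A minimal sketch of node sampling with a density limit, assuming hypothetical callables for posture sampling, forward kinematics, and self-collision checking; the spacing threshold is an arbitrary illustrative value.

```python
import numpy as np

def build_roadmap(sample_posture, forward_kinematics, is_self_collision_free,
                  n_samples=2000, min_ee_spacing=0.05):
    """Sketch of PRM-style node sampling with a density limit on end-effector
    positions, so nodes cover the reachable workspace roughly uniformly.

    sample_posture         : () -> (D,) random joint configuration
    forward_kinematics     : (D,) -> (3,) end-effector position
    is_self_collision_free : (D,) -> bool
    (all three are hypothetical callables; min_ee_spacing is an assumed value)
    """
    nodes, ee_positions = [], []
    for _ in range(n_samples):
        q = sample_posture()
        if not is_self_collision_free(q):
            continue
        ee = forward_kinematics(q)
        # Density limit: reject samples whose end-effector lands too close
        # to an already accepted node
        if ee_positions and np.min(np.linalg.norm(np.array(ee_positions) - ee, axis=1)) < min_ee_spacing:
            continue
        nodes.append(q)
        ee_positions.append(ee)
    return nodes  # edges would be added by connecting nearby, collision-free pairs
```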
27
Details of the Approach (2/3)
Next problem: humanoid motion planning
  • On-line roadmap maintenance
  • When obstacles are detected, invalid edges and nodes are disabled
  • A workspace cell decomposition is used for fast localization of invalid nodes and edges
  • The time required to update the roadmap depends on the complexity of the environment and robot (collision detection)
  • Trade-offs with on-line planning: roadmap updating is not suitable for highly dynamic environments, but is fine for pick-and-place applications (a minimal cell-to-edge sketch follows)
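A minimal sketch of the cell-based bookkeeping, assuming a precomputed mapping from workspace cells to the roadmap edges passing through them; the graph structure, swept-volume computation, and collision checking are omitted.

```python
from collections import defaultdict

class DynamicRoadmap:
    """Toy dynamic-roadmap bookkeeping: a precomputed map from workspace
    cells to the roadmap edges that sweep through them, so edges can be
    disabled and re-enabled quickly as obstacles appear and disappear.
    (Cell indices and edge ids are hypothetical.)"""

    def __init__(self, cell_to_edges):
        # cell_to_edges: dict cell_index -> set of edge ids swept through that cell
        self.cell_to_edges = defaultdict(set, cell_to_edges)
        self.disabled_edges = set()

    def obstacles_changed(self, occupied_cells):
        """Recompute the disabled-edge set from the currently occupied cells."""
        self.disabled_edges = set()
        for cell in occupied_cells:
            self.disabled_edges |= self.cell_to_edges[cell]

    def is_edge_valid(self, edge_id):
        return edge_id not in self.disabled_edges
```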

28
Details of the Approach (3/3)
Next problem: humanoid motion planning
  • On-line query
  • An A*-like graph search quickly finds a path to the roadmap node nearest the goal posture
  • If that node cannot be directly connected to the goal posture, on-line single-query planning is used (currently bi-directional RRTs)
  • Better results when few portions of the roadmap are invalidated
  • Worst cases achieve results similar to using single-query planning alone (see the query sketch below)
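A minimal sketch of the query logic under assumed interfaces (roadmap.neighbors, a local connect test, and an RRT fallback planner); it is not the planner's actual code.

```python
import heapq
import itertools

def query(roadmap, start, goal, connect, rrt_connect_plan, heuristic):
    """A*-like search over the (still valid) roadmap toward the goal posture,
    falling back to single-query planning (e.g., bi-directional RRT) when the
    roadmap cannot reach it. All interfaces here are hypothetical:
    roadmap.neighbors(n) -> iterable of (neighbor, cost) over valid edges,
    connect(a, b) -> collision-free local path or None,
    rrt_connect_plan(a, b) -> path."""
    counter = itertools.count()                 # tie-breaker for the heap
    frontier = [(heuristic(start, goal), next(counter), 0.0, start, [start])]
    visited = set()
    while frontier:
        _, _, g, node, path = heapq.heappop(frontier)
        if node in visited:
            continue
        visited.add(node)
        if connect(node, goal) is not None:     # direct link to the goal posture?
            return path + [goal]
        for nbr, cost in roadmap.neighbors(node):
            if nbr not in visited:
                f = g + cost + heuristic(nbr, goal)
                heapq.heappush(frontier, (f, next(counter), g + cost, nbr, path + [nbr]))
    # Roadmap could not reach the goal: fall back to single-query planning
    return rrt_connect_plan(start, goal)
```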

29
Validation Results With Robonaut
Next problem: humanoid motion planning
  • Example motions
  • Visualization geometry: 23,930 triangles
  • Collision geometry: 1,016 triangles
  • Optimization (smoothing) takes about 0.3 s (Pentium III 2.8 GHz)
30
Path Optimization (1/2)
Next problem: humanoid motion planning
  • Incremental path linearization
  • Simple and efficient in most cases
  • May be time-consuming, as collision detection must be invoked before each local linearization

31
Path Optimization (2/2)
Next problem: humanoid motion planning
  • Incremental path linearization
  • Simple and efficient in most cases
  • May be time-consuming, as collision detection must be invoked before each local linearization
  • Sub-configuration linearization may be required, e.g., to decouple arm motions (a minimal shortcut sketch follows)
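A minimal sketch of shortcut-style path linearization under an assumed segment collision check; the number of attempts is an arbitrary illustrative value.

```python
import random

def linearize_path(path, segment_collision_free, n_attempts=100):
    """Incremental path linearization (shortcutting): repeatedly try to replace
    the sub-path between two randomly chosen configurations with the straight
    segment joining them, keeping the shortcut only if it is collision-free.

    path                   : list of configurations
    segment_collision_free : (a, b) -> bool, the (expensive) local collision check
    n_attempts             : number of shortcut attempts (assumed value)
    """
    path = list(path)
    for _ in range(n_attempts):
        if len(path) < 3:
            break
        i, j = sorted(random.sample(range(len(path)), 2))
        if j - i < 2:
            continue
        # Collision detection must be invoked before accepting each shortcut
        if segment_collision_free(path[i], path[j]):
            path = path[:i + 1] + path[j:]
    return path
```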

32
Summary
  • Getting more, better data
  • A light-weight, low-cost motion-capture mechanism
  • Real-world validation of the method
  • Successful application of the method to Robonaut data
  • Application of the method I
  • Successful synthesis of novel humanoid motion from automatically derived movement primitives
  • Application of the method II
  • Efficient movement classification, prediction, and imitation
  • The next big problem
  • Humanoid motion planning around obstacles, validated on Robonaut; dynamic obstacles to be addressed next

33
Contributors and More Info
Chad Jenkins
Marcelo Kallmann
Chi-Wei Chu
Evan Drumwright
Nathan Miller
  • Additional info, papers, videos
  • http://robotics.usc.edu/agents/Mars2020/mars2020.html