Title: Manipulation in Human Environments
1. Manipulation in Human Environments
- Aaron Edsinger and Charlie Kemp
- Humanoid Robotics Group
- MIT CSAIL
2. Domo
- 29 DOF
- 6 DOF Series Elastic Actuator (SEA) arms
- 4 DOF SEA hands
- 2 DOF SEA neck
- Active vision head
- Stereo cameras
- Gyroscope
- Sense joint angle and torque
- 15-node Linux cluster
3. Manipulation in Human Environments
Human environments are designed to match our cognitive and physical abilities
- Work with everyday objects
- Collaborate with people
- Perform useful tasks
4. Applications
- Aging in place
- Cooperative manufacturing
- Household chores
5. Three Themes
- Let the body do the thinking
- Collaborative manipulation
- Task relevant features
7. Let the Body Do the Thinking
- Design
- Passive compliance
- Force control
- Human morphology
8. Let the Body Do the Thinking
- Compensatory behaviors
- Reduce uncertainty
- Modulate arm stiffness
- Aid perception (motion, visibility)
- Test assumptions (explore)
9. Let the Body Do the Thinking
10. Collaborative Manipulation
- Complementary actions
- Person can simplify perception and action for the robot
- Robot can provide intuitive cues for the human
- Requires matching to our social interface
11. Collaborative Manipulation
Social amplification
12. Collaborative Manipulation
- A third arm
- Hold a flashlight
- Fixture a part
- Extend our physical abilities
- Carry groceries
- Open a jar
- Expand our workspace
- Place dishes in a cabinet
- Hand a tool
- Reach a shelf
13. Task Relevant Features
- What is important?
- What is irrelevant?
Distinct from object detection/recognition.
14. Structure in Human Environments
Donald Norman, The Design of Everyday Things
15. Structure in Human Environments
Human environments are constrained to match our cognitive and physical abilities
- Sense from above
- Flat surfaces
- Objects for human hands
- Objects for use by humans
17. Why are tool tips common?
- Single, localized interface to the world
- Physical isolation helps avoid irrelevant contact
- Helps perception
- Helps control
19. Tool Tip Detection
- Visual motor detection method
- Kinematic Estimate
- Visual Model
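The slide pairs a kinematic estimate of the tip with a visual model. One simple way these two sources might be combined is inverse-variance weighting of the two pixel estimates; a minimal sketch, assuming illustrative function names and pixel uncertainties (the deck does not specify its fusion rule):

```python
import numpy as np

def fuse_tip_estimates(kinematic_px, visual_px, sigma_kin=15.0, sigma_vis=5.0):
    """Fuse a kinematic tip prediction with a visual detection, both given
    in image pixels, by inverse-variance weighting. sigma_kin and sigma_vis
    are assumed 1-sigma pixel uncertainties for each source (hypothetical
    values, not measurements from the talk)."""
    w_kin = 1.0 / sigma_kin ** 2
    w_vis = 1.0 / sigma_vis ** 2
    fused = (w_kin * np.asarray(kinematic_px, float) +
             w_vis * np.asarray(visual_px, float)) / (w_kin + w_vis)
    return fused
```

With the defaults, the visual detection (lower assumed variance) dominates: fusing (100, 100) and (110, 110) lands at (109, 109).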
21. Mean Pixel Error for Automatic and Hand-Labeled Tip Detection
22. Mean Pixel Error for Hand-Labeled, Multi-Scale Detector, and Point Detector
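The evaluation metric on these slides is mean pixel error against hand-labeled ground truth, which is just the mean Euclidean distance between detected and labeled tip locations; a minimal sketch:

```python
import numpy as np

def mean_pixel_error(detections, labels):
    """Mean Euclidean distance in pixels between detected tip locations
    and hand-labeled ground truth. Both inputs are (N, 2) arrays of
    (x, y) pixel coordinates in matching order."""
    diff = np.asarray(detections, float) - np.asarray(labels, float)
    return float(np.mean(np.linalg.norm(diff, axis=1)))
```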
23. Model-Free Insertion
- Active tip perception
- Arm stiffness modulation
- Human interaction
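The arm stiffness modulation this slide mentions is commonly realized with a joint-space impedance law on a series-elastic arm; a minimal sketch of that idea, with illustrative gains (not Domo's actual controller):

```python
def joint_torques(q, q_des, stiffness, damping, dq):
    """Joint-space impedance law: tau_i = k_i * (q_des_i - q_i) - b_i * dq_i.
    Lowering the entries of `stiffness` makes the arm compliant, so that
    unexpected contact during insertion produces small forces instead of
    large ones. Gains here are hypothetical."""
    return [k * (qd - qi) - b * dqi
            for k, b, qi, qd, dqi in zip(stiffness, damping, q, q_des, dq)]
```

Running the same position error through a stiff gain and a soft gain shows the trade-off: the soft setting commands proportionally less torque at contact.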
24. Other Examples
- Circular openings
- Handles
- Contact surfaces
- Gravity alignment
25. Future: Generalize What You've Learned
- Across objects
- Perceptually map tasks across objects
- Key features map to key features
- Across manipulators
- Motor equivalence
- Manipulator details may be irrelevant
26. RSS 2006 Workshop
- Manipulation for Human Environments
- Robotics: Science and Systems, University of Pennsylvania, August 19th, 2006
- manipulation.csail.mit.edu/rss06
27. Summary
- Importance of Task Relevant Features
- Example of the tool tip
- Large set of hand tools
- Robust detection (visual motor)
- Kinematic estimate
- Visual model
28. In Progress
- Perform a variety of tasks
- Insertion
- Pouring
- Brushing
29. Learning from Demonstration
30. The Detector Responds To
- Fast motion
- Convexity
31. Motion-Weighted Edge Map
- Video from eye camera
- Local maxima
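The motion-weighted edge map can be sketched as edge strength in the current frame weighted by interframe motion, with tip candidates taken at local maxima. This is a simplified stand-in (gradient magnitude times absolute frame difference), not the talk's exact pipeline:

```python
import numpy as np

def motion_weighted_edges(prev, curr):
    """Edge magnitude of the current frame (via np.gradient) weighted by
    the absolute difference from the previous frame, so only moving
    edges score highly. A simplified illustration of the idea."""
    gy, gx = np.gradient(curr.astype(float))
    edges = np.hypot(gx, gy)
    motion = np.abs(curr.astype(float) - prev.astype(float))
    return edges * motion

def local_maxima(m, thresh=0.0):
    """Pixels strictly greater than their four neighbours and above
    thresh; these serve as tip candidates."""
    pts = []
    for i in range(1, m.shape[0] - 1):
        for j in range(1, m.shape[1] - 1):
            v = m[i, j]
            if (v > thresh and v > m[i - 1, j] and v > m[i + 1, j]
                    and v > m[i, j - 1] and v > m[i, j + 1]):
                pts.append((i, j))
    return pts
```

For a vertical intensity edge that shifts one column between frames, the map responds only along the column that moved.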
32. Defining Characteristics
- Geometric
- Isolated
- Distal
- Localized
- Convex
- Cultural/Design
- Far from natural grasp location
- Long distance relative to hand size
33. Other Task Relevant Features?
34. Detecting the Tip
35. Include Scale and Convexity
36. Distinct Perceptual Problem
- Not object recognition
- How should the object be used?
- Distinct methods and features
39. Use the Hand's Frame
- Combine weak evidence
- Rigidly grasped
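Because the tool is rigidly grasped, its tip is stationary in the hand's coordinate frame, so many weak per-frame detections can be re-expressed there and averaged. A minimal sketch, assuming hypothetical pose representations (world position plus rotation matrix per frame):

```python
import numpy as np

def to_hand_frame(point_world, hand_pos, hand_rot):
    """Express a world-frame point in the hand's frame: p_h = R^T (p_w - t),
    where (t, R) is the hand's world pose. A rigidly grasped tool tip is
    fixed in this frame even as the arm moves."""
    R = np.asarray(hand_rot, float)
    return R.T @ (np.asarray(point_world, float) - np.asarray(hand_pos, float))

def accumulate_tip(detections_world, hand_poses):
    """Average noisy per-frame tip detections in the hand frame, where
    the true tip is stationary, so detection noise averages out."""
    pts = [to_hand_frame(p, pos, rot)
           for p, (pos, rot) in zip(detections_world, hand_poses)]
    return np.mean(pts, axis=0)
```

For example, detections taken while only the hand's position changes all map to the same hand-frame point.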
41. Acquire a Visual Model