1
Vision Based Motion Control
  • Martin Jagersand
  • University of Alberta
  • CIRA 2001

3
Content
  1. Vision based motion control
  2. Programming and solving whole human tasks
  3. Software systems for vision and control
  4. Discussion

4
1. How to go from visual sensation to motor action?
  • Camera -> Robot coord; Robot -> Object

5
Closed loop traditional visual servoing
  • This talk focuses on estimating the geometric
    transforms

6
Lots of possible coordinates
  • Camera
    • Frame at projection center
    • Many different models
  • Robot
    • Base frame
    • End-effector frame
  • Object frame

Traditional modeling: P = P1(<params>) P2(<params>) ... Pn(<params>)
7
Hand-Eye system
  • Motor-visual function y = f(x)
  • Jacobian J = ( df_i / dx_j )

8
Recall: Visual specifications
  • Point to Point task error

Why 16 elements?
9
Visual Servoing
  • Observed features
  • Motor variables
  • Local linear model
  • Visual servoing steps (sketch below):
    1. Solve
    2. Move

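A minimal sketch in Matlab of one solve-and-move step under the local linear model dy ≈ J dx; the names y, ystar, J, x and the gain lambda are hypothetical placeholders, not from the original slides:

% one visual servoing step: solve the local linear model, then move
dy = ystar - y;           % desired image-space change (m-by-1)
dx = pinv(J) * dy;        % least-squares motor correction (n-by-1)
x  = x + lambda * dx;     % move with a step gain 0 < lambda <= 1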
10
Find J, Method 1: Test movements along basis
  • Remember J is unknown m by n matrix
  • Assume movements
  • Finite difference

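A sketch of the finite-difference estimate; the helpers movetox(x) and readfeatures(), the start configuration x0, the step size delta, and the dimensions m, n are hypothetical placeholders:

% estimate the m-by-n Jacobian column by column with small test moves
y0 = readfeatures();
J  = zeros(m, n);
for j = 1:n
  e = zeros(n, 1);  e(j) = delta;    % basis move of size delta along motor axis j
  movetox(x0 + e);
  J(:, j) = (readfeatures() - y0) / delta;
end
movetox(x0);                         % return to the start configuration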
11
Find J, Method 2: Secant Constraints
  • Constraint along a line
  • Defines m equations
  • Collect n arbitrary, but different measures y
  • Solve for J

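A sketch of the batch secant estimate, assuming the n motor moves are stacked as columns of DX (n-by-n) and the corresponding visual changes as columns of DY (m-by-n):

% each pair satisfies the secant constraint J*dx_k = dy_k (m equations each)
% stacking n pairs gives J*DX = DY, solved in the least-squares sense
J = DY / DX;     % equivalent to DY * inv(DX) when DX is square and well conditioned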
12
Find J, Method 3: Recursive Secant Constraints
  • Based on initial J and one measure pair
  • Adjust J s.t.
  • Rank 1 update
  • Consider rotated coordinates
  • Update same as finite difference for n orthogonal
    moves

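This rank-1 adjustment is the familiar Broyden secant update; a sketch with dx the executed motor move and dy the observed visual change:

% rank-1 update so the new estimate satisfies the secant constraint J*dx = dy
J = J + ((dy - J * dx) * dx') / (dx' * dx);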
13
Trust region of J estimate
  • Let be the trust region at time t
  • Define a model agreement
  • Update the trust region recursively

where dupper and dlower are predefined constants
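One common way to realize such an update (a sketch; the agreement measure and the thresholds rho_good and rho_bad are assumptions, not taken from the slides):

% model agreement: how well the current J predicted the observed change
rho = norm(dy - J * dx) / norm(dy);
if rho < rho_good
  d = dupper * d;          % good agreement: grow the trust region
elseif rho > rho_bad
  d = dlower * d;          % poor agreement: shrink the trust region
end
% the next motor step is then limited to norm(dx) <= d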
14
Visual Servoing Steps
  1. Solve
  2. Update and move
  3. Read actual visual move
  4. Update Jacobian

repeat
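Putting the four steps together, a sketch of the servo loop that reuses the hypothetical movetox/readfeatures helpers and the rank-1 Jacobian update from above:

% closed-loop visual servoing (sketch)
while norm(ystar - y) > tol
  dx = lambda * (pinv(J) * (ystar - y));          % 1. solve for the motor correction
  movetox(x + dx);                                % 2. update and move
  ynew = readfeatures();                          % 3. read the actual visual move
  dy   = ynew - y;
  J    = J + ((dy - J * dx) * dx') / (dx' * dx);  % 4. rank-1 Jacobian update
  x = x + dx;  y = ynew;
end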
16
Jacobians: Spline model of the underlying non-linear function
  • Over time acquires several Jacobians J
  • Each J a hyperplane
  • Collection of Js form a (sparse) piecewise
    linear spline

17
Jacobian based visual model
  • Assume visual features m >> n motor freedoms
  • All visual change restricted to n freedoms by
  • Can predict visual change
  • Can also parameterize x visually

18
Related visual model: Affine model
  (Figure: affine basis vectors e1, e2, e3 and origin O)
  • Affine basis
  • Image projection of origin
  • Image basis

19
Find affine coordinates
  (Figure: point q in the affine basis e1, e2, e3 with origin O)
  • Observe (track) y through time
  • Solve an equation system to find q
  • Reprojection: have q, want y

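A sketch of both directions, assuming yO stacks the tracked image positions of the origin over time and E = [e1 e2 e3] the corresponding image projections of the affine basis vectors (all names hypothetical):

% find the affine coordinates q of a tracked point y:  y - yO = E*q
q = E \ (y - yO);        % least-squares solution of the equation system
% reprojection: have q, want y
yhat = yO + E * q;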
20
Relation: Affine and Jacobian image models

21
Composite affine and Jacobian model
  • Chain the affine and Jacobian model
  • Represents rigid objects in arbitrary motor frame

22
Transforms: Affine-Jacobian model
  • Measurement matrix
  • Affine coordinate equation

23
Experiment: Affine animation of rigid structure
24
Affine vs. Visual-Motor
25
Other sensory modalities: Force and contact
manipulation
  • Accuracy is limited by visual tracking and
    visual goal specification
  • Specifying well-defined visual encodings can be
    difficult
  • Limited to non-occluded settings
  • Not all tasks lend themselves to visual
    specification.

26
Constraint Geometry
  • Impact force along surface normal
  • Sliding motion
  • 3rd vector

27
Constraint Frame
  • With force frame = tool frame we get
  • Assume frictionless -> can update each time step

  (Figure: constraint frame directions P1, P2, P3)
28
Hybrid Control Law
  • Let Q = Joint -> Tool Jacobian
  • Let S be a switching matrix, e.g. diag(0,1,1)
  • Velocity control u combines a visual part and a
    force part (see the sketch below)
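One common way to realize this (a sketch under the assumption that S selects the vision-controlled directions in the tool/constraint frame, with the remaining direction force controlled; vvision and vforce are hypothetical commanded velocities):

% hybrid vision/force velocity command (sketch)
S     = diag([0 1 1]);                        % axis 1 force controlled, axes 2-3 vision controlled
vtool = S * vvision + (eye(3) - S) * vforce;  % mix the two parts in the tool frame
u     = Q \ vtool;                            % joint velocities, Q: joint -> tool Jacobian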
29
Accounting for Friction
  • Friction force is along motion direction!
  • Subtract out to recover surface normal

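A sketch of the subtraction, assuming f is the measured contact force and v the current sliding velocity (both hypothetical names):

% friction acts along the motion direction: remove that component from f
vhat = v / norm(v);                 % unit sliding direction
fn   = f - (f' * vhat) * vhat;      % force with the along-motion (friction) part removed
nhat = fn / norm(fn);               % estimated surface normal direction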
30
Motion Sequence
31
Motion Sequence
32
Summary of model estimation and visual motion
control
  • Model estimation is on-line and requires no
    special calibration movements
  • Resulting Jacobians both model/constrain the
    visual situation and provide the visual-motor transform
  • Motion control goes directly from image-based error
    functions to motor control; no 3D world space needed

33
2. How to specify a visual task sequence?
  1. Grasp
  2. Move in
  3. Cut
  • Grasp:
    • Reach close
    • Align
    • Turn

34
Recall Parallel Composition Example
Visual error function spelled out
35
Serial Composition: Solving whole real tasks
  • Task primitive/link
  • Acceptable initial (visual) conditions
  • Visual or Motor constraints to be maintained
  • Final desired condition
  • Task

36
Natural primitive links
  • Transportation
  • Coarse primitive for large movements
  • < 3DOF control of object centroid
  • Robust to disturbances
  • Fine Manipulation
  • For high precision control of both position and
    orientation
  • 6DOF control based on several object features

37
Example: Pick-and-place type of movement
  • 3. Alignment???
  • To match transport final to fine manipulation
    initial conditions

38
More primitives
  • 4. Guarded move
  • Move along some direction until an external
    constraint (e.g. contact) is satisfied.
  • 5. Open loop movements
  • When object is obscured
  • Or ballistic fast movements
  • Note can be done based on previously estimated
    Jacobians

39
Solving the puzzle
40
Teaching and Programming in Visual Space
  • Tele-assistance
  • A tele-operator views the scene through stereo
    cameras
  • Objects to be manipulated are pointed out on-line
  • Visual Programming
  • Off-line
  • Like xfig, macpaint, but with a palette of motor
    actions.
  • Teaching by Showing
  • A (human) manipulation is tracked in visual space
  • The tracked data is used to (automatically?)
    generate a sequence of visual goals

41
HCI Direct manipulation. Example: xfig drawing
program
  • Icons afford use
  • Results visible
  • Direct spatial action-result mapping

Matlab drawing:
line([10, 20], [30, 85])
patch([35, 22], [15, 35], C)   % C is a complex structure
text(70, 30, 'Kalle')          % potentially add font, size, etc.
42
Example: Visual programming
43
Task control summary
  • Servoing alone does not solve whole tasks
  • Parallel composition: stacking of visual
    constraints to be simultaneously satisfied
  • Serial composition: linking together several
    small movements into a chain of continuous
    movements
  • Vision-based user interface
  • Tele-assistance
  • Visual Programming
  • Teach by showing

44
Types of robotic systems
(Chart: preprogrammed systems, programming by demonstration,
 tele-assistance, and supervisory control placed along axes of
 Autonomy and Generality)
45
3. Software systems for vision-based control
46
Hand-Eye System
47
System requirements
  • Solve many very different motion tasks
  • Flexible, teachable/re-programmable
  • Real time
  • On special embedded computers or general
    workstations
  • Different special HW
  • Multiprocessors

48
Toolbox
49
System design
  • Interpreted scripting language gives
    flexibility
  • Compiled language needed for speed and HW
    interface.
  • Examples

Matlab                      Haskell
PVM, dyn. linking (mex)     Greencard
C, C++, Fortran             C, C++
50
Usage example
  • Specialize robot
    projandwait(zero3, robotmovehill, A3D, WaitForHill)
  • Initialize goals and trackers
    [TrackCmd3D, N] = InitTrackers(1 1,0,1)
    PU = GetGoals(1 1,0,1)
  • Servo control
    J3s = LineMove(projandwait, TrackCmd3D, J3i, PU, Ndi, err)

51
Software systems summary
  • Most current demos solve one specific movement
  • For solving many everyday tasks we need
    flexibility and reprogrammability
  • Compiled primitive visual tracking, and
  • Interpreted scripting language
  • Higher order functions

52
Workshop conclusions
  • ?

53
Workshop conclusions
  • Sensing is unreliable and incomplete
  • Can't reliably build internal 3D world models,
    but can use the real world as an external
    reference.
  • A-priori object and world models uncommon in
    human environments
  • Estimate on-line and only what's needed.
  • Human users require human interaction techniques
  • Interacting by visual pointing and gestures is
    natural.

54
Action/Perception division in human and machine
hand-eye systems
55
Open questions?
  • For particular tasks what are the most natural
    representations and frames?
  • Global convergence of arbitrarily composed visual
    error functions?
  • Robustness?
  • Interaction with other sensing modalities?

56
Feedback system
  • Fast internal feedback
  • Slower external trajectory corrections

57
Short and long control loops
58
Applications for vision in User Interfaces
  • Interaction with machines and robots
  • Service robotics
  • Surgical robots
  • Emergency response
  • Interaction with software
  • A store or museum information kiosk

59
Service robots
  • Mobile manipulators, semi-autonomous

(Images: DIST, TU Berlin, KAIST)
60
TORSO with 2 WAMs
61
Service tasks
This is completely hardwired! Found no real task
on the WWW
62
But
  • Maybe first applications in tasks humans can't do?

63
Why is humanlike robotics so hard to achieve?
  • See human task
  • Tracking motion, seeing gestures
  • Understand
  • Motion understanding: translate to correct
    reference frame
  • High level task understanding?
  • Do
  • Vision based control