Title: Control of Multiple Autonomous Robot Systems
1. Control of Multiple Autonomous Robot Systems
GRASP Laboratory, http://www.cis.upenn.edu/mars
- Vijay Kumar
- Camillo Taylor
- Aveek Das
- Guilherme Pereira
- John Spletzer
2. Multiple Autonomous Robots
- Hybrid Systems Approach to Robot Software
- modes as behaviors
- composition of modes
- Cooperative Control of Multiple Robots
- cooperative manipulation
- formation control
- tracking
- pursuit
- Human interaction
- visualization
3. Vision for Multi-Robot Teams
- Mobile platforms for deploying cameras into an environment
- The case for cameras
- Small
- Cheap
- Passive
- Low Power
- Uses for imagery
- Visualization of remote environments
- Obtaining information about targets
- Position, level of activity, etc.
- Basis for convenient human robot interfaces
4. Visualization of Remote Environments
- Registered omnidirectional images can be used to
visualize remote scenes
5. Visualizing the scene
- Scene can be interactively explored and/or
revisited with a new camera trajectory specified
by the user
6. GRASP Laboratory
7. View Synthesis with Quasi-Sparse Correspondences
- Dense correspondences can be difficult to obtain due to:
- Occluded regions
- Homogeneous image regions
- Strategy (see the sketch below):
- Focus on accurately reproducing the motion of edges in the scene
- Use interpolation to estimate the motion of the other points
- Basis for visualization in MARS 2020
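A minimal sketch of the interpolation step, assuming edge correspondences have already been found; the function name, the use of scipy's griddata, and the linear/nearest fill strategy are illustrative choices, not the original MARS implementation.

```python
"""Quasi-sparse view interpolation sketch: reliable motion is known only
at edge pixels, and the motion of the remaining pixels is interpolated."""
import numpy as np
from scipy.interpolate import griddata

def densify_edge_motion(edge_xy, edge_flow, height, width):
    """Interpolate sparse edge motion vectors into a dense flow field.

    edge_xy   : (N, 2) array of (x, y) edge-pixel locations
    edge_flow : (N, 2) array of (dx, dy) motion at those edge pixels
    Returns an (H, W, 2) dense flow field.
    """
    xs, ys = np.meshgrid(np.arange(width), np.arange(height))
    dense = np.zeros((height, width, 2))
    for c in range(2):  # interpolate dx and dy separately
        # linear interpolation inside the convex hull of the edge pixels,
        # nearest-neighbour fill outside it
        lin = griddata(edge_xy, edge_flow[:, c], (xs, ys), method="linear")
        near = griddata(edge_xy, edge_flow[:, c], (xs, ys), method="nearest")
        dense[:, :, c] = np.where(np.isnan(lin), near, lin)
    return dense
```

The dense field can then be used to warp pixels into the novel view: edge pixels anchor the motion accurately, while homogeneous regions simply inherit interpolated values.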
8. Novel view movies
9. Freespace Reasoning
- We can reason about the structure of space by considering the union of the freespace volumes induced by a collection of triangulated disparity maps (sketched below).
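A minimal 2-D sketch of the freespace-union idea, assuming each disparity map has been converted to per-ray depths; the grid representation and function names are illustrative, not the original 3-D implementation.

```python
"""Each ray with a measured depth carves out free cells between the camera
and the observed surface; repeating this over all viewpoints yields the
union of the individual freespace volumes."""
import numpy as np

def carve_freespace(grid, origin, angles, depths, cell_size=0.1):
    """Mark grid cells as free along each ray up to its measured depth.

    grid    : 2-D int array, 0 = unknown, 1 = known free
    origin  : (x, y) camera position in world coordinates (non-negative)
    angles  : array of ray directions (radians)
    depths  : array of measured depths along those rays
    """
    for theta, depth in zip(angles, depths):
        for s in np.arange(0.0, depth, cell_size / 2):
            x = origin[0] + s * np.cos(theta)
            y = origin[1] + s * np.sin(theta)
            i, j = int(y / cell_size), int(x / cell_size)
            if 0 <= i < grid.shape[0] and 0 <= j < grid.shape[1]:
                grid[i, j] = 1  # freespace seen from this viewpoint
    return grid

# Carving the same grid from several viewpoints only ever adds free cells,
# so the final grid approximates the union of the freespace volumes.
```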
10. Results with 3D reasoning
11. Multi-Eyed Stereo Systems
- Locations of targets and objects in the environment can be deduced from image measurements acquired by the robots (see the triangulation sketch below)
- The robot team can effectively be viewed as a multi-eyed stereo system
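A minimal sketch of how two robots' bearing measurements determine a target location, assuming planar poses and bearings measured relative to each robot's heading; this is an illustration of the idea, not the system's actual fusion code.

```python
"""Two-robot bearing-only triangulation: each robot i at pose
(xi, yi, thetai) measures a bearing alpha_i to the target, and the target
position is the intersection of the two bearing rays."""
import numpy as np

def triangulate(pose1, alpha1, pose2, alpha2):
    """Intersect two bearing rays; poses are (x, y, heading) in radians.
    The rays must not be parallel, or the linear system is singular."""
    p1 = np.array(pose1[:2])
    p2 = np.array(pose2[:2])
    d1 = np.array([np.cos(pose1[2] + alpha1), np.sin(pose1[2] + alpha1)])
    d2 = np.array([np.cos(pose2[2] + alpha2), np.sin(pose2[2] + alpha2)])
    # Solve p1 + t1*d1 = p2 + t2*d2 for the ray parameters t1, t2
    A = np.column_stack((d1, -d2))
    t = np.linalg.solve(A, p2 - p1)
    return p1 + t[0] * d1  # estimated (x_t, y_t)
```

With more than two robots the rays generally do not intersect exactly and a least-squares estimate is used instead, which is the sense in which the team behaves as a multi-eyed stereo system.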
12. Sensor Planning and Control
- An interesting property of these robot teams: estimates for various parameters of interest are obtained by combining measurements from multiple, distributed sensors
- We could choose to view our team as a multi-eyed stereo system where the eyes can be moved
- Question
- Given that the sensor platforms are mobile, how should they be deployed in order to produce the best estimates?
13. Theoretical Framework
- C_R denotes the robot configuration space; r is an element of C_R
- C_W denotes the feature configuration space; w is an element of C_W
- z denotes the vector of measurements obtained by the robot team

Example (two robots observing a point target):
r = (x1, y1, θ1, x2, y2, θ2)^T,  w = (xt, yt)^T,  z = (α1, α2)^T

(Figure: planar configuration showing the two robot poses, the target position (xt, yt), and the bearing angles α1, α2.)
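For concreteness, a plausible bearing measurement model for this example (an assumed form, stated here to connect the quantities) is

αi = atan2(yt − yi, xt − xi) − θi,   i = 1, 2

so that z depends on both the robot configuration r and the feature state w, which is why the placement of the robots affects the quality of the estimate.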
14. Optimization Problem
- Est(r, z) is a function that provides an estimate of the feature state given the robots' configuration and the sensor measurements
- Err(r, w) is a function that returns the expected error between the estimate returned by Est and the actual feature state w for a particular robot configuration r. This will depend upon our model of sensor errors
- P(w) is a probability density function on C_W

Given this terminology, one can define a quality function Q(r) which reflects the expected error in estimating the feature state from a given robot configuration r:

Q(r) = ∫_{C_W} Err(r, w) P(w) dw

Objective: find the robot configuration r in C_R that minimizes Q(r)
15. Computational Approach
The optimization problem of minimizing Q(r) is typically difficult to solve analytically.

Particle filtering approach: approximate P(w) by a set of weighted samples (wj, pj), where wj is a single sample from C_W and pj is a weight reflecting the probability of wj representing the state w. The integral can then be approximated by a tractable summation:

Q(r) ≈ Σ_j Err(r, wj) pj

The resulting function is typically piecewise continuous in r and can be optimized using standard techniques (see the sketch below).
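A minimal sketch of the particle-based approximation for the two-robot bearing example above, assuming Gaussian bearing noise, a crude Monte Carlo estimate of Err, and a generic numerical optimizer (Nelder-Mead); the noise level, particle set, and starting configuration are illustrative, and none of this is the original code.

```python
"""Approximate Q(r) by a weighted sum over target particles and minimize
it over the robot configuration r = (x1, y1, th1, x2, y2, th2)."""
import numpy as np
from scipy.optimize import minimize

RNG = np.random.default_rng(0)
BEARING_SIGMA = 0.05  # assumed bearing noise (radians)

def bearing(pose, target):
    return np.arctan2(target[1] - pose[1], target[0] - pose[0]) - pose[2]

def triangulate(p1, a1, p2, a2):
    d1 = np.array([np.cos(p1[2] + a1), np.sin(p1[2] + a1)])
    d2 = np.array([np.cos(p2[2] + a2), np.sin(p2[2] + a2)])
    A = np.column_stack((d1, -d2))
    # least squares is robust to nearly parallel bearing rays
    t, *_ = np.linalg.lstsq(A, np.array(p2[:2]) - np.array(p1[:2]), rcond=None)
    return np.array(p1[:2]) + t[0] * d1

def expected_error(r, w, draws=20):
    """Err(r, w): mean localization error for target w under noisy bearings."""
    p1, p2 = r[:3], r[3:]
    errs = []
    for _ in range(draws):
        a1 = bearing(p1, w) + RNG.normal(0, BEARING_SIGMA)
        a2 = bearing(p2, w) + RNG.normal(0, BEARING_SIGMA)
        errs.append(np.linalg.norm(triangulate(p1, a1, p2, a2) - w))
    return np.mean(errs)

def Q_hat(r, particles, weights):
    """Q(r) ~= sum_j Err(r, w_j) p_j over the particle set."""
    return sum(p * expected_error(r, w) for w, p in zip(particles, weights))

# Particle set approximating P(w): target believed to be near (5, 5)
particles = RNG.normal([5.0, 5.0], 0.5, size=(30, 2))
weights = np.full(30, 1.0 / 30)

# Optimize the robot configuration starting from an initial deployment
r0 = np.array([0.0, 0.0, 0.0, 1.0, 0.0, 0.0])
res = minimize(Q_hat, r0, args=(particles, weights), method="Nelder-Mead")
print("best configuration:", res.x, "expected error:", res.fun)
```

Because Err is estimated by Monte Carlo, Q_hat is noisy; a derivative-free optimizer such as Nelder-Mead tolerates this, which is one reason it is used in the sketch.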
16. Implementation Example
17. Integrating Sensing and Control
- Piggyback on the particle filtering approach for sensing to obtain the particle set (wi, pi)
- The framework offers a complementary relationship between sensing and control (see the sketch below)
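A minimal sketch of the sensing/control coupling, assuming the Q_hat function and particle set from the previous sketch; the greedy local search used here is an illustrative stand-in for whatever controller the system actually uses.

```python
"""The same particles that summarize the target belief are reused to pick
a small robot motion that reduces the expected estimation error."""

def control_step(r, particles, weights, q_hat, step=0.2):
    """Greedy local search: try small moves of each robot coordinate and
    keep the configuration with the lowest approximated Q(r)."""
    best_r, best_q = r, q_hat(r, particles, weights)
    for i in range(len(r)):
        for delta in (-step, step):
            cand = r.copy()
            cand[i] += delta
            q = q_hat(cand, particles, weights)
            if q < best_q:
                best_r, best_q = cand, q
    return best_r  # next commanded configuration
```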
18. Tracking Targets
19. Tracking Targets (contd.)
20. Handling Obstacles
21. Technology Transfer and Transition
- Robot hardware and software
- University of Colorado
- Oklahoma State University
- Georgia Tech
- Evolution Robotics
- Jim Ostrowski
- DoD Programs
- MARS Teams (2020)