Title: Human-in-the-Loop Control of an Assistive Robot Arm
1. Human-in-the-Loop Control of an Assistive Robot Arm
Katherine Tsui and Holly Yanco, University of Massachusetts Lowell
2. Challenge
Using a standard controller, put the ball in the cup:
1. Turn the cup over.
2. Pick up the ball.
3. Put the ball in the cup.
Average time of execution: 3-5 minutes, even for middle school children with video game experience! Although this is a simplified example, similar tasks may occur repeatedly throughout a handicapped person's daily life.
3. Motivation
- Why?
- Unintuitive controllers
- Operator sensory overload
- For severely handicapped people, activities of daily life are difficult enough to perform.
- While an assistive robotic device like ours allows for limited independence, it can be frustrating and tiresome to operate.
- Let's abstract this away.
4. Hardware
- Manus Assistive Robotic Manipulator (ARM) by Exact Dynamics
  - 6 Degrees of Freedom (DoF)
  - plus a 2 DoF gripper end-effector
  - Joint encoders
- Cameras
  - Shoulder view
  - Gripper view
5. Standard Control
In the out-of-the-box configuration, movement is done through menus accessed via single-switch, keypad, or joystick input.
6. Standard Control: Using the Joint Menu
- Direct Joint Mode: direct control of individual joints
- Joints cannot be moved simultaneously
- Not how humans do it: we don't think in terms like "move shoulder up, rotate wrist, extend forearm," etc.
7. Standard Control: Using the Cartesian Menu
- Direct Cartesian Mode: the gripper moves linearly in 3D
- Joints can move together in space and time
- Still not how humans do it: we don't think in terms of moving left, right, up, down, etc.
8. Alternative Control
- Transparent Mode
- The ARM has Controller Area Network (CAN) communication with a PC.
- The ARM transmits status packets at 20 ms intervals:
  - t = 20 ms: message 0x350 gives ARM status and position
  - t = 40 ms: message 0x360 gives gripper position
  - t = 60 ms: message 0x37F asks for a return package
  - t = 80 ms: message 0x350 again, and the cycle repeats
- Every 60 ms, when message 0x37F is sent, movement information can be returned as ARM input (sketched below).
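To make the cycle concrete, here is a minimal sketch of a transparent-mode listener in Python using the python-can library. The message IDs (0x350, 0x360, 0x37F) come from the slide above; the payload layout, the reply ID CARTESIAN_CMD_ID, and the SocketCAN setup are assumptions for illustration, not the actual Manus protocol details.

import can

CARTESIAN_CMD_ID = 0x371  # hypothetical ID for a Cartesian motion command

def transparent_mode_loop(bus: can.BusABC) -> None:
    """Listen to the 20 ms status stream and answer each 0x37F poll."""
    while True:
        msg = bus.recv(timeout=1.0)  # block until the next status packet
        if msg is None:
            continue
        if msg.arbitration_id == 0x350:
            pass  # parse ARM status and position from msg.data
        elif msg.arbitration_id == 0x360:
            pass  # parse gripper position from msg.data
        elif msg.arbitration_id == 0x37F:
            # Every 60 ms the ARM asks for input; reply with a motion command.
            velocities = bytes(8)  # assumed payload: per-axis velocities
            bus.send(can.Message(arbitration_id=CARTESIAN_CMD_ID,
                                 data=velocities, is_extended_id=False))

if __name__ == "__main__":
    # SocketCAN on Linux; the channel name depends on the CAN adapter used.
    with can.interface.Bus(channel="can0", interface="socketcan") as bus:
        transparent_mode_loop(bus)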
9. How should the ARM move?
Like humans do! Think: "I want the cell phone." Actions: see, reach, grasp. However, the intended users may not be capable of these actions, so we simplify.
10. Selection Process
Given what the user sees directly ahead of them, and assuming the desired object is unobstructed and within reach: zoom in on the cell phone! (A stand-in for this step is sketched below.)
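As a rough stand-in for the selection interface, assuming an OpenCV window in place of the system's actual UI: the user clicks the desired object in the shoulder-view image, and we record the (x, y) target pixel. The file name shoulder_view.png is hypothetical.

import cv2

target: tuple[int, int] | None = None

def on_click(event: int, x: int, y: int, flags: int, param) -> None:
    global target
    if event == cv2.EVENT_LBUTTONDOWN:
        target = (x, y)  # pixel the ARM should servo toward

frame = cv2.imread("shoulder_view.png")  # hypothetical saved camera frame
assert frame is not None, "could not load the shoulder-view image"
cv2.namedWindow("shoulder view")
cv2.setMouseCallback("shoulder view", on_click)
while target is None:
    cv2.imshow("shoulder view", frame)
    if cv2.waitKey(30) == 27:  # Esc aborts the selection
        break
cv2.destroyAllWindows()
print("selected target pixel:", target)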
11. Movement
- From the user's selection, we know the (x, y) position where the ARM should be.
- How do we move there? By using Phission and joint-encoder feedback to determine movement length, speed, and direction.
- Phission: we've trained on the color we want to track. While the center of the blob is not near the desired (x, y), move toward it (a sketch of this loop follows).
- Feedback: monitor ARM status and position.
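The loop described above amounts to simple proportional visual servoing. The sketch below is not the actual Phission code: get_blob_center() and send_velocity() stand in for the trained color tracker and the transparent-mode CAN command, and the gain, tolerance, and velocity sign conventions are illustrative assumptions.

from typing import Callable, Optional, Tuple

def servo_to_target(
    target: Tuple[int, int],
    get_blob_center: Callable[[], Optional[Tuple[int, int]]],
    send_velocity: Callable[[float, float], None],
    gain: float = 0.5,
    tolerance: int = 5,
) -> None:
    """Move the gripper until the tracked blob sits on the target pixel."""
    while True:
        center = get_blob_center()
        if center is None:
            send_velocity(0.0, 0.0)   # lost the blob: stop and wait
            continue
        ex, ey = target[0] - center[0], target[1] - center[1]
        if abs(ex) < tolerance and abs(ey) < tolerance:
            send_velocity(0.0, 0.0)   # close enough: done
            return
        # Proportional control: larger pixel error -> faster motion.
        send_velocity(gain * ex, gain * ey)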
12. Drop for Z
Depth information is deduced through simulated stereo vision. Two images are taken sequentially as the gripper moves along the y-axis, so the baseline B between them is known. The disparity between the images yields the depth Z, and the ARM moves close to the desired object.
Z = Bf / (xL - xR)
where f is the focal length and xL, xR are the object's horizontal image coordinates in the two views.
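A worked instance of the formula above, with made-up numbers: a 6 cm baseline, a focal length of 700 pixels, and an object that shifts 20 pixels between the two gripper-camera images.

def depth_from_disparity(baseline_m: float, focal_px: float,
                         x_left: float, x_right: float) -> float:
    """Z = B*f / (xL - xR); depth comes out in the baseline's units."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("disparity must be positive for an object in front")
    return baseline_m * focal_px / disparity

print(depth_from_disparity(0.06, 700.0, 330.0, 310.0))  # about 2.1 (meters)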
13. Future Work
- Distance-sensing laser
- Non-rigid stereo vision