Title: Vision-Guided Robot Position Control
1. Vision-Guided Robot Position Control
Tony Baumgartner, Brock Shepard, Jeff Clements, Norm Pond, Nicholas Vidovich
Advisors: Dr. Juliet Hurtig, Dr. J.D. Yoder
November 12, 2003
2. Problem Identification
- Goal: Develop a vision-guided robot positioning control system
- Current systems are limited to teach-and-repeat style operation
- On completion, the robot will automatically perform tasks specified by the software
3. Budget/Purchasing
- A budget of approximately $20,000 has been provided by Ohio Northern University
- Robot
- Gripper
- Desktop computer
- Two CCD (charge-coupled device) cameras
4. Background Information
- SCARA: Selectively Compliant Assembly Robot Arm
- Articulated
5. Vision Systems
- Eye-On-Hand Configuration
- External Stereo Camera Configuration
6. Prototype
- Final prototype will include:
- CCD camera(s)
- Desktop computer
- Robot Control Unit
- Robot
- Gripper
7. Specifications
- Robots are measured according to two specifications:
- Repeatability
- Absolute Accuracy
8. Object Recognition Algorithms
- Normalized Cross-Correlation
- Slower
- Less Sensitive To Light
- Shape-Based Matching
- Faster
- Similarity Measure
- The Sum of Absolute Differences
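To make the two similarity measures concrete, here is a minimal Python/NumPy sketch (not the project's implementation) of the sum of absolute differences and of normalized cross-correlation on small grayscale patches. The example arrays and the brightness shift are illustrative values chosen only to show that NCC is insensitive to a uniform lighting change while SAD is not.

```python
import numpy as np

def sad(patch, template):
    """Sum of absolute differences: lower means a better match."""
    return np.abs(patch.astype(float) - template.astype(float)).sum()

def ncc(patch, template):
    """Normalized cross-correlation: +1.0 is a perfect match, and the
    score is invariant to uniform brightness/contrast changes."""
    p = patch.astype(float) - patch.mean()
    t = template.astype(float) - template.mean()
    return (p * t).sum() / (np.sqrt((p * p).sum()) * np.sqrt((t * t).sum()))

template = np.array([[10, 20], [30, 40]], dtype=np.uint8)
brighter = template + 50                  # same pattern, shifted brightness

print(sad(template, template))            # 0.0 (exact match)
print(round(ncc(brighter, template), 3))  # 1.0 despite the brightness shift
```

The extra normalization arithmetic in `ncc` is also why it is the slower of the two measures, as the slide notes.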
9. Difference Algorithm Visuals
10. Difference Algorithm Visuals
11. Design Deliverables
- Purchasing (target dates)
  - Robot: 09/12/03 - 11/21/03
  - Gripper: 09/12/03 - 11/21/03
- Image Processing
  - Camera pan/tilt/zoom: 09/30/03 - 02/20/04
  - CAD: 11/03/03 - 01/09/04
  - Difference Algorithm: 11/04/03 - 12/19/03
  - Object Recognition: 01/09/04 - 02/02/04
  - User Interface: 02/02/04 - 02/20/04
- General
  - Controller Integration: 12/01/03 - 02/02/04
  - Gripper Implementation: 12/01/03 - 02/02/04
  - Testing: 03/08/04 - 04/01/04
12. Overall Block Diagram
13. Software Block Diagram
14. Robot Decision Matrix
- Rating scale: 0 = Unacceptable, 1 = Acceptable, 2 = Average, 3 = Good, 4 = Excellent, 5 = Best Case
15. Questions?
16. References
- [1] ABB Product Specification Sheet (2003).
- [2] Lin, C.T., Tsai, D.M. (2002). Fast normalized cross correlation for defect detection. Machine Vision. Yuan-Ze University, 1-5.
- [3] Phil Baratti, Robot Precision (personal communication, November 4, 2003).
- [4] Steger, C. (2001). Similarity measures for occlusion, clutter, and illumination invariant object recognition. In B. Radig and S. Florczyk (eds), Mustererkennung 2001, Springer, München, pp. 145-154.
- [5] Steger, C., Ulrich, M. (2002). Performance Evaluation of 2D Object Recognition Techniques. Technical Report, Technische Universität München, 1-15.
- [6] Robots and Manufacturing Automation, pp. 220-222.
- [7] http://www.prip.tuwien.ac.at/Research/RobotVision/vs.html
17. Software Flow Description
- Take Initial Picture
  - Initialize cameras
  - Capture image
  - Store image to hard disk or in RAM (will depend on speed)
- Take Current Picture
  - Start current image capture loop
  - Capture image and store in memory
18. Software Flow Description
- Compare Current Picture With Initial
  - Use the difference algorithm to compare the initial and current images; this yields a bitmap of the pixels that have changed, based on a specified error factor
  - Store the difference image in memory
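The comparison step above can be sketched in a few lines of Python/NumPy. This is an assumed implementation, not the project's code: `error_factor` stands in for the slide's "specified error factor", and the 4x4 test images are illustrative only.

```python
import numpy as np

def difference_bitmap(initial, current, error_factor=25):
    """Return a Boolean bitmap marking pixels whose grayscale value
    changed by more than `error_factor` between the two images."""
    # Cast to int first so the subtraction cannot wrap around in uint8.
    diff = np.abs(current.astype(int) - initial.astype(int))
    return diff > error_factor

initial = np.zeros((4, 4), dtype=np.uint8)   # empty scene
current = initial.copy()
current[1:3, 1:3] = 200                      # an "object" entering the scene
bitmap = difference_bitmap(initial, current)
print(bitmap.sum())                          # 4 changed pixels
```

Storing `bitmap` rather than the raw images keeps the per-loop memory footprint small, which matters given the slide's note about disk-versus-RAM speed.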
19. Software Flow Description
- Ready to Manipulate Object?
  - Check the difference image to see whether an object has entered the area, using a Boolean flag to keep track
  - If an object is already present, check whether it has stopped moving
  - If it is done moving, act on the object; otherwise return to the Take Current Picture block
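The decision logic above can be written as a small state-update function. This is a hedged sketch under assumptions the slides do not state: the Boolean flag is passed in explicitly, motion is judged by comparing consecutive difference bitmaps, and the function name and `motion_tol` parameter are invented for illustration.

```python
import numpy as np

def object_status(object_present, prev_bitmap, curr_bitmap, motion_tol=0):
    """Track whether an object is in the scene and whether it is ready
    to be acted on.  `object_present` is the Boolean carried between
    loop iterations; the bitmaps are changed-pixel masks from the
    difference algorithm.  Returns (object_present, ready_to_act)."""
    in_scene = curr_bitmap.sum() > 0
    if not object_present:
        # Object newly detected (or scene still empty): not ready yet.
        return in_scene, False
    # Object was already present: it has stopped moving if consecutive
    # difference bitmaps agree to within the tolerance.
    moved = (prev_bitmap != curr_bitmap).sum() > motion_tol
    return in_scene, in_scene and not moved

a = np.zeros((4, 4), dtype=bool)
a[1, 1] = True
b = a.copy()
print(object_status(True, a, b))   # (True, True): present and stationary
print(object_status(False, a, b))  # (True, False): just arrived, keep waiting
```

When `ready_to_act` is False, the loop simply falls back to the Take Current Picture block, exactly as the flow describes.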
20. Software Flow Description
- Identify and Find Object Coordinates
  - Take pictures of the scene using both cameras focused on the object
  - Compare the images from the two cameras using the Stereo Vision Algorithm to determine the object type and relevant points relative to the position of the robot arm
- Send Coordinates and Instructions to Robot Controller
  - Communicate with the robot controller, sending incremental instructions based on the object type and coordinates
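The slides do not specify the stereo vision math, but for an idealized parallel-axis camera pair the depth of a matched point follows from its disparity, Z = f·B/d. The sketch below assumes that simplified pinhole model; the focal length, baseline, and pixel coordinates are hypothetical numbers, not measurements from the project's cameras.

```python
def stereo_depth(x_left, x_right, focal_px, baseline_m):
    """Depth from disparity for an idealized parallel-axis stereo pair:
    Z = f * B / d, where d = x_left - x_right (in pixels), f is the
    focal length in pixels, and B is the camera baseline in metres."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("matched point must have positive disparity")
    return focal_px * baseline_m / disparity

# Hypothetical values: 800 px focal length, 10 cm baseline, 40 px disparity.
z = stereo_depth(420, 380, focal_px=800, baseline_m=0.10)
print(z)   # 2.0 metres
```

A real system would first rectify the two CCD images and calibrate the cameras against the robot's base frame so the recovered point lands in arm coordinates.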
21. Software Flow Description
- Use Stereo Vision Algorithm to Verify Desired Actions
  - Take pictures of the scene using both cameras focused on the object
  - Compare the images from the two cameras using the Stereo Vision Algorithm to determine whether the robot arm has moved correctly
  - After adjusting for any errors by the robot, send further coordinates and instructions to the robot controller until it has performed the correct task
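The measure-correct-repeat loop described above can be sketched as a simple closed-loop controller. Everything here is an assumption for illustration: `read_position` stands in for the stereo-vision position estimate, `send_increment` for the robot controller interface, and the tolerance and gain are made-up numbers.

```python
def move_until_correct(target, read_position, send_increment,
                       tol=1.0, max_steps=50):
    """Closed-loop positioning: repeatedly measure the arm's position
    (via the vision system) and send incremental corrections to the
    controller until every axis is within `tol` of the target."""
    for _ in range(max_steps):
        pos = read_position()                 # stereo-vision estimate
        error = [t - p for t, p in zip(target, pos)]
        if all(abs(e) <= tol for e in error):
            return True                       # task performed correctly
        send_increment(error)                 # incremental correction
    return False                              # gave up: flag for review

# Toy simulation of an imperfect robot that only applies half of each
# commanded correction, so several iterations are needed to converge.
state = [0.0, 0.0, 0.0]
def read_position():
    return list(state)
def send_increment(err):
    for i, e in enumerate(err):
        state[i] += 0.5 * e

ok = move_until_correct([10.0, 5.0, 2.0], read_position, send_increment)
print(ok)   # True
```

Because each pass re-measures the scene, the loop compensates for the robot's limited absolute accuracy using its much better repeatability, which is why the flow ends with vision-based verification rather than a single open-loop move.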