Robot Vision

Transcript and Presenter's Notes

1
Robot Vision
  • Today: Reactive Control & Vision
  • Next Time: Localization & Navigation
  • (or Where am I and How do I get there?)

Camera-carrying robot enters Great Pyramid
2
Spectrum of Control
Teleoperation: Human Control
Autonomous (AI) Control
Shared Human/Robot Control
Remote-Controlled Rats
3
Reactive/Behavior-Based Control
  • Ignores world models
  • The world is its own best model
  • Tightly couples perceptions to actions
  • No intervening abstract representations
  • Primitive Behaviors are used as building blocks
  • Individual behaviors can be made up of primitive
    behaviors
  • Reactive: no memory
  • Behavior-Based: Short-Term Memory (STM)
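
A primitive behavior couples perception directly to action. A minimal sketch in IC-style C, assuming IR rangers on analog ports 0 and 1 and drive motors on ports 0 and 1 (the ports, threshold, and powers are assumptions):

/* Primitive avoid behavior: sensor readings map straight to motor
   commands, with no world model in between. */
void avoid()
{
    int left = analog(0);     /* left IR ranger: larger = closer obstacle */
    int right = analog(1);    /* right IR ranger */

    if (left > right) {
        motor(0, 50);         /* obstacle closer on the left: arc right */
        motor(1, -50);
    } else {
        motor(0, -50);        /* obstacle closer on the right: arc left */
        motor(1, 50);
    }
}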

4
Reactive/Behavior-Based Control Design
  • Design Considerations
  • What are the primitive behaviors?
  • What are the individual behaviors?
  • Individual behaviors can be made up of primitive
    and other individual behaviors
  • How are behaviors grounded to sensors and
    actuators?
  • How are these behaviors effectively coordinated?
  • If more than one behavior is appropriate for the
    situation, how does the robot choose which to
    take?

5
Design for robot soccer
  • What primitive behaviors would you program?
  • What individual behaviors?
  • What situations does the robot need to recognize?
  • If the pass behavior is active and the shoot
    behavior is active, how does it choose?

6
Situated Activity Design
  • The robot's actions are based on the situations in
    which it finds itself
  • Robot perception is characterized by recognizing
    what situation it is in and choosing an
    appropriate action

7
Implementing Behaviors
  • Schema: knowledge + process
  • Perceptual Schema: interpretation of sensory data
  • Releaser: instantiates the motor schema
  • Motor Schema: actions to take
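
One way to picture this decomposition in code (a hedged sketch, not prescribed by the slides; the fly-snapping names, port number, and threshold are illustrative):

/* Perceptual schema interprets sensory data; the releaser decides
   whether to instantiate the motor schema; the motor schema acts. */
int fly_seen;    /* percept produced by the perceptual schema */
int fly_x;       /* rough bearing to the fly */

void perceptual_schema()      /* interpretation of sensory data */
{
    int raw = analog(2);      /* assumed: vision sensor on analog port 2 */
    fly_seen = (raw > 100);   /* assumed detection threshold */
    fly_x = raw - 128;        /* assumed: signed bearing around center */
}

void motor_schema()           /* actions to take */
{
    motor(0, fly_x / 4);      /* assumed: turn toward the fly; scaled to
                                 stay within the -100..100 power range */
}

void feeding_behavior()
{
    perceptual_schema();
    if (fly_seen)             /* the releaser instantiates the motor schema */
        motor_schema();
}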

8
Schema for Toad Feeding Behavior
9
Design of Behaviors represented by a State
Transition Table
q ∈ K: the set of states (behaviors)
σ ∈ Σ: the set of releasers
δ: the transition function
s: the state the robot starts in
q ∈ F: the set of terminating states
Trash Pick-up Example
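
A hedged sketch of how the transition function δ could be coded for the trash pick-up example (the state and releaser names are assumptions, not from the slides):

/* Behaviors are the states in K; s = WANDER is the start state and
   DONE is the terminating state (q in F). */
int WANDER = 0;
int APPROACH = 1;
int HOME = 2;
int DONE = 3;

int delta(int q, int see_trash, int have_trash, int at_can)
{
    if (q == WANDER && see_trash)    return APPROACH;  /* releaser fires */
    if (q == APPROACH && have_trash) return HOME;
    if (q == HOME && at_can)         return DONE;
    return q;    /* no releaser fired: stay in the current behavior */
}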
10
Visual Representation as a Finite State Automaton
11
Cooperative Coordination
  • Behavioral Fusion
  • Requires the ability to use the outputs of more
    than one behavior at a time
  • Consider what happens when a toad sees two flies

Behavior Fusion via vector summation
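
A hedged sketch of fusion by vector summation (the variable names are illustrative): each active behavior contributes a response vector, and the fused command is their component-wise sum.

float fly1_dx, fly1_dy;    /* vector toward the first fly */
float fly2_dx, fly2_dy;    /* vector toward the second fly */
float cmd_dx, cmd_dy;      /* fused command sent to the motors */

void fuse_behaviors()
{
    cmd_dx = fly1_dx + fly2_dx;   /* summed vectors aim the toad at a */
    cmd_dy = fly1_dy + fly2_dy;   /* point between the two flies */
}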
12
Competitive Coordination
  • Action Selection Method
  • Behaviors compete using an activation level
  • The response associated with the behavior with
    the highest activation level wins
  • Activation level is determined by attention
    (sensors) and intention (goals)
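
A hedged sketch of action selection (the additive combination of attention and intention is an assumption; other combining rules are possible):

/* The behavior with the highest activation level wins. */
int winner(int n, float attention[], float intention[])
{
    int i;
    int best = 0;
    for (i = 1; i < n; i++) {
        if (attention[i] + intention[i] >
            attention[best] + intention[best])
            best = i;
    }
    return best;    /* index of the behavior whose response is executed */
}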

13
Competitive Coordination
  • Suppression Network Method
  • Response is determined by a fixed prioritization
    in which a strict behavioral dominance hierarchy
    exists.
  • Higher priority behaviors can inhibit or suppress
    lower priority behaviors.

14
Subsumption Architecture
  • A suppression network architecture built in
    layers
  • Each layer gives the system a set of pre-wired
    behaviors
  • Layers reflect a hierarchy of intelligence.
  • Lower layers are basic survival functions
    (obstacle avoidance)
  • Higher layers are more goal directed (navigation)
  • The layers operate asynchronously (Multi-tasking)
  • Lower layers can override the output from
    behaviors in the next higher level
  • Rank ordering
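
A hedged two-layer sketch in the spirit of a suppression network (port numbers, powers, and the threshold are assumptions):

/* Layer 0 (avoid) is a basic survival function; layer 1 (navigate) is
   goal directed. The lower layer overrides the next higher level. */
void navigate_step()          /* higher layer: head toward the goal */
{
    motor(0, 50);
    motor(1, 50);
}

void avoid_step()             /* lower layer: survival reflex */
{
    motor(0, -50);            /* back off and turn away */
    motor(1, 50);
}

void arbitrate()
{
    if (analog(0) > 200)      /* obstacle detected on assumed IR port 0 */
        avoid_step();         /* suppresses the navigation output */
    else
        navigate_step();
}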

15
Foraging Example
16
More Complex Example: Robot Follows a Corridor
17
Using Multiple Behaviors can require the Robot to
Multi-task
  • Multi-tasking is having more than one computing
    process running in parallel.
  • True parallel processing requires multiple CPUs.
  • IC functions can be run as processes operating in
    parallel. The computer's processor is actually
    shared among the active processes.
  • main() is always an active process
  • Each process, in turn, gets a slice of processing
    time (5 ms)
  • Each process gets its own default program stack
    of 256 bytes
  • A process, once started, continues until it has
    received enough processing time to finish (or
    until it is killed by another process)
  • Global variables are used for inter-process
    communication

18
IC Functions vs. Processes
  • Functions are called sequentially
  • Processes can be run simultaneously
  • start_process(function-call)
  • returns a process-id
  • processes halt when the function exits or the
    parent process exits
  • processes can be halted by using
    kill_process(process_id)
  • hog_processor() allows a process to take over
    the CPU for an additional 250 milliseconds,
    cancelled only if the process finishes or defers
  • defer() causes a process to give up the rest of
    its time slice until the next time it is scheduled
  • More info: http://www.newtonlabs.com/ic/ic_11.html#SEC77
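
A minimal sketch putting these calls together (the sensor port, threshold, and alarm behavior are assumptions):

int danger = 0;               /* global used for inter-process communication */

void alarm_task()
{
    while (1) {
        if (danger)
            beep();
        defer();              /* give up the rest of this time slice */
    }
}

void main()
{
    int pid = start_process(alarm_task());
    while (!stop_button())
        danger = (analog(0) > 200);   /* assumed: ranger on analog port 0 */
    kill_process(pid);        /* halt the alarm process before quitting */
}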

19
Example.ic
  • The robot looks left and right
  • If it sees RED to one side it turns to face it
  • If it sees RED ahead it beeps
  • If the stop button is pressed it plays a song and
    quits
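
The transcript does not include the code of Example.ic itself; the following is a hedged reconstruction of the behavior described above. track_red() is assumed by analogy with the track_blue()/track_orange() routines on the later slides, and the motor ports, powers, timings, and stand-in song are assumptions.

use "cmucamlib.ic"

void play_song()              /* stand-in for a real tune */
{
    beep(); beep(); beep();
}

void pivot(float seconds, int dir)    /* dir: 1 = right, -1 = left */
{
    motor(0, dir * 50);
    motor(1, -dir * 50);
    sleep(seconds);
    ao();                     /* all motors off */
}

void main()
{
    int found;
    init_camera();
    while (!stop_button()) {
        found = 0;
        pivot(0.3, -1);               /* look left */
        if (track_red() > 4)
            found = 1;                /* RED to one side: stay facing it */
        if (!found) {
            pivot(0.6, 1);            /* look right */
            if (track_red() > 4)
                found = 1;
        }
        if (!found) {
            pivot(0.3, -1);           /* back to center */
            if (track_red() > 4)
                beep();               /* RED ahead: beep */
        }
    }
    play_song();                      /* Stop pressed: play a song and quit */
}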

20
Reactive: Good & Bad
  • Good: Works with the Open World Assumption
  • Good: Provides a timely response in a dynamic
    environment that is difficult to characterize
    and contains a lot of uncertainty
  • Bad: Unpredictable
  • Bad: Low-level intelligence
  • Bad: Cannot manage tasks that require LTM or
    planning, e.g., tasks requiring localization or
    order-dependent steps

21
Computer Vision
  • Uses the electromagnetic spectrum to produce an
    image.
  • Visible light, x-rays, thermal, infrared

22
Representation
  • Image: a picture-like format in which there is a
    direct physical correspondence to the scene being
    viewed
  • Implies there are multiple readings in a grid
  • Pixel: picture element
  • The measure depends on the type of spectrum being
    used
  • Image function: converts a signal to a pixel value

23
CCD Cameras
  • Charge-Coupled Device: detects visible light
  • Light falls on an array of metal-oxide-semiconductor
    (MOS) capacitors
  • Line transfer or frame transfer
  • A/D conversion
  • Slow frame rate
  • Frame buffers
  • Framegrabber

24
Representations
  • Grayscale
  • 8-bit
  • 256 discrete gray values
  • 0 = black, 255 = white

25
Representations
  • Different methods for representing color
  • RGB space: red, green, blue
  • HSI space: hue, saturation, and intensity
  • Hue is the dominant wavelength
  • Saturation is the lack of whiteness
  • Intensity is the quantity of light
  • Nonlinear transformation between RGB and HSI

26
Comparison of Region Segmentation
27
RGB
  • Color Space
  • 24-bit color (8 bits per color)

28
RGB Representations
  • Interleaved
  • RGB values stored together
  • Red: image[row][col][0]
  • Green: image[row][col][1]
  • Blue: image[row][col][2]
  • Separate
  • RGB values stored as separate 2D arrays
  • Red: image_red[row][col]
  • Green: image_green[row][col]
  • Blue: image_blue[row][col]
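
The two layouts differ only in how a channel is indexed; a short C-style sketch (the 60 x 80 resolution is an assumption):

int image[60][80][3];         /* interleaved: R, G, B stored together */
int image_red[60][80];        /* separate: one 2D array per channel */
int image_green[60][80];
int image_blue[60][80];

int red_interleaved(int row, int col)
{
    return image[row][col][0];    /* index 0 selects the red value */
}

int red_separate(int row, int col)
{
    return image_red[row][col];
}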

29
Region Segmentation
  • Identifying a region in an image with a
    particular color
  • Thresholding
  • Binary image
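
A hedged sketch of thresholding an interleaved RGB image into a binary image (the resolution and threshold values are assumptions):

int image[60][80][3];     /* input image, interleaved RGB */
int binary[60][80];       /* output: 1 marks pixels in the red region */

void segment_red()
{
    int r, c;
    for (r = 0; r < 60; r++) {
        for (c = 0; c < 80; c++) {
            if (image[r][c][0] > 150 &&    /* red high */
                image[r][c][1] < 100 &&    /* green low */
                image[r][c][2] < 100)      /* blue low */
                binary[r][c] = 1;
            else
                binary[r][c] = 0;
        }
    }
}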

30
Region Segmentation
31
cmucamlib Routines
  • To use the CMU camera routines, put
    use "cmucamlib.ic" at the top of your file
  • Call init_camera() to initialize the camera before
    any other camera calls -- it will beep and complain
    if the HB cannot talk to the camera (check the
    dongle switch)
  • Use clamp_camera_yuv() to automatically set the
    camera for the current lighting conditions. The
    camera should be pointed at a white surface when
    this call is made (it waits for the start button
    to be pressed). It takes 15 seconds for this
    function to complete!

32
cmucamlib Routines (cont.)
  • Call track_blue() or track_orange() to check
    for color blobs that the CMUcam can see. These
    functions return 0 if they find no color blob, or
    the confidence of the detected blob. A good
    confidence is 80 and up; a confidence of 4 or 5
    is poor.

33
cmucamlib Routines (cont.)
  • The track_color information is stored in globals:
  • track_size stores the approximate number of
    matching pixels in the blob
  • track_x stores the pixel x coordinate of the
    centroid of the color blob
  • track_y stores the pixel y coordinate of the
    color blob (note: (0,0) is the center, (40,80) is
    the upper right, and (-40,-80) is the lower left)
  • track_area stores the size of the bounding
    rectangle of the color blob
  • track_confidence stores the confidence for seeing
    the blob
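
These globals support simple visual servoing; a hedged sketch that steers toward the blob centroid using track_x (motor ports, powers, and the confidence floor are assumptions):

void center_on_blob()
{
    if (track_blue() > 10) {     /* the tracking call updates the globals */
        if (track_x > 5) {
            motor(0, 40);        /* blob right of center: arc right */
            motor(1, 20);
        } else if (track_x < -5) {
            motor(0, 20);        /* blob left of center: arc left */
            motor(1, 40);
        } else {
            motor(0, 30);        /* blob centered: drive straight */
            motor(1, 30);
        }
    } else {
        ao();                    /* no blob seen: stop */
    }
}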

34
cmucamlib Routines (cont.)
  • For experts: use trackRaw() to specify a
    particular color for tracking. Returns 0 if no
    such blob is found, -1 if there is a
    communication error, or otherwise the confidence.
    Also check out setWin()
  • More details and low-level functions are given in
    the comments at the beginning of cmucamlib.ic and
    cmucam3.ic

35
Example: cmucamlib-demo.ic

/* demonstrate color blob sensing for poof balls and blue paper */
use "cmucamlib.ic"

void main()
{
    init_camera();        /* initialize the camera in YUV mode */
    clamp_camera_yuv();   /* clamp camera white balance in YUV mode */

    while (!stop_button()) {        /* hold down Stop for a long time */
        if (track_blue() > 4) {     /* you could make this a bigger
                                       number, like 80 for example */
            printf("blue found %d\n", track_confidence);
        } else if (track_orange() > 4) {
            printf("orange found %d\n", track_confidence);
        } else {
            beep();
            printf("nothing...\n");
        }
    }  /* end while() */
}  /* end main() */
36
CMUcamGUI