Transcript and Presenter's Notes

Title: Francisco Blanes, Ginés Benet, Pascual Pérez, José E. Simó


1
MAP BUILDING IN AN AUTONOMOUS ROBOT USING
INFRARED SENSORS
  • Francisco Blanes, Ginés Benet, Pascual Pérez,
    José E. Simó
  • Dep. de Informática de Sistemas y Computadores.
    Universidad Politécnica de Valencia (Spain).
  • Apdo. 22012, 46022 Valencia, Spain
  • {pblanes, gbenet, jsimo, pperez}@disca.upv.es

2
Project description
Mobile robot prototype
TAP97-1164-C03-03 (Mobile robot sensing)
Intelligent sensing, data fusion, O.S. kernel,
real-time operation, reactive response
GV-C-CN-05-058-96 (Distributed systems)
TAP98-0333-C03-02 (Mobile robot sensing)
  • Group background
  • Real-time
  • AI
  • Intelligent control

Architecture definition
Objectives
To provide a general platform for research in
distributed architectures for real-time control
and mobile computing.
The platform is designed to operate autonomously
in industrial environments in a coordinated,
controlled and supervised way.
3
Prototype
4
Architecture
5
Main robot controller
  • Embedded PC-board
  • Network interface by radio (2 Mb/s)
  • CAN bus manager unit
  • Supervises and controls the whole robot.
  • Operating system: RT-Linux or Windows NT.
  • Software: sensor data fusion and world-modeling
    tasks

6
Motion controller
  • 80C592-based board
  • Controls two independent wheels that rotate
    around the same axis
  • Optical encoders are used to obtain data about
    the estimated position of the robot (odometry).
  • Two HCTL1100 circuits manage the power drivers
    for each motor.
  • Speed control (PID)
  • Position control (trapezoidal)
  • Different modes of operation (Speed control,
    position control and autonomous trajectory
    tracking).

Block diagram: the 80C592 board, on the internal
bus, drives two HCTL1100 boards, each feeding an
LMD18200T driver and its motor; CAN bus interface.
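As a sketch of the speed loop named above, here is a minimal discrete PID controller in Python; the slides name PID speed control but give no gains, so `kp`/`ki`/`kd` and the sample period `dt` are illustrative assumptions.

```python
# Minimal sketch of a discrete PID speed loop like the one the motion
# controller runs. Gains and sample period are illustrative assumptions,
# not values from the slides.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        """One control step: return the command for the motor driver."""
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

On the real hardware this loop runs inside the HCTL1100, with the 80C592 only supplying setpoints; the Python form is just to make the control law concrete.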
7
Sonar System
  • 80C592 based board
  • 2 x 8-bit A/D converter
  • CAN bus subsystem
  • Two sonar heads with different main lobe
    widths
  • Controls the angular position of a rotating
    transducer by means of a stepper motor.
  • Generates the ultrasonic waveforms to be supplied
    to the transducer.
  • 4 Mb memory to process the echoes received from
    surrounding objects.
  • Different modes of operation
  • Distance measurement
  • Local map building.
  • Communicates the raw data to the main processor.

Sonar module prototype (main lobe widths: 10° and 30°)
8
IR Sensors
  • IR ring with 16 sensors
  • Each sensor is composed of two LEDs and one
    photodiode
  • The sensors are grouped in pairs
  • Only 16 milliseconds for a complete sequential
    scan

9
IR Sensors
  • Sensor control based on 80C592 micro-controller
  • Data sent through the CAN bus in 8-bit format
    (4 sensors in each message) or 10-bit (1 sensor
    in each message)
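The two CAN payload formats described above can be sketched as follows; the byte layout on the wire is an assumption, since the slides only give the counts (four 8-bit readings per message, one 10-bit reading per message).

```python
import struct

# Sketch of the two CAN payload formats. The byte layout is an
# assumption: the slides only state that the 8-bit format carries
# 4 sensor readings per message and the 10-bit format one.

def pack_8bit(readings):
    """Pack four 8-bit IR readings into a 4-byte CAN data field."""
    assert len(readings) == 4 and all(0 <= r <= 0xFF for r in readings)
    return bytes(readings)

def pack_10bit(reading):
    """Pack one 10-bit IR reading into two bytes (big-endian assumed)."""
    assert 0 <= reading <= 0x3FF
    return struct.pack(">H", reading)
```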

10
Model of the IR sensor
The sensor output can be modelled using the
following equation, as a function of the distance
x and the incidence angle θ with the target
surface
  • α and β are the parameters of the model.
  • The parameter α includes the radiant intensity
    of the IR emitters, the spectral sensitivity of
    the photodiode, the gain of the amplifier and the
    reflectance coefficient of the target ρi
  • The parameter β is the offset due to the ambient
    light
  • α0 is constant, and β can be obtained before
    every reading.
  • Thus, the only parameter that characterizes an
    object is its reflectance coefficient ρi
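The equation itself did not survive the transcript. A reconstruction consistent with the parameter descriptions on this slide (inverse-square fall-off with distance, cosine dependence on the incidence angle), offered as an assumption rather than the slide's verbatim formula:

```latex
s(x,\theta) = \frac{\alpha}{x^{2}}\,\cos\theta + \beta,
\qquad \alpha = \alpha_{0}\,\rho_{i}
```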

Plot of the model for different values of ρi.
Normal values of ρi.
11
Model of the IR sensor
Response to different Canson coloured cards in
controlled tests
  • The response depends mainly on the texture of
    the material rather than on its colour
  • Tests performed in the real environment showed
    typical values of ρi varying from 0.7 to 0.9

12
Incidence angle estimation
  • Incidence angle is unknown a priori.
  • If zero incidence angle is assumed, readings are
    overestimated by a factor of cos θ.
  • The value of θ can be obtained from the readings
    of a pair of transducers, as depicted in the
    figure.
  • From the figure, the apparent angle θ can be
    obtained as
  • tan θ = (d1 - d2) / L
  • The exact value of θ can be derived by solving
    this equation
  • Instead, an approximate value can be obtained
    using
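The pairwise angle estimate above can be sketched directly; the baseline L = 0.1 m is an illustrative value (not from the slides), and the ±45° validity bound comes from the sensor-fusion slide further on.

```python
import math

# Sketch of the pairwise incidence-angle estimate: two sensors of a
# pair, a baseline L apart, read distances d1 and d2 to the same
# surface, and the apparent angle follows from tan(theta) = (d1 - d2)/L.
# L = 0.1 m is an illustrative baseline, not a value from the slides.

def incidence_angle(d1, d2, L=0.1):
    return math.atan2(d1 - d2, L)

def is_valid(theta):
    # Per the sensor-fusion slide: only |theta| <= 45 deg is accepted.
    return abs(theta) <= math.radians(45.0)
```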

13
Distance treatment
  • The distance estimate D could be used for grid
    map building but ...
  • The estimate suffers uncertainty because of the
    colour and texture of the target
  • We can model this uncertainty with a range of ρi
    values
  • Common objects in the robot environment vary
    from ρi,max = 0.9 to ρi,min = 0.7 (data obtained
    from tests)
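Turning one reading into a distance interval over the 0.7–0.9 reflectance range can be sketched as below. The inverse-square, cosine-weighted response used here is an assumed form consistent with the model parameters described on slide 10, and `A0`/`BETA` are illustrative calibration constants, not slide values.

```python
import math

# Sketch: map one IR reading s to a (d_min, d_mean, d_max) interval by
# sweeping the reflectance over the range observed in tests (0.7-0.9).
# Assumed model form: s = A0 * rho * cos(theta) / x**2 + BETA.

A0, BETA = 1.0, 0.0  # illustrative calibration constants

def distance(s, rho, theta=0.0):
    """Invert the assumed sensor model for a given reflectance rho."""
    return math.sqrt(A0 * rho * math.cos(theta) / (s - BETA))

def distance_interval(s, rho_min=0.7, rho_max=0.9, theta=0.0):
    """Return (d_min, d_mean, d_max) spanning the reflectance uncertainty."""
    d_a = distance(s, rho_min, theta)
    d_b = distance(s, rho_max, theta)
    d_min, d_max = sorted((d_a, d_b))
    return d_min, (d_min + d_max) / 2.0, d_max
```

The interval (d_min, d_max) is what defines the sector of occupied cells on the next slide.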

14
Distance treatment
  • ρmax produces a dmin estimate of the distance,
    ρmin a dmax, and hence a mean distance between
    them
  • These distances define a sector of occupied cells

15
Sensor fusion
  • Distance estimates are fused in pairs from the
    same sensor couple
  • Distance measurements are valid only if the
    incidence angle θ is between -45° and 45°; beyond
    these values the error in the angle estimate
    implies distance errors bigger than 1.8 cm
  • If the distance value is valid, then dmax and
    dmin define a sector with a set of cells whose
    occupancy values are calculated linearly

16
Map Building
  • Local occupancy values of the cone are fused
    with those of the previous map
  • A Bayesian approach has been used in this fusion
    process
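A minimal sketch of the per-cell Bayesian update named on this slide: fuse the occupancy probability suggested by the new IR cone (p_sensor) with the cell's previous map value (p_prior). The slides only name "p_occ_new" and a Bayesian approach, so this independent-evidence combination rule is an assumption about the exact formula used.

```python
# Assumed Bayesian fusion rule for binary occupancy evidence:
# the slides give only the name "p_occ_new", not the formula.

def p_occ_new(p_sensor, p_prior):
    num = p_sensor * p_prior
    return num / (num + (1.0 - p_sensor) * (1.0 - p_prior))
```

With p_sensor = 0.5 the cell is unchanged; values above 0.5 push the cell toward occupied, values below toward free.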

p_occ_new
17
Tests and Results
  • Tests were conducted in a real environment
    composed of brick walls, wooden doors and metal
    doors.
  • The goal of the tests was to locate these
    objects in the scene
  • The control system is based on threads and CAN
    communication to meet real-time constraints
  • The acquisition and fusion loop has a period of
    150 milliseconds
  • The average speed of the robot is around 0.25 m/s

18
Tests and Results
  • Test 1: short corridor with metal cylinders

Scenario 1
cylinders
Resulting grid map 20x20 mm cell size
19
Tests and Results
  • Test 2: long corridor with doors

Odometric error: 90° angle
Scenario 2
Resulting grid map 40x40 mm cell size
20
Tests and Results
  • Test 3: end corridor with autonomous robot
    exploration

Resulting grid maps 20x20 mm cell size
75cm IR range
55cm IR range
21
Future Work
  • New robot version
  • Improved sensor configuration: the distance
    between sensors has been increased, so a better
    incidence-angle estimate can be obtained
  • Ultrasonic and infrared sensor fusion could be
    used to improve distance estimation and to obtain
    the reflectance value

22
Conclusions
  • A new IR sensor with a large range has been
    presented
  • The influence of the angle of incidence on the
    measurement is solved using pairs of estimates
  • The uncertainty inherent to the colour and
    texture of objects is managed during the sensor
    fusion
  • Good-quality grid maps have been obtained using
    the sensor fusion approach