1
Robust Range Estimation Using Acoustic and
Multimodal Sensing
  • Lewis Girod and Deborah Estrin
  • 24 April 2001

2
Goal: A Fine-grained, Ad-hoc Deployed
Localization System
  • Sub-cm scale localization that is
  • Independent of environment
  • (foliage, clutter, obstructed sky)
  • Self-configuring/self-calibrating
  • (active calibration)
  • Minimal requirements on deployment
  • (at most a few rules of thumb)
  • Implemented by low-power wireless nodes
  • (small, inexpensive, distributed)

3
Why?
  • Fine-grained localization enables many
    applications for sensor networks
  • Routing
  • Energy expenditure is a function of path
  • Location is a natural namespace for physically
    motivated applications
  • Collaborative Sensing
  • Location is a natural context for defining
    relations among sensors
  • e.g. relative position of two microphones and a
    camera
  • UI
  • Define task in context of location
  • Physical UI, e.g. pointing, moving objects

4
Motivating Example: Habitat Monitoring
  • Wildlife habitat
  • Instrumented with cameras and microphones
  • Task is to detect presence of bird and photograph
    it
  • One approach
  • Use microphones to detect birdcall and estimate
    location
  • Then, select a camera that has the bird in field
    of view

5
Habitat Monitoring (cont'd)
  • Problem: Starting with no knowledge, acoustically
    localizing the bird is very difficult
  • Simultaneously estimate sensor positions, bird
    position and source signal
  • Assuming it's possible at all, convergence will
    be slow
  • More tractable if we start with knowledge of the
    positions and orientations of the sensors BUT
  • We don't want to measure positions
  • Want to be resilient to sensors moving
    occasionally

6
Habitat Monitoring (cont'd)
  • Fine-grained, ad-hoc deployable, cooperative
    localization
  • Sensors determine relative positions during
    initialization phase
  • Periodic verification accounts for environmental
    changes
  • Enables sensor collaboration
  • Acoustic sensors (synchronized by radio)
    collaborate to estimate the bird's position
  • Position identified as being in FOV of a camera,
    which can then capture an image.

7
Our focus
  • Thesis (stated below)
  • We are validating this idea through
    implementation
  • Building a testbed for multimodal localization
    indoors
  • Experimenting with acoustics and cameras

Any individual mode of sensing used for
localization (e.g. acoustic) will suffer from
unrecoverable ambiguities and undetectable
errors. In many cases, these errors are
persistent and are not readily eliminated
statistically. A more robust solution to this
problem is to cross-validate sensor data with
data gathered from alternate perspectives and
from other modes of sensing.
8
Acoustic Ranging
  • Our initial experiments have focused on
    developing an active acoustic ranging system
    based on measurement of time of flight of sound.
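
To make the time-of-flight idea concrete, here is a minimal sketch (not the authors' code; the function name and the nominal sound speed are illustrative assumptions) that converts a measured propagation delay into a range:

    # Minimal time-of-flight sketch (illustrative; assumes the node clocks
    # are already synchronized over the radio so the emission time is known).
    SPEED_OF_SOUND_M_S = 343.0  # nominal value near 20 C; refined later

    def range_from_tof(t_emit_s, t_detect_s, c=SPEED_OF_SOUND_M_S):
        """Convert an acoustic time of flight into a distance in meters."""
        tof = t_detect_s - t_emit_s
        if tof < 0:
            raise ValueError("detection precedes emission")
        return tof * c

    # Example: a 23.3 ms flight time corresponds to roughly 8 m.
    print(range_from_tof(0.0, 0.0233))  # ~7.99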

[Figure: ranging hardware, showing on each node a radio, CPU, speaker (Kingstate KDS-27008), and microphone]
9
Signaling and Detection
  • Wideband ranging signal
  • 511 bit M-sequence
  • Modulated using BPSK, 12 kHz
  • Detected by matched filter
  • Earliest peak in output of sliding correlator
  • The M-sequence's autocorrelation properties result in
    good process gain
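
A rough sketch of this signaling chain follows (the LFSR taps, sample rate, and samples per chip are plausible assumptions, not necessarily the exact parameters used here):

    import numpy as np

    def msequence(taps=(9, 5), length=511):
        """511-chip maximal-length sequence from a 9-bit LFSR (x^9 + x^5 + 1)."""
        state = [1] * 9
        chips = []
        for _ in range(length):
            chips.append(state[-1])
            feedback = state[taps[0] - 1] ^ state[taps[1] - 1]
            state = [feedback] + state[:-1]
        return np.array(chips) * 2 - 1          # map {0,1} to {-1,+1}

    def bpsk_modulate(chips, fs=48000, fc=12000, samples_per_chip=8):
        """BPSK-modulate the chip sequence onto a 12 kHz carrier."""
        baseband = np.repeat(chips, samples_per_chip).astype(float)
        t = np.arange(baseband.size) / fs
        return baseband * np.cos(2 * np.pi * fc * t)

    def sliding_correlator(received, template):
        """Matched filter: correlation peaks mark candidate arrivals."""
        return np.correlate(received, template, mode="valid")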

10
Detection Algorithms
  • The earliest peak is the most correct; secondary peaks represent echoes
  • Need to estimate noise level in order to
    correctly identify early peak
  • Noise level changes dynamically over the duration
    of the measurement
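
One way to implement such an adaptive earliest-peak detector is sketched below (the window length and threshold factor are illustrative assumptions):

    import numpy as np

    def earliest_peak(correlation, noise_window=1000, k=5.0):
        """Index of the earliest correlator output that clears a noise
        threshold estimated from the samples just before it; later,
        possibly larger peaks are treated as echoes."""
        mag = np.abs(correlation)
        for i in range(noise_window, mag.size):
            noise = mag[i - noise_window:i]
            threshold = noise.mean() + k * noise.std()
            if mag[i] > threshold:
                return i
        return None  # no detection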

11
Initial Experiments
  • We performed some initial experiments
  • Indoors, in our lab
  • No special effort to prevent environmental noise
    or multipath interference
  • For each experiment, statistical analysis of 20
    trials
  • Experimented with
  • Measurement of distances from 0-8m
  • Temperature / Humidity dependence
  • Dependence on orientation of speaker/microphone
  • Interference from various obstructions

12
Initial Results
  • When the system works, it works well
  • Sub-cm ranging accuracy
  • (Averaged data after removing outliers)
  • No significant error as a function of distance up
    to 10m

13
Fixing minor problems
  • In this process, we added numerous algorithms and
    fixes to improve the performance of the estimator
  • Reasonable results can be achieved by optimizing
    the performance of the acoustic system
  • However, no amount of optimization solves all the
    problems

14
Problems: Temp/Humidity
  • Variations in speed of sound
  • Problem: Dependence on temperature and humidity
  • Solution: model the speed of sound based on averaged readings from temperature/humidity sensors (see the sketch below)
  • Local variations (< sensor granularity)
  • Problem: variations over short distances (e.g. sunlight heating a surface)
  • Solution: long-term averaging
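
As an illustration only (a textbook first-order approximation, not the calibration used in this work), the temperature dependence of the speed of sound can be modeled as:

    def speed_of_sound_m_s(temp_c):
        """Approximate speed of sound in dry air (m/s) as a function of
        temperature; humidity adds a further correction of a few tenths
        of a m/s that would come from the humidity sensor."""
        return 331.3 + 0.606 * temp_c

    # Example: at 25 C a 10 ms time of flight spans about 3.46 m,
    # versus about 3.43 m at 20 C; a few cm of error if uncorrected.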

15
Problems: Orientation
  • Orientation dependence
  • Problem: When the speaker or microphone points away from the line of sight, sound diffracts around edges, resulting in several cm of error.
  • Solution: Two speakers back to back and two microphones back to back. The microphones are input through the stereo line-in port, and some orientation information can be recovered by comparing phase (see the sketch below).
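
A sketch of the delay comparison between the two line-in channels (the sample rate and sign convention are assumptions):

    import numpy as np

    def inter_mic_delay_s(left, right, fs=48000):
        """Estimate the arrival-time difference between the two back-to-back
        microphones by cross-correlating the stereo channels."""
        xcorr = np.correlate(left, right, mode="full")
        lag = int(np.argmax(np.abs(xcorr))) - (len(right) - 1)
        return lag / fs

    # A positive delay means the left channel is delayed, i.e. the sound
    # reached the 'right' microphone first, giving a coarse cue to which
    # side of the node faces the source.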

16
Problems: Obstructions
  • Obstructions to the line-of-sight
  • Problem: Obstructions can reliably prevent detection of the line-of-sight path
  • Solution: Under some circumstances, obstructed conditions can be detected
  • Multimodal distributions in successive estimates
  • Failure of the triangle inequality (see the sketch below)

[Figures: transient detection failure leading to a multimodal distribution; failure of the triangle inequality]
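
A triangle-inequality consistency check over pairwise range estimates might look like this sketch (node identifiers and tolerance are hypothetical):

    from itertools import combinations

    def inconsistent_triangles(ranges, tol=0.05):
        """Flag node triples whose pairwise range estimates violate the
        triangle inequality, hinting that at least one path is obstructed.
        `ranges` maps frozenset({a, b}) -> estimated distance in meters."""
        nodes = sorted({n for pair in ranges for n in pair})
        flagged = []
        for a, b, c in combinations(nodes, 3):
            try:
                ab = ranges[frozenset((a, b))]
                bc = ranges[frozenset((b, c))]
                ca = ranges[frozenset((c, a))]
            except KeyError:
                continue  # skip triples with a missing measurement
            if ab > bc + ca + tol or bc > ab + ca + tol or ca > ab + bc + tol:
                flagged.append((a, b, c))
        return flagged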
17
Obstructed LOS not always locally detectable
  • Obstructed LOS can introduce ambiguities that are
    not always locally resolvable based on acoustic
    range data
  • For example, reflections can cause ambiguity.
    The simplest explanation (blue nodes) is not the
    true explanation (green nodes).
  • To solve this kind of problem, alternative
    hypotheses must be formulated.
  • These hypotheses can have non-local effects (BAD
    for scaling)

18
Cross-validation
  • Cross-validation across sensor modes
  • Simpler, local algorithms
  • Less communication overhead
  • Resolves ambiguities faster, more certainty
  • Example: Cameras and IR LEDs
  • Camera sees LEDs (orange dots)
  • LEDs emit ID coded pulses
  • Acoustic ranges estimated between all components

19
Rich set of cross-checks
  • Provides rich set of cross-validation
    opportunities
  • Any LED seen by the camera is known to be LOS; therefore the acoustic range between the camera and that node can be trusted as a true line-of-sight measurement (see the sketch after this list)
  • LED images have known range (from acoustics); therefore a depth map can be derived from only one camera
  • With a depth map, the distance between two
    visible LEDs can be approximately determined,
    enabling further validation
  • Stereopsis is simplified, because the coded LED
    pulses solve the correspondence problem
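
A minimal sketch of the first cross-check (data structures and IDs are hypothetical): any node whose ID-coded LED was decoded in the camera image is known to be LOS, so its acoustic range to the camera can be accepted directly.

    def los_validated_ranges(acoustic_ranges_m, decoded_led_ids):
        """Keep only the camera-to-node acoustic ranges whose node's IR LED
        was seen (and its ID decoded) by the camera, i.e. known-LOS links."""
        return {node: dist
                for node, dist in acoustic_ranges_m.items()
                if node in decoded_led_ids}

    # Example with hypothetical IDs:
    # los_validated_ranges({'n1': 2.41, 'n2': 5.10}, {'n1'}) -> {'n1': 2.41}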

20
Conclusions and Future Work
  • Conclusion: Lots of future work
  • Complete the implementation of dual-microphone orientation estimation; upgrade the testbed
  • Multilateration experiments with the upgraded testbed
  • Camera/LED system: characterization and integration