Ad-hoc Deployable Fine-Grained Localization
Lewis Girod and Deborah Estrin, Laboratory for
Embedded Collaborative Systems --
http://lecs.cs.ucla.edu
Goal: Form a local coordinate system with sub-cm
accuracy that is ad-hoc deployable and
self-calibrating.
- Collaborative Sensing, Multisensor Integration
- Energy-Efficient Geographical Routing
Ad-hoc Deployable: Deployable using at most rules
of thumb to guide sensor placement. In order to
scale to dense installations, deployment must be
fast and easy. For truly ad-hoc deployment (e.g.
air-drop), compensate through high densities.
Self-Calibrating: Self-calibrating and
self-configuring, using active calibration
techniques. In order to scale to large, dense
networks, the system must calibrate and configure
itself without user involvement and without
global coordination.
Energy, Expense, Form Factor: Fits into the
context of low-power wireless sensor nodes.
Communication consumes energy, so local
coordination is preferable. Nodes should be small
and cheap enough to be deployed densely.
Environment Independent: Resilient to varied and
dynamic environmental conditions such as foliage,
clutter, and interference from noise and
reflections, both indoors and outdoors.
Local Coordination Algorithms: Make decisions
locally, without depending on global state.
Leverage high sensor densities to replace global
optimization with local cross-validation and
integration of different sensor modalities.
- Future Work: Multimodal Localization
- Any single mode of sensing can suffer from
  undetectable and often consistent errors
  - GPS: obstructed sky, RF jamming, consistent
    reflections
  - Acoustic: detection of reflected paths in the
    NLOS case
  - Stereo vision: visual ambiguities (wallpaper
    illusion)
- Use cross-validation across sensory modes to
  locally eliminate erroneous data
  - Cross-validation uses local algorithms,
    reducing communications overhead
  - Ambiguities are resolved more quickly
  - Use of different modes of sensing reduces the
    probability of correlated ambiguities
- First Results: Acoustic Ranging
- Cooperative acoustic ranging
  - Audible wideband ranging signal (511-bit
    M-sequence)
  - Detected by a sliding correlator to measure
    time of flight
- Accurate ranging in line-of-sight conditions
  - Sub-cm accuracy, error independent of range
  - Resilient to multipath interference and
    ambient noise
- Obstructions to line of sight result in errors
  - Consistent detection of reflected paths
  - Non-Gaussian and multimodal error
    distributions
  - Reflection errors not always detectable
    statistically
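The M-sequence and sliding-correlator detection above can be sketched in Python. This is an illustrative reconstruction, not the authors' implementation: the LFSR taps, the 4-samples-per-chip upsampling, the simulated amplitudes, and the half-maximum first-peak threshold are all assumptions.

```python
import numpy as np

def m_sequence(degree=9, taps=(9, 5)):
    # Fibonacci LFSR over GF(2); taps (9, 5) correspond to the primitive
    # polynomial x^9 + x^5 + 1, giving a maximal period of 2^9 - 1 = 511.
    state = [1] * degree
    out = []
    for _ in range(2 ** degree - 1):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t - 1]
        state = [fb] + state[:-1]
    return np.array(out)

def first_peak_delay(rx, template, threshold_frac=0.5):
    # Sliding correlator: cross-correlate the received signal with the
    # known template, then take the FIRST local maximum above a threshold,
    # so a weaker direct path beats a stronger reflection arriving later.
    corr = np.correlate(rx, template, mode="valid")
    i = int(np.flatnonzero(corr > threshold_frac * corr.max())[0])
    while i + 1 < len(corr) and corr[i + 1] > corr[i]:
        i += 1  # climb to the top of the first peak
    return i

rng = np.random.default_rng(0)
chips = 2 * m_sequence() - 1     # 511 chips mapped to +/-1
template = np.repeat(chips, 4)   # 4 samples per chip (assumed)

# Simulated channel: weak direct path at sample 123, a stronger reflection
# arriving at sample 200, plus ambient noise.
rx = 0.3 * rng.standard_normal(len(template) + 500)
rx[123:123 + len(template)] += 0.6 * template
rx[200:200 + len(template)] += 1.0 * template

delay = first_peak_delay(rx, template)  # recovers the direct path, sample 123
# range in meters would be delay / fs * 343.0 for an assumed sample rate fs
```

Taking the first peak rather than the global maximum is what makes the detector prefer a weak line-of-sight arrival over a stronger reflection.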
Figure (D) shows cameras and LEDs combined with
acoustic ranging. Acoustic ranging is used to
discover all-pairs ranges (shown as arrows). The
camera sees the LEDs (red dots) and combines the
acoustic range data with image data to form a
depth map. LEDs are correlated to nodes through
coded patterns. Angular offsets can be used to
cross-validate the acoustic ranges. Visible LEDs
suggest line of sight, and thus an accurate
acoustic range. Two cameras can use the LEDs to
solve the correspondence problem, simplifying
stereopsis.
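One way the angular cross-validation could work, sketched here as a hypothetical example (the function name, tolerance, and geometry are illustrative assumptions, not the authors' algorithm): a camera node A measures the angle between the LEDs of nodes B and C, the law of cosines predicts the B-C distance from the A-B and A-C ranges, and an acoustic B-C range far longer than the prediction flags a reflected (NLOS) path.

```python
import math

def consistent(d_ab, d_ac, d_bc, theta, tol=0.02):
    # Camera node A sees the LEDs of nodes B and C separated by angle theta.
    # The law of cosines predicts the B-C distance from the A-B and A-C
    # acoustic ranges; a reflected (NLOS) B-C path measures too long.
    predicted = math.sqrt(d_ab**2 + d_ac**2 - 2 * d_ab * d_ac * math.cos(theta))
    return abs(d_bc - predicted) <= tol * predicted

# B at (1.0, 0.0) and C at (0.8, 0.6) relative to camera node A.
d_ab, d_ac = 1.0, 1.0
theta = math.atan2(0.6, 0.8)          # angular offset seen by the camera
true_bc = math.hypot(1.0 - 0.8, 0.6)  # actual B-C distance

print(consistent(d_ab, d_ac, true_bc, theta))        # True: LOS range agrees
print(consistent(d_ab, d_ac, true_bc + 0.3, theta))  # False: reflected path
```

Because the check uses only ranges and angles local to the three nodes, it is exactly the kind of local algorithm the poster argues for: no global state is needed to discard the bad measurement.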
[Figure panels (C), (D): correlator detects first
peak; ranging experiment shows linearity; system
diagram of the rangefinder, with Radio, CPU,
Speaker (Kingstate KDS-27008), and Microphone on
each node.]
Ref: Girod and Estrin, "Robust Range Estimation
Using Acoustic and Multimodal Sensing," in
submission to Proc. 2001 International Conference
on Intelligent Robots and Systems (IROS), October
2001, Maui, HI, USA.