1
LIDAR (Light Detection and Ranging)
Dr. John R. Jensen, Department of Geography,
University of South Carolina, Columbia, SC 29208
Jensen, 2008
2
Digital Elevation Models
  • A digital elevation model (DEM) is defined as a
    file or database containing elevation points over
    a contiguous area. DEMs may be subdivided into
  • digital surface models (DSM) that contain
    elevation information about all features in the
    landscape, such as vegetation, buildings, and
    other structures and
  • digital terrain models (DTM) that contain
    elevation information about the bare-Earth
    surface without the influence of vegetation or
    man-made structures.

Jensen, 2008
3
Sources of Digital Elevation Models
  • Four major technologies are used to obtain
    elevation information, including
  • in situ surveying
  • photogrammetry (Chapter 6)
  • Interferometric Synthetic Aperture Radar
    (IFSAR) (Chapter 9), and
  • Light Detection and Ranging (LIDAR).

Jensen, 2008
4
Sources of Digital Elevation Models
In situ surveying using conventional surveying
(e.g., total station) or GPS instruments can
yield accurate x,y,z information. However, field
surveys are time-consuming and expensive on a
per-point basis. Even with GPS, it is often
difficult for surveyors to obtain points in thick
undergrowth. Due to these obstacles, the density
of x,y,z observations obtained in an area is
sometimes low. It then becomes necessary to
interpolate between distant points to produce a
digital terrain model (DTM) of the area.
Jensen, 2008
5
Sources of Digital Elevation Models
Photogrammetric techniques are routinely used to
collect x,y,z topographic information.
Photogrammetric surveys can map large areas
during the leaf-off season and yield a dense
collection of points. Chapter 6 summarized how
digital elevation models are extracted directly
from stereoscopic aerial photography.
Photogrammetry can be used to obtain dense
elevation information in inhospitable terrain. In
addition, an analyst can selectively obtain a
greater density of x,y,z observations associated
with critical features such as ridgelines or
steep changes of slope (e.g., along road
overpasses) called breaklines.
Jensen, 2008
6
Sources of Digital Elevation Models
LIDAR offers an alternative to in situ field
surveying and photogrammetric mapping techniques
for the collection of elevation data. LIDAR
technology can be used to provide elevation data
that is accurate, timely, and increasingly
affordable in inhospitable terrain. LIDAR does
not, however, allow the analyst to control the
placement of individual x,y,z measurements on
ridgelines or breaklines.
Jensen, 2008
7
LIDAR History
The first optical laser was developed in 1960 by
Hughes Aircraft, Inc. Laser instruments were soon
used to compute distance by measuring the travel
time of light from a laser transmitter to a
target and then back to a laser receiver. Early
remote sensing LIDAR systems could only collect
measurements directly underneath the aircraft,
creating a single profile of elevation
measurements across the landscape (e.g., Jensen
et al., 1987). The synergistic use of kinematic
GPS and inertial measurement units (IMUs) on
airborne LIDAR scanning systems has allowed the
technology to mature rapidly. LIDAR-derived
horizontal and vertical accuracies and cost of
operation are now similar to those of
photogrammetry.
Jensen, 2008
8
LIDAR Laser and Scanning System
The LIDAR instrument consists of a system
controller and a transmitter and receiver. As the
aircraft moves forward along the line-of-flight,
a scanning mirror directs pulses of laser light
across-track perpendicular to the line-of-flight.
Jensen, 2008
9
(No Transcript)
10
LIDAR Data Collection
11
LIDAR Laser and Scanning System
  • LIDAR systems used for topographic mapping use
    eye-safe near-infrared laser light in the region
    from 1040 to 1060 nm.
  • Blue-green lasers centered at approximately 532
    nm are used for bathymetric mapping due to their
    water penetration capability.
  • LIDAR data can be collected at night if necessary
    because it is an active system, not dependent on
    passive solar illumination.

Jensen, 2008
12
LIDAR Laser and Scanning System
LIDAR systems can emit pulses at rates > 100,000
pulses per second, referred to as the pulse
repetition frequency. A pulse of laser light
travels at c, the speed of light (3 × 10⁸ m s⁻¹).
LIDAR technology is based on the accurate
measurement of the laser pulse travel time from
the transmitter to the target and back to the
receiver. The travel time of a pulse of light,
t, is t = 2R / c, where R is the range (distance)
between the LIDAR sensor and the object. The
range, R, can be determined by rearranging the
equation: R = (c · t) / 2.
Jensen, 2008
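Below is a minimal Python sketch of the two-way travel-time relationship just described (t = 2R/c rearranged to R = c·t/2); the function and variable names are illustrative, not taken from any vendor software.

```python
# Two-way travel-time to range, as described above. Names are illustrative only.
C = 3.0e8  # speed of light, m/s

def range_from_travel_time(t_seconds):
    """Return the sensor-to-target range R (m) for a round-trip time t (s): R = c*t/2."""
    return C * t_seconds / 2.0

# Example: a pulse returning after 5 microseconds corresponds to a ~750 m range.
print(range_from_travel_time(5.0e-6))  # 750.0
```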
13
LIDAR Laser and Scanning System
The range measurement process results in the
collection of elevation data points (commonly
referred to as masspoints) arranged
systematically in time across the flightline. The
example displays masspoints associated with the
ground, several powerlines, a pole, and tree
canopy.
Jensen, 2008
14
Masspoints Used to Create LIDAR-derived IDW Bare
Earth DEM
The equivalent of locating 75,000 surveyors in
the field per second.
15
LIDAR Laser and Scanning System
The laser footprint is approximately circular on
the ground and varies with the scan angle and the
topography encountered. The diameter of the
instantaneous laser footprint (Fp_inst) on the
ground is computed by Fp_inst = (h / cos² θ_inst) · γ,
where h is the altitude of the aircraft AGL,
θ_inst is the instantaneous scan angle under
investigation, and γ is the divergence of the
laser beam. For example, if h = 750 m AGL,
θ_inst = 15°, and γ = 1 mrad, then
Fp_inst ≈ 0.80 m for flat terrain. At a flying
height of 1,000 m AGL, a typical laser beam
divergence of 1 mrad results in a 1 m diameter
footprint.
Jensen, 2008
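A short Python sketch of the footprint calculation as reconstructed above (flat-terrain approximation; the function name is illustrative):

```python
import math

def instantaneous_footprint(h_agl_m, scan_angle_deg, beam_divergence_rad):
    """Footprint diameter on flat terrain: Fp_inst = (h / cos^2(theta_inst)) * gamma."""
    theta = math.radians(scan_angle_deg)
    return h_agl_m * beam_divergence_rad / math.cos(theta) ** 2

print(round(instantaneous_footprint(750, 15, 0.001), 2))  # ~0.80 m
print(instantaneous_footprint(1000, 0, 0.001))            # 1.0 m at nadir
```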
16
LIDAR Laser and Scanning System
The across-track swath width (sw) is given
by sw = 2 · h · tan(θ / 2), where θ is the total
scan angle. For example, if h = 750 m AGL and
θ = 30°, then sw ≈ 402 m.
Jensen, 2008
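The same relationship as a Python sketch (θ here is the full scan angle, as in the example above):

```python
import math

def swath_width(h_agl_m, total_scan_angle_deg):
    """Across-track swath width: sw = 2 * h * tan(theta / 2)."""
    return 2.0 * h_agl_m * math.tan(math.radians(total_scan_angle_deg) / 2.0)

print(round(swath_width(750, 30)))  # ~402 m
```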
17
LIDAR Laser and Scanning System
The point spacing across-track (p_spacing) is
dependent upon the pulse repetition frequency
(PRF), the altitude of the aircraft (h) AGL, the
instantaneous angular scanning speed (α_inst) in
radians per second, and the instantaneous scan
angle (θ_inst). Actual sampling densities for
an area also depend on the forward speed of the
aircraft. Typical sampling densities on the
ground range from 1 point per 20 m² to 20
points per m² for a flying height of 1,000 m.
These higher densities are typically achieved
using multiple overlapping flightlines to cover
the study area. The number of observed ground
returns is almost always less than the number of
emitted pulses.
Jensen, 2008
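As a rough illustration of how the quantities above combine, the average single-pass point density can be approximated as the PRF divided by the area swept per second. This sketch is only an approximation and not the across-track spacing formula referenced above: it ignores scan-angle effects and assumes every emitted pulse yields a ground return.

```python
def average_point_density(prf_hz, swath_width_m, ground_speed_m_s):
    """Rough points-per-square-metre for one pass: pulses emitted per second
    spread over the area swept per second (swath width x forward distance)."""
    return prf_hz / (swath_width_m * ground_speed_m_s)

print(round(average_point_density(100_000, 402, 60), 1))  # ~4.1 points per m^2
```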
18
LIDAR Laser and Scanning System
The maximum off-nadir scan angle can be adjusted
to meet the needs of a data-collection mission.
The greater the off-nadir scan angle, the more
vegetation the pulse must penetrate before a
return can be received from the ground, assuming
a uniform canopy.
Jensen, 2008
19
LIDAR Laser and Scanning System
LIDAR data may be used to prepare digital terrain
or digital surface models such as the one shown
which was used to identify the optimum location
for a new railroad line near Aiken, SC.
Jensen, 2008
20
LIDAR Laser and Scanning System
  • LIDAR remote sensing avoids the problems of
    aerial triangulation and orthorectification
    because each LIDAR posting is individually
    georeferenced. However, it takes substantial
    processing to turn the laser range information
    into georeferenced masspoints. Some of the most
    important variables used in the processing
    include
  • x,y,z location of laser in 3-dimensional space
    at the time of the laser pulse
  • attitude (roll, pitch, and heading) of the
    laser at the time of the laser pulse
  • scan angle of the LIDAR at the time of the
    laser pulse
  • effect of atmospheric refraction on the speed
    of light
  • laser pulse travel time from the LIDAR
    instrument to the target (ground) and back.

21
LIDAR Laser Location
  • It is important to know the exact location of the
    LIDAR laser at all times during data collection.
    This is accomplished using Differential Global
    Positioning System (DGPS) technology. DGPS is
    based on the use of two GPS receivers that
    simultaneously record positional information.
  • A terrestrial GPS base station is located at an
    accurately surveyed location with well-documented
    x, y, and z-coordinates. The terrestrial base
    station records its GPS position for the duration
    of the LIDAR data collection mission.
  • A second GPS receiver is located on the aircraft
    and records the LIDAR antenna position at all
    times during data collection.

22
LIDAR Laser Location
After the LIDAR data are collected, the data from
both GPS units (one on the ground and one in the
aircraft) are post-processed along with the known
location of the base station antenna. This
process determines the exact location of the
aircraft's antenna for the entire flight. The
accuracy of the aircraft position is typically <
5 to 10 cm, and is output in units of latitude,
longitude, and ellipsoidal height in the WGS 84
coordinate system.
23
LIDAR Antenna Attitude (Orientation)
It is necessary to have accurate LIDAR antenna
orientation information (roll, pitch, and
heading) at all times during data collection.
This is measured by an inertial measurement unit
(IMU). The IMU uses roll, pitch, and yaw
gyroscopes and accelerometers to measure the
orientation of the LIDAR antenna at the exact
moment every pulse is transmitted and received.
After the LIDAR data are collected, data from the
IMU are post-processed along with the GPS-derived
antenna position data to output a file indicating
the trajectory of the aircraft and the laser
antenna at all times during the LIDAR mission.
The output file documents the position of the
aircraft (latitude, longitude, and ellipsoidal
height) and sensor orientation (roll, pitch, and
heading), indexed by GPS time.
24
LIDAR Post-Processing of Multiple Returns
Thus far we have collected GPS data, IMU data,
and sent and received laser pulses. How are these
data turned into digital elevation values at
specific x,y and z locations on the surface of
the Earth? This is accomplished through LIDAR
post-processing which takes place after the
aircraft has landed. Post-processing software
is used to associate 1) LIDAR antenna x,y,z
position, 2) antenna roll, pitch, and yaw
orientation, and 3) LIDAR range (distance)
information into a set of latitude, longitude,
and altitude (x,y,z) coordinates for each LIDAR
return. The output is typically in a compact
binary format of WGS 84 coordinates, with options
for converting the output to ASCII formats,
and/or UTM coordinates.
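A heavily simplified Python sketch of how these three pieces of information combine into a ground coordinate. It ignores lever-arm offsets, boresight calibration, atmospheric refraction, and datum transformations, and the rotation convention shown is only one of several in use.

```python
import numpy as np

def rotation_zyx(roll, pitch, yaw):
    """Body-to-local rotation built from roll (x), pitch (y), and yaw (z) in radians."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def georeference_return(antenna_xyz, roll, pitch, yaw, scan_angle, rng):
    """Combine antenna position, attitude, scan angle, and range into an x,y,z point."""
    beam = np.array([0.0, rng * np.sin(scan_angle), -rng * np.cos(scan_angle)])
    return np.asarray(antenna_xyz, dtype=float) + rotation_zyx(roll, pitch, yaw) @ beam

# A nadir pulse from 1,000 m with level attitude lands 1,000 m below the antenna.
print(georeference_return([0.0, 0.0, 1000.0], 0.0, 0.0, 0.0, 0.0, 1000.0))  # [0. 0. 0.]
```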
25
LIDAR Returns
A pulse of laser energy exiting the transmitter
is directed toward the terrain at a certain angle
by the rotating mirror. Depending upon the
altitude of the LIDAR instrument AGL and the
angle at which the pulse is sent, each pulse
illuminates a near-circular area on the ground
called the instantaneous laser footprint, e.g.,
30 cm in diameter. This single pulse can generate
one return or multiple returns. The figure
depicts how multiple returns might be produced
from a single pulse. All of the energy within
laser pulse A interacts with the ground. One
would assume that this would generate only a
single return. However, if there are any
materials whatsoever with local relief within the
instantaneous laser footprint (e.g., grass, small
rocks, twigs), then there will be multiple
returns. The 1st return will come from these
materials that have local relief (even on the
order of 3 to 5 cm) and the 2nd and perhaps last
return will come from the bare-Earth. Although
not identical, the range (distance) associated
with the first and last returns will be very
similar.
26
LIDAR Return Logic
  • 1st return
  • n intermediate returns
  • Last return

27
LIDAR Returns
Laser pulse B encounters two parts of a tree at
different elevations and then the bare Earth. In
the example, part of pulse B encounters a branch
at 3 m AGL causing some of the incident laser
pulse to be backscattered toward the LIDAR
receiver. This is recorded as the 1st return. The
remainder of the pulse continues until at 2 m AGL
it encounters another branch that scatters energy
back toward the LIDAR receiver. This is recorded
as the 2nd return. In this example, approximately
one-half of the pulse finally reaches the ground,
and some of it is backscattered toward the LIDAR
receiver. This is the last return. If we wanted
information about the height of the tree and its
structural characteristics then we would be
interested in the 1st, 2nd, and last return
associated with pulse B. If we are only
interested in creating a bare-Earth digital
terrain model then we would be interested in the
last return associated with pulses A and B.
28
LIDAR Returns
  • Thus, each laser pulse transmitted from the
    aircraft can yield multiple returns. This is
    referred to as multiple-return LIDAR data.
    Post-processing the original data results in
    several LIDAR files commonly referred to as
  • 1st return
  • possible intermediate returns
  • last return and
  • intensity.
  • The masspoints associated with each return file
    (e.g., 1st return) are distributed throughout the
    landscape at various densities depending upon the
    scan angle, the number of pulses per second
    transmitted (e.g., 50,000 pps), aircraft speed,
    and the materials that the laser pulses
    encountered. Areas on the ground that do not
    yield any LIDAR-return data are referred to as
    data voids.

29
LIDAR Returns
  • LIDAR data collection vendors deliver LIDAR data
    according to user specifications. For example,
    the Savannah River Site LIDAR return data were
    delivered in ASCII format as
  • time of day, x-coordinate, y-coordinate,
    z-coordinate, and intensity (a minimal parsing
    sketch follows this slide).
  • The ASCII format makes it easy to input the LIDAR
    data into a GIS for examination and analysis. The
    LIDAR returns were processed by the vendor into
    separate files of
  • 1st returns
  • last returns and
  • bare-Earth returns.
  • Bare-Earth returns were derived from the first
    and last returns using a post-processing
    procedure (to be discussed).
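A minimal parsing sketch for a delivery laid out this way. The comma delimiter and field order are assumptions about this particular delivery, not a general LIDAR standard.

```python
def read_ascii_returns(path):
    """Read records of: time of day, x, y, z, intensity (one return per line)."""
    records = []
    with open(path) as f:
        for line in f:
            t, x, y, z, intensity = (float(v) for v in line.split(","))
            records.append((t, x, y, z, intensity))
    return records
```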

30
Extraction of First, Intermediate, and/or Last
Return Digital Surface Models
Masspoints associated with the last returns from
the LIDAR mission flown over the Savannah River
Site on October 10, 2004 are shown in Figure
10-3a. Each masspoint has a unique x,y location.
The last-return dataset contains points derived
from multiple, overlapping flightlines of
last-return LIDAR data. We could click on the
individual points in Figure 10-3a and obtain the
elevation. This is useful but does not reveal any
elevation patterns in the dataset. Therefore, it
is common to use digital image processing
techniques to enhance our understanding of the
masspoint dataset. For example, the individual
masspoints were processed using inverse distance
weighting (IDW) interpolation logic to create a
raster (grid) of elevation values every 0.25 ×
0.25 m (Figure 10-3b). The interpolation process
creates a digital surface model (DSM) that
contains the elevation characteristics of all the
trees, shrubs, and man-made structures. The
brighter the pixel in the DSM, the greater the
elevation. For example, the buildings in Figure
10-3b are higher than the surrounding ground;
therefore, the buildings appear brighter than the
ground. The original masspoints have been
overlaid onto the IDW digital surface model for
illustrative purposes in Figure 10-3b.
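A brute-force Python sketch of the IDW gridding step described above (a production workflow would use a spatial index such as a KD-tree; the 0.25 m cell size matches the example):

```python
import numpy as np

def idw_grid(xs, ys, zs, cell_size=0.25, power=2.0, k=8):
    """Interpolate scattered masspoints to a raster using inverse distance weighting:
    each cell takes the 1/d^power weighted mean of its k nearest points."""
    xs, ys, zs = map(np.asarray, (xs, ys, zs))
    gx = np.arange(xs.min(), xs.max() + cell_size, cell_size)
    gy = np.arange(ys.min(), ys.max() + cell_size, cell_size)
    grid = np.empty((gy.size, gx.size))
    for i, yc in enumerate(gy):
        for j, xc in enumerate(gx):
            d = np.hypot(xs - xc, ys - yc)
            nearest = np.argsort(d)[:k]
            w = 1.0 / np.maximum(d[nearest], 1e-6) ** power
            grid[i, j] = np.sum(w * zs[nearest]) / np.sum(w)
    return gx, gy, grid
```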
31
(No Transcript)
32
Extraction of First, Intermediate, and/or Last
Return Digital Surface Models
The LIDAR-derived IDW DSM can be made even easier
to interpret by applying a shaded-relief
algorithm that highlights the terrain as if it
were illuminated by the Sun from a specific
direction (e.g., from the northwest). An example
is presented in Figure 10-3c. The original
masspoints have been overlaid on the
shaded-relief display for illustrative purposes.
Draping masspoints onto orthophotography is also
very useful (Figure 10-3d).
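A compact Python sketch of a shaded-relief (hillshade) computation of this kind, with the sun placed in the northwest (azimuth 315°). Azimuth and aspect conventions differ slightly between packages, so treat this as illustrative.

```python
import numpy as np

def hillshade(dem, cell_size=0.25, azimuth_deg=315.0, altitude_deg=45.0):
    """Shade a DSM/DTM grid as if illuminated from the given sun azimuth and altitude."""
    az = np.radians(azimuth_deg)
    zenith = np.radians(90.0 - altitude_deg)
    dz_dy, dz_dx = np.gradient(dem, cell_size)
    slope = np.arctan(np.hypot(dz_dx, dz_dy))
    aspect = np.arctan2(dz_dy, -dz_dx)
    shaded = (np.cos(zenith) * np.cos(slope) +
              np.sin(zenith) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0.0, 1.0)
```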
33
(No Transcript)
34
Extraction of First, Intermediate, and/or Last
Return Digital Surface Models
Most LIDAR projects generate so much data that it
is necessary to subdivide the dataset into tiles.
We have been examining a small portion of one
tile. The first- and last-return digital surface
models for an entire tile are shown in Figures
10-4a and 10-4b. This graphic depicts both the
first- and last-return data as inverse distance
weighted (IDW) digital surface models (DSMs). First- and
last-return IDW DSMs are then portrayed in
shaded-relief format.
35
(No Transcript)
36
First Return
Last Return
Bare Earth
37
Extraction of Bare-Earth Digital Terrain Models
If the purpose of a LIDAR overflight is to
collect data to create a digital terrain model,
the presence of vegetation (and other surface
obstructions) can be a nuisance. In areas covered
by dense vegetation, the majority of the LIDAR
returns will be from the canopy, with only a few
pulses reaching the ground. Hendrix (1999) found
that up to 93% of LIDAR pulses never reached the
ground in mixed bottomland hardwoods near Aiken,
SC. Separating ground returns from vegetation
canopy returns can be problematic. Nevertheless,
it can be done.
38
Extraction of Bare-Earth Digital Terrain Models
A bare-Earth digital terrain model (DTM) may be
created by systematically removing masspoints in
the first, intermediate, and/or last return LIDAR
data that come from trees, shrubs, and even grass
that extend above the bare ground. This procedure
is typically performed in two steps: 1)
semi-automatic masspoint filtering, and 2)
manual masspoint editing.
39
Semiautomatic Vegetation and/or Man-made
Structure Masspoint Filtering
Many landscapes contain dense vegetation. If the
goal is to produce a bare-Earth digital terrain
model, then it is imperative to have a technique
that removes 90–98% of the above-ground
vegetation masspoints. This is done using a
filtering algorithm that systematically passes
through the LIDAR dataset examining each
masspoint and the elevation characteristics
associated with its n nearest-neighbors. The
filter then identifies those points that are a)
bare ground, b) scrub-shrub, c) trees, and/or d)
man-made structures.
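The vendor filters used in practice are proprietary, but the neighbourhood idea can be illustrated with a naive Python sketch: flag a masspoint as above-ground if it sits well above the lowest elevation among its neighbours. Real filters (progressive TIN densification, morphological filtering, etc.) are far more sophisticated.

```python
import numpy as np

def flag_above_ground(xs, ys, zs, radius=5.0, height_threshold=0.5):
    """Return a boolean array: True where a point is more than height_threshold (m)
    above the lowest point within radius (m), i.e. likely vegetation or structure."""
    xs, ys, zs = map(np.asarray, (xs, ys, zs))
    above = np.zeros(zs.size, dtype=bool)
    for i in range(zs.size):
        near = np.hypot(xs - xs[i], ys - ys[i]) <= radius
        above[i] = zs[i] - zs[near].min() > height_threshold
    return above
```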
40
Manual Vegetation and/or Man-made Structure
Masspoint Editing
Semiautomatic filtering rarely identifies all
masspoints associated with shrubs, trees, and/or
man-made objects. It is usually necessary for a
well-trained analyst to visually examine the
results of applying the semiautomatic filter and
then identify and set aside any above-ground
masspoints that were not detected
semi-automatically. This approach may involve
viewing the point clouds in 3-dimensions or
overlaying the masspoints on rectified aerial
photography. The manually edited masspoints can
then be used to create a bare-Earth digital
terrain model (DTM). Figures 10-4c and 10-4f
depict a bare-Earth DTM in which all of the
vegetation and buildings have been removed.
Figure 10-5a presents an enlarged view of a small
part of the area shown in Figure 10-4c and 10-4f
with the filtered bare-Earth masspoints overlaid
on the last return IDW DSM. Note the data voids
on the building rooftops where the masspoints
were deleted by automated and/or manual
filtering. The bare-Earth masspoints can then be
processed to yield a DTM that contains no
buildings as shown in Figure 10-5d. Note also
that almost all of the vegetation present in the
first- and last-return LIDAR data (Figures
10-4a,b and 10-4d,e) is gone in the bare-Earth
DTM shown in Figure 10-4c and 10-4f.
41
(No Transcript)
42
Manual Vegetation and/or Man-made Structure
Masspoint Editing
It is important to note that the final bare-Earth
DTM may or may not include man-made structures
such as buildings. For example, if we are
concerned with creating a DTM that will be used
to model the flow of water on the ground, then we
would not want to use the DTM shown in Figure
10-4f (and Figure 10-5d) because the buildings
have been removed. Instead, we would want to
remove all above-ground vegetation and then
create the DTM. The DTM would contain bare-Earth
and building elevation information. A hydrologic
model using this DTM would flow water correctly
around buildings, not through them.
43
Orthophotograph 1 x 1 m
First Return rasterized using IDW
44
First Return rasterized using IDW
First Return analytical hill-shading
45
Last Return rasterized using IDW
Last Return analytical hill-shading
46
First Return rasterized using IDW
Last Return rasterized using IDW
47
Bare Earth rasterized using IDW
Bare Earth analytical hill-shading
48
LIDAR Intensity
Most LIDAR systems provide an intensity file in
addition to the multiple return data. The
recorded intensity is in most cases not the
integration of the returned echo from all the
pulse returns, but just its maximum. Several
factors influence intensity. First, laser light
is monochromatic and has an extremely small
bandwidth (e.g., 2–5 nm), usually centered on a
near-infrared wavelength (e.g., 1046 nm).
Conversely, multispectral remote sensing
bandwidths are often 50–100 nm wide, although
hyperspectral sensors might have bandwidths as
small as 10 nm. Other significant factors include
the range to the target, angle of incidence and
atmospheric dispersion. In addition, the system
controller also records the state of the AGC
(automatic gain control). The AGC circuit adjusts
the return signal gain in response to changes in
target reflectance. The gain and intensity values
may vary over a scene and from day to day. This
variability in gain control can cause problems
when interpreting intensity data.
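One commonly applied (though partial) remedy is a first-order range normalization of intensity. The sketch below assumes the per-return range is available; it does not correct for incidence angle or AGC state.

```python
def range_normalize_intensity(intensity, range_m, reference_range_m=1000.0):
    """Scale raw intensity by (R / R_ref)^2 so returns recorded at different
    ranges are roughly comparable; AGC and incidence-angle effects remain."""
    return intensity * (range_m / reference_range_m) ** 2
```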
49
LIDAR Intensity
Theoretically, one would think that because the
laser uses near-infrared light, the intensity
value at each of the masspoint locations would in
effect be similar to the reflectance found when
conducting optical remote sensing in the
near-infrared portion of the spectrum.
Unfortunately, this is not the case. For example,
Figure 10-6a presents a DSM produced using IDW
applied to last-return LIDAR data. Figure 10-6b
is an intensity DSM produced using IDW. The
intensity image contains a wealth of detail and
in many respects looks like a panchromatic aerial
photograph of the study area. However, there are
interesting anomalies when we compare the
intensity image to what we would expect to find
in a typical black-and-white near-infrared image.
50
(No Transcript)
51
LIDAR Intensity
First, consider the large forested area shown at
a in Figure 10-6b. This is a mature forest that
has substantial local relief as evidenced by the
bright return in the adjacent last return dataset
(Figure 10-6a). We would normally expect this
area to appear bright in a black-and-white
near-infrared image. Conversely, it appears quite
dark in the intensity image. Similarly,
consider several individual deciduous trees at
location b in Figure 10-6b and shown in Figure
10-7. These trees also exhibit less intensity
than expected. The natural grass surrounding
these trees has a higher intensity range than the
trees (Figure 10-7c).
52
First Return
Last Return
Intensity
53
(No Transcript)
54
Color-coded intensity
Intensity
1st return elevation
Color-coded intensity draped onto 1st return
elevation
55
Contours
Sometimes it is valuable to extract contours
(lines of equal elevation) from DSMs or DTMs to
highlight subtle differences in the terrain and
to identify depressions. For example, consider
the testpad mounds located at c in Figure 10-6b.
These testpads were engineered specifically to
test the effectiveness of various clay-cap
materials used to protect subsurface hazardous
materials. Various types of impermeable
polyurethane barriers were placed on the
testpads, covered with clay, and planted with
centipede grass. The surfaces were then left to
the ravages of nature. LIDAR first-return data of
the testpads processed using IDW are presented in
Figure 10-8a. A shaded-relief version of the IDW
DSM is shown in Figure 10-8b. Note that one-half
of the trees were removed in the northern half of
three of the testpads and not on the eastern-most
control testpad. The cleared areas on the three
testpads were sodded with new centipede grass at
the time of data collection.
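A short Python sketch of contour extraction from a gridded surface, using matplotlib's contouring for illustration; the default 0.25 m interval matches the 25 cm contours shown in the later figure.

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_contours(gx, gy, dtm, interval=0.25):
    """Draw contours (lines of equal elevation) at a fixed interval from a DTM grid."""
    levels = np.arange(np.floor(dtm.min()), np.ceil(dtm.max()) + interval, interval)
    cs = plt.contour(gx, gy, dtm, levels=levels, colors="black", linewidths=0.5)
    plt.clabel(cs, inline=True, fontsize=6, fmt="%.2f")
    plt.gca().set_aspect("equal")
    plt.show()
```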
56
Bare Earth rasterized using IDW
Bare Earth analytical hill-shading
57
LIDAR-derived TIN Bare Earth DEM overlaid with
Contours
25 cm contours
58
Accuracy of LIDAR Measurements
LIDAR is a relatively new technology. As such,
there is healthy skepticism about its ability to
provide x,y, and z-elevation information as
accurately as traditional in situ surveying and
photogrammetry. Fortunately, there are accuracy
assessment standards that can be used to provide
an unbiased assessment of the accuracy of
LIDAR-derived products.
59
Accuracy of LIDAR Measurements
To determine the accuracy of a LIDAR-derived
digital surface model (DSM) or bare-Earth digital
terrain model (DTM), it is customary to identify
in situ x,y,z checkpoints throughout the study
area using a higher accuracy technique, such as
total station surveying or differential GPS. Each
in situ checkpoint is then located in the
LIDAR-derived DSM or DTM. The nearest
LIDAR-derived information is then compared with
the location and elevation information associated
with each in situ checkpoint. This is performed
for a number of checkpoints. The results are then
used to compute the horizontal and vertical
accuracy of the LIDAR-derived data expressed as
the root mean squared error (RMSE). In 1998,
the Federal Geographic Data Committee (FGDC)
published the Geospatial Positioning Accuracy
Standards, Part 3: National Standard for Spatial
Data Accuracy (NSSDA) (FGDC, 1998). This standard
replaced both the United States National Map
Accuracy Standards (NMAS) published by the Office
of Management and Budget in 1947 and the American
Society for Photogrammetry and Remote Sensing
(ASPRS) ASPRS Accuracy Standards for Large-Scale
Maps published in 1990.
60
The Geospatial Accuracy Standard for horizontal
and vertical accuracy of spatial products put
forth by the FGDC is based on the computation of
RMSE = sqrt( Σ Δ² / n ), where Δ is the difference
between an in situ checkpoint measurement and a
remote sensing-derived measurement at the same
location. Horizontal accuracy (i.e., circular
standard error) at the 95% confidence level is
computed using Accuracy_r = 1.7308 × RMSE_r.
Vertical accuracy at the 95% confidence level is
computed using Accuracy_z = 1.9600 × RMSE_z.
Horizontal accuracy was 6.6 cm. Vertical
accuracy was 12.9 cm.
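A Python sketch of these computations (the 1.7308 horizontal multiplier assumes RMSE_x ≈ RMSE_y, per the NSSDA):

```python
import numpy as np

def nssda_accuracy(check_xyz, lidar_xyz):
    """RMSE-based NSSDA accuracies: horizontal (radial) at 95% = 1.7308 * RMSE_r,
    vertical at 95% = 1.9600 * RMSE_z, from checkpoint-vs-LIDAR differences."""
    d = np.asarray(lidar_xyz, dtype=float) - np.asarray(check_xyz, dtype=float)
    rmse_x = np.sqrt(np.mean(d[:, 0] ** 2))
    rmse_y = np.sqrt(np.mean(d[:, 1] ** 2))
    rmse_r = np.sqrt(rmse_x ** 2 + rmse_y ** 2)
    rmse_z = np.sqrt(np.mean(d[:, 2] ** 2))
    return 1.7308 * rmse_r, 1.9600 * rmse_z
```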
61
  • Check points
  • Sanborn (95 points)
  • NASA Verification and Validation Team

62
LIDAR (LIght Detection and Ranging)
Case Study
63
AS350BA LIDAR Platform
64
Sensor Systems and LIDAR
65
3D Visualization of LIDAR Data
Vertical Exaggeration Factor 5
66
Bridgestone/Firestone Tire Plant, Aiken,
SC. 1:12,000 color infrared aerial photography
obtained on August 21, 1998
67
Model Logic
68
Total Cost Surface and Optimal Route
Vertical Exaggeration Factor .01
69
Routes Derived from Traditional Methods and from
the Optimal Path Model
Vertical Exaggeration Factor 5
70
Sensor Systems and LIDAR
71
  • Classification of Land Cover based solely on
    LIDAR-derived Elevation, Slope, and Intensity
  • blue = buildings
  • green = grass
  • pink = vegetation