ERS186: Environmental Remote Sensing
Transcript and Presenter's Notes

1
ERS186: Environmental Remote Sensing
  • Lecture 11
  • The remote sensing process and the analysis of
    continuous and nominal variables

2
Outline
  • Conceptual basis of remote sensing research
  • Physical interpretation of RS data
  • Empirical interpretation of RS data
  • Continuous variables
  • Simple regression models
  • Linear spectral unmixing
  • Nominal variables
  • Classification techniques

3
The Goal of Remote Sensing
[Diagram: Remote Sensing Data -> Information, via two types of models: physical and empirical]
4
The state variable approach to modeling...a
physical approach
[Diagram labels: Data; Knowledge; Model based on State Variables; Some relationship; Estimation of the variable of interest; Output (information)]
5
Radiative Transfer State Variables
[Diagram: RT State Variables <-> Remote Sensing Data (direct relationship)]
  • RT state variables: the smallest set of variables needed to fully describe the RS data
  • Type(s) of media: atmosphere, vegetation, soil, etc.
  • Factor 1: spectral scattering, transmission, and absorption properties of the media (these are functions of time!)
  • Factor 2: architectural properties of the media: position, size, shape, orientation, density (these are functions of time!)
  • Factor 3: view and illumination directions

6
Aside: Why is time involved? Examples...
  • At the onset of moisture stress, soybean and cotton leaves droop and corn leaves roll into vertical cylinders.
  • With the onset of a strong wind, the irradiance on a deciduous forest floor increases dramatically as leaves minimize aerodynamic drag rather than maximize light interception.
  • In general, plant canopies grow and develop during a growing season; thus, their architecture and often the spectral properties of their components change over time.

In general, Factors 1 and 2 are stochastic variables, i.e. functions of time.
7
A physical RT modeling approach
[Diagram labels: Remotely Sensed Data; RT State Variable Model; Direct relationship; knowledge; Estimation of Variable of Interest; Output (information)]
8
Physical RT Models
[Diagram labels: Remote Sensing Data; RT State Variable model; Direct relationship; Some relationship; Estimation of Variable of Interest; Output]
  • If the variable of interest does NOT directly affect the RT state variables, RS alone is not sufficient to retrieve information on the variable of interest from a physical interpretation. Examples:
  • Bird nesting locations
  • Human population densities
  • Rooting depth of plants
  • Tsetse fly infestations
  • Note: most of the variables of interest we have covered in this class DO directly affect the RT state variables or ARE state variables themselves, which is why we covered them!

9
Physical RT Models
[Diagram labels: Remote Sensing Data; RT State Variables; RT Models; Invertible models; Some relationship; Variable of Interest]
  • Radiative transfer models
  • Try to predict RS data as a function of the RT state variables
  • RT models can be:
  • Economically invertible models: typically designed for simple scenes; have a small number of state variables
  • Non-economically invertible models: typically designed for complex scenes; have a large number of state variables

10
Empirical Models
[Diagram: Variable of Interest <-> Remote Sensing Data (empirical relationship)]
  • Empirical (statistical) relationships constitute
    the BULK of RS analysis.
  • These analyses allow us to determine IF there is
    a relationship, not WHY there is a relationship.
  • Two types of variables of interest:
  • Biophysical variables: RT state variables and functions of RT state variables (most of the variables covered in this class)
  • Hybrid variables: functions of at least one non-RT state variable

11
Biophysical Variables
  • Examples of common biophysical variables that affect RT:
  • Vegetation: Factor 1 - pigment concentration, foliar water content; Factor 2 - LAI, biomass
  • Temperature
  • Soil moisture (Factor 1)
  • Surface roughness (Factor 2)
  • Evapotranspiration
  • Atmosphere: chemistry, temperature, water vapor, wind speed/direction, energy inputs, precipitation, cloud and aerosol properties
  • Ocean: color, phytoplankton, chemistry
  • Snow and sea ice characteristics
  • Spatial: x, y, and potentially z
  • BRDF (Factor 3)
  • Temporal: the time (of day, season, year) at which the image was acquired

12
Biophysical Variables
  • These variables WILL affect RS data, but not necessarily in a repeatable or useful way, because other state variables are present that also affect the RS data.
  • Repeatability limitations. Example - liquid water content in cotton: changes in LAI, leaf orientation, background soil properties, and atmospheric effects will make an empirically determined relationship between liquid water content and RS data extracted from one scene difficult to apply to another scene without controlling for those other RT state variables.
  • Usefulness limitations. Example - LAI: we know LAI affects RS data, but we cannot reliably estimate high LAIs using current analysis technology and techniques.

13
Continuous Relationships
  • Question: How much of (some variable of interest) is present in a pixel?
  • Methods:
  • Collect field data on the variable of interest
  • Determine an empirical relationship between the RS data and the field data
  • Relationship determination can use an extremely wide range of methods, from regression to neural networks to complex model formulations, etc.
  • Invert the relationship on the entire RS scene

14
Case Study Cotton Water
  • Question: what is the canopy water content of a pixel of cotton?
  • Methods:
  • Collected leaf water potential (LWP) on cotton leaves and the GPS coordinates of those leaves.
  • Determined the continuum removal (CR) of the water absorption features at 975 nm and 1150 nm and regressed this against the LWP data for the appropriate pixels.
  • The regression gives me a model (f) of LWP = f(CR), so I can apply the model to an entire AVIRIS scene, and each pixel will be the estimated LWP.

15
Biophysical Variable
16
Field vs. RS Relationship
  • Found a relationship (albeit tenuous) between the
    field measurements and the RS measurements.
  • The deeper the absorption feature, the higher the
    LWP.
  • We generate an equation of the line that fits the data, which can be inverted on the image data to produce LWP from a given CR value (a sketch of this fit-and-invert step follows below).

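A minimal numpy sketch of this fit-and-invert workflow is below. The arrays and values are hypothetical stand-ins for the field LWP measurements, the CR band depths extracted at the sampled pixels, and a CR image derived from the AVIRIS scene; the actual analysis may have used a different regression form.

  import numpy as np

  # Hypothetical field measurements: leaf water potential (LWP) and the
  # continuum-removed (CR) absorption-feature depths extracted from the
  # image pixels at the sampled GPS locations.
  lwp_field = np.array([-0.8, -1.1, -1.4, -1.7, -2.0])
  cr_field = np.array([0.21, 0.18, 0.15, 0.12, 0.10])

  # Fit the empirical model LWP = f(CR) as a straight line: LWP = a*CR + b.
  a, b = np.polyfit(cr_field, lwp_field, deg=1)

  # "Invert" the relationship on the whole scene: apply f to every pixel's
  # CR value to produce an estimated-LWP map (random stand-in image here).
  cr_scene = np.random.uniform(0.05, 0.25, size=(500, 500))
  lwp_map = a * cr_scene + b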
17
Cotton field LWP
  • Cooler colors indicate higher LWP, hotter colors
    indicate lower LWP.
  • Notice the variation in the cotton field. A
    farmer might want to water the center of the
    field more than the top and bottom.

18
Limitations
  • Can I apply these results to a different species?
  • Can I apply these results to cotton at different
    ages?
  • Can I apply these results to cotton at different
    times of the day?

19
Unmixing
  • Question: what media are present in a pixel, and how much of the pixel is composed of a given medium?

20
Pure vs. Mixed Pixels
  • In this class, so far, we have mainly dealt with pure pixels (i.e. pixels in which there is only one type of material).
  • When do you find pure pixels?
  • When the spatial extent of the material is larger than the size of the pixel. Examples:
  • Large clouds and 1 km GOES pixels
  • Mineral deposits and 20 m AVIRIS pixels
  • Leaves and an integrating sphere spectrometer

21
Pure vs. Mixed Pixels
  • Types of mixtures (from Geology lecture)
  • Areal
  • Intimate
  • Coating
  • Molecular
  • Mixed pixels typically refer to areal or intimate
    mixtures

22
This mixed pixel contains ...
[Figure: one square pixel containing bare soil, a tree, a river, tree shadow, and grass]
Grass
23
Unmixing Pixels
  • We want to determine the fraction of each
    endmember in a potentially mixed pixel.
  • Endmember: a pure reflectance spectrum of a pixel component, measured in the lab, in the field, or from the image itself.
  • Examples of commonly used endmembers: green vegetation, soil, shadow, water, clouds, non-photosynthetic vegetation (NPV: wood, decayed leaves, etc.)

24
Linear Spectral Unmixing
  • Basic assumption: the reflectance of a pixel is a linear combination of the endmember spectra weighted by their relative cover fractions.
  • Two parts to the algorithm:
  • (1) sum over i of F_i = 1
  • (2) DN_λ = sum over i of (F_i × DN_λ,i) + E_λ
  • F_i = fraction of endmember i in the pixel (usually 0 ≤ F_i ≤ 1)
  • DN_λ = the pixel reflectance for band λ
  • DN_λ,i = the reflectance of endmember i for band λ
  • E_λ = error term

25
Linear Spectral Unmixing, LSU
  • For each spectral band, there is a different version of equation (2).
  • If the number of bands + 1 is equal to the number of endmembers, we can solve the set of equations without an error term.
  • If the number of bands + 1 is greater than the number of endmembers, we can solve the set of equations and generate an error term.
  • This set of equations does not have a unique solution if there are more endmembers than bands.
  • Since DN_λ is known (from the image) and the DN_λ,i are known (from lab, field, or image spectra), we can determine F_i and E_λ (if i < (B + 1))! A least-squares sketch of this solve follows below.

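A minimal numpy sketch of solving this set of equations for one pixel by least squares is below. The endmember matrix, band count, and reflectance values are hypothetical, the sum-to-one constraint of equation (1) is appended as an extra row, and this simple solve does not force the fractions into the 0-1 range.

  import numpy as np

  # Hypothetical endmember spectra: one column per endmember (vegetation,
  # soil, shadow), one row per band (4 bands here).
  E = np.array([[0.05, 0.25, 0.02],
                [0.45, 0.30, 0.03],
                [0.30, 0.35, 0.02],
                [0.15, 0.28, 0.01]])

  pixel = np.array([0.20, 0.30, 0.25, 0.18])  # observed reflectance, 4 bands

  # Append the constraint sum(F_i) = 1 as one extra equation; this gives
  # the "number of bands + 1" equations referred to on the slide.
  A = np.vstack([E, np.ones(E.shape[1])])
  b = np.append(pixel, 1.0)

  # Least-squares solution: endmember fractions F_i; the residual of each
  # equation plays the role of the error term E_lambda.
  F, _, _, _ = np.linalg.lstsq(A, b, rcond=None)
  error = b - A @ F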
26
Linear Spectral Unmixing, Results
[Figure panels: Shadow, Soil, and Vegetation fraction images]
Greenberg, unpublished. Endmember fractions of vegetation, shadow, and soil. Shadow is related to the structure of the pixel: more heterogeneous canopies yield greater shadow. Nearly all human-affected pixels (regardless of type!) will have LOW shadow. Old forests will have HIGH shadow.
27
LSU Shortcomings
  • Because of multiple scattering, BRDF factors, and other issues, pixels are rarely composed of linear mixtures of their individual components. These are mainly 3-D structural factors.
  • The higher the vertical complexity in a pixel, the less likely the fractions are to represent cover. Vegetation cover is often overestimated in LSU.

28
Classification
  • Classification is one of the most widely used
    analysis techniques in RS.
  • Spectral space <-> information space
  • Good classification often relies on a good
    understanding of the RT state variables present
    and how they affect a class.
  • If two classes are identical in spectral space,
    then classification accuracy will be low.

29
Classification
  • Three types of classification:
  • Supervised
  • Requires training pixels: pixels for which both the spectral values and the class are known.
  • Unsupervised
  • No extraneous data are used; classes are determined purely on differences in spectral values.
  • Hybrid
  • Uses unsupervised and supervised classification together.
  • Useful fact: we aren't limited to using only raw DNs, radiance, or reflectance in our classifier. We can use ratio or difference indices, LSU fractions, spatial data (distance from some target), or any other data transformation we might think appropriate in the classifier.

30
Supervised Classification
  • Steps
  • Decide on classes.
  • Choose training pixels which represent these
    classes.
  • Use the training data to train the classifier.
  • Then classify each pixel in the image using the
    trained classifier.
  • The result? Each pixel is labeled as belonging to one of the classes - or to 'other'.

31
Many types of Classifiers
  • A short list of examples (We will cover some of
    these in more detail next quarter).
  • Table look up
  • Parallelepiped
  • Minimum distance
  • Maximum likelihood
  • Layer
  • Spatial

32
Table Look Up Classifier
How it works ...
  • For each class, a table of band DNs is produced with their corresponding classes.
  • For each image pixel, the image DNs are matched against the table to generate the class.
  • If the combination of band DNs is not found, the class cannot be determined.
  • Benefits: conceptually easy and computationally fast.
  • Drawbacks: relatively useless unless every potential combination of band DNs and its class is known. (A dictionary-based sketch follows below.)

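A minimal Python sketch of the table look-up idea is below; the DN tuples and class names are hypothetical, and a real table would be built from the training pixels.

  # Hypothetical look-up table: exact band-DN combinations with known classes.
  lookup = {
      (34, 120, 87): "water",
      (80, 95, 60): "soil",
      (45, 160, 55): "vegetation",
  }

  def table_lookup_classify(pixel_dns):
      # Match the pixel's DN combination against the table; if the exact
      # combination was never seen, the class cannot be determined.
      return lookup.get(tuple(int(v) for v in pixel_dns), "unclassified")

  print(table_lookup_classify([34, 120, 87]))  # -> "water"
  print(table_lookup_classify([33, 121, 87]))  # -> "unclassified"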
33
Table Look Up Classifier...
How it works ...
34
Parallelepiped Classifier
How it works ...
  • The minimum and maximum DNs for each class are determined and are used as thresholds for classifying the image.
  • Benefits: simple to train and use, computationally fast.
  • Drawbacks: pixels in the gaps between the parallelepipeds cannot be classified; pixels in the region of overlapping parallelepipeds cannot be classified. (See the sketch below.)

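A minimal numpy sketch of a parallelepiped classifier is below; the per-class minimum/maximum DNs and band count are hypothetical placeholders for values derived from training pixels.

  import numpy as np

  # Hypothetical per-class (min, max) DN thresholds for three bands.
  boxes = {
      "water": (np.array([20, 100, 70]), np.array([40, 140, 95])),
      "vegetation": (np.array([35, 140, 40]), np.array([60, 180, 70])),
  }

  def parallelepiped_classify(pixel):
      pixel = np.asarray(pixel)
      # A pixel is assigned only if it falls inside exactly one box;
      # pixels in gaps or in overlapping boxes remain unclassified.
      hits = [name for name, (lo, hi) in boxes.items()
              if np.all(pixel >= lo) and np.all(pixel <= hi)]
      return hits[0] if len(hits) == 1 else "unclassified"

  print(parallelepiped_classify([30, 120, 80]))  # inside the "water" box
  print(parallelepiped_classify([10, 50, 20]))   # in a gap -> "unclassified"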
35
Parallelepiped Classifier
How it works ...
36
Minimum Distance Classifier
How it works ...
  • A centroid for each class is determined from the training data by calculating the mean value, by band, for each class. For each image pixel, the n-dimensional distance to each of these centroids is calculated, and the closest centroid determines the class. (See the sketch below.)
  • Benefits: mathematically simple and computationally efficient.
  • Drawback: insensitive to different degrees of variance in the spectral response data.

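A minimal numpy sketch of a minimum distance classifier is below; the class centroids are hypothetical stand-ins for per-band training means.

  import numpy as np

  # Hypothetical class centroids: per-band means of the training pixels.
  centroids = {
      "water": np.array([30.0, 120.0, 80.0]),
      "soil": np.array([85.0, 95.0, 60.0]),
      "vegetation": np.array([45.0, 160.0, 55.0]),
  }

  def minimum_distance_classify(pixel):
      # Assign the pixel to the class whose centroid is nearest in
      # n-dimensional (Euclidean) spectral space.
      pixel = np.asarray(pixel, dtype=float)
      return min(centroids, key=lambda c: np.linalg.norm(pixel - centroids[c]))

  print(minimum_distance_classify([40, 150, 58]))  # -> "vegetation"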
37
Maximum Likelihood Classifier
How it works ...
  • Maximum likelihood uses the variance and covariance in the class spectra to determine the classification scheme.
  • It often, but not always, assumes that the
    spectral responses for a given class are normally
    distributed.

38
Maximum Likelihood Classifier
How it works ...
  • We can then determine the probability that a given DN is a member of each class. The pixel is classified using the most likely class, or 'Other' if the probability isn't over some threshold.
  • Benefits: takes variation in spectral response into consideration.
  • Drawbacks: computationally intensive; multimodal or non-normally distributed classes require extra care when training the classifier if high accuracy is to be achieved. (See the sketch below.)

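A minimal sketch of a maximum likelihood classifier under the normal-distribution assumption is below, using scipy's multivariate normal density; the class means, covariance matrices, and threshold are hypothetical values standing in for statistics estimated from training pixels.

  import numpy as np
  from scipy.stats import multivariate_normal

  # Hypothetical per-class mean vectors and covariance matrices (2 bands).
  classes = {
      "water": (np.array([30.0, 120.0]), np.array([[9.0, 2.0], [2.0, 16.0]])),
      "vegetation": (np.array([45.0, 160.0]), np.array([[25.0, 5.0], [5.0, 36.0]])),
  }

  def max_likelihood_classify(pixel, threshold=1e-6):
      pixel = np.asarray(pixel, dtype=float)
      # Evaluate the normal density of the pixel under each class model.
      densities = {name: multivariate_normal(mean, cov).pdf(pixel)
                   for name, (mean, cov) in classes.items()}
      best = max(densities, key=densities.get)
      # Label as "Other" when even the most likely class is implausible.
      return best if densities[best] > threshold else "Other"

  print(max_likelihood_classify([44, 158]))  # -> "vegetation"
  print(max_likelihood_classify([200, 10]))  # -> "Other"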
39
Study question: use each of the first four classifiers to establish decision boundaries for each class shown. Case 1: separate means, equal variance.
[Scatter plot: spectral band 1 (for example, red, 0.6 to 0.7 µm) vs. spectral band 2 (NIR, 0.8 to 0.9 µm)]
40
Study question: use each of the first four classifiers to establish decision boundaries for each class shown. Case 2: equal means, unequal variances.
[Scatter plot: spectral band 1 (for example, red, 0.6 to 0.7 µm) vs. spectral band 2 (NIR, 0.8 to 0.9 µm)]
41
Study question: use each of the first four classifiers to establish decision boundaries for each class shown. Case 3: bimodal distribution.
[Scatter plot: spectral band 1 (for example, red, 0.6 to 0.7 µm) vs. spectral band 2 (NIR, 0.8 to 0.9 µm)]
42
Study question: use each of the first four classifiers to establish decision boundaries for each class shown. Case 4: separate means, highly correlated data.
[Scatter plot: spectral band 1 (for example, red, 0.6 to 0.7 µm) vs. spectral band 2 (NIR, 0.8 to 0.9 µm)]
43
Study question: use each of the first four classifiers to establish decision boundaries for each class shown. Case 5: separate means, uncorrelated data.
[Scatter plot: spectral band 1 (for example, red, 0.6 to 0.7 µm) vs. spectral band 2 (NIR, 0.8 to 0.9 µm)]