Terrestrial Ecology and Land Cover/Change Science Review: Basis of Successful Land Product Validation

Transcript and Presenter's Notes

1
Terrestrial Ecology and Land Cover/Change Science
Review: Basis of Successful Land Product
Validation
  • Jeffrey L. Privette
  • NASA's Goddard Space Flight Center
  • Warren Cohen, John Dwyer, S. Tom Gower, Simon
    Hook, Chris Justice, Alfredo Huete, Jeff
    Morisette, Ranga Myneni, Dick Olson, Steve
    Running, David Starr

2
Challenge
  • One of the five primary objectives of the Earth
    Science Enterprise: identify and measure the
    primary causes of change
  • Impossible to accomplish without knowledge of
    measurement uncertainties
  • No precedent for global validation of land
    products

3
What We Knew Going In
  • Excellent knowledge of point issues
  • Limited knowledge of site-scale (25 km²) issues
  • spatial and temporal sampling
  • scaling of point measurements
  • error budgeting (a scaling sketch follows below)
  • Poor knowledge of global sampling issues
  • except that global sampling is inherently
    cost-prohibitive without strategic planning
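
The point-to-site scaling and error-budgeting issues above can be made concrete with a small sketch. This is an illustrative Python example under assumed numbers (the LAI values and the instrument uncertainty are invented, not taken from any EOS campaign):

```python
# Minimal sketch of point-to-site scaling with a simple error budget:
# point measurements are averaged to a site value, and the sampling and
# instrument uncertainties are combined in quadrature.
import math

# Hypothetical point LAI measurements within one ~25 km^2 site
points = [3.1, 2.8, 3.4, 2.9, 3.6, 3.0, 2.7, 3.3]

n = len(points)
site_mean = sum(points) / n
variance = sum((x - site_mean) ** 2 for x in points) / (n - 1)
sampling_se = math.sqrt(variance / n)   # spatial-sampling uncertainty
instrument_se = 0.2                     # assumed instrument uncertainty (LAI units)
total_se = math.sqrt(sampling_se ** 2 + instrument_se ** 2)

print(f"site LAI = {site_mean:.2f} +/- {total_se:.2f} (1-sigma)")
```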

4
EOS Response
  • EOS Validation Program Office/D. Starr
  • Guidance from Instrument Teams
  • Priorities (e.g., low-level products first)
  • Approaches
  • Community resources
  • 1997 NRA for EOS Validation Investigators
  • Shared responsibility with product developers

5
Strategic Plan 1: Focus Sites
  • EOS Land Validation Core Sites
  • Facilitate resource sharing and synergies in
    validating multiple sensors and products
  • Built on existing stations, resources, experts
  • FLUXNET, AERONET, LTER, etc.
  • Long-term duration
  • Stratified by the 6 biomes of Myneni/Running
  • Jointly nominated by Instrument and (newly
    selected) Validation PIs

6
EOS Land Validation Core Sites
7
Strategic Plan 2: Simple, Fast Data Provision
  • Solutions
  • Data stockpiles for EOS Core Sites
  • Unique online FTP directories at EROS Data Center
  • 200 km × 200 km MODLAND product subsets
  • Landsat 7 ETM+ scenes (coordinated by MODLAND)
  • SeaWiFS, ASTER, UMD/CRESS, others joined
  • Centralized field data registry
  • WWW metadata catalog and data linking at ORNL
  • Field investigators retain upkeep and control of
    native data
  • Accommodates the 6-month public release policy
    for EOS Validation Investigators

8
EOS Land Product Validation FTP Server
  • Accessible from the Land Validation Web Site
    or ORNL DAAC Mercury

Land Validation Web Site: http://modis-land.gsfc.nasa.gov/val/
ORNL DAAC Mercury: http://mercury.ornl.gov/

(Figure: FTP directory tree organized by year (/1999, /2000,
/2001) and /site name, with per-product subdirectories:
/MODIS (MOD09A1, MOD13A2, MOD43B3), /Landsat7 (Level-0R,
Level-1G), /ASTER (AST_L1B), /Geocover, /GLCTS. A retrieval
sketch follows.)
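
As a rough illustration of how a validation investigator might pull files from such a layout, here is a hedged Python sketch; the host name and the site directory ("konza") are placeholders, not live addresses (the actual entry points are the URLs above):

```python
# Hypothetical sketch of pulling a MODIS subset from the directory
# layout described above (/year/site name/product).
from ftplib import FTP

HOST = "ftp.example.gov"        # placeholder for the EROS Data Center server
PATH = "/2000/konza/MODIS"      # year / site name / product directory

ftp = FTP(HOST)
ftp.login()                     # anonymous login
ftp.cwd(PATH)
for name in ftp.nlst():         # e.g., MOD09A1, MOD13A2, MOD43B3 files
    with open(name, "wb") as out:
        ftp.retrbinary(f"RETR {name}", out.write)
ftp.quit()
```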
9
Mercury Searching
1. Fielded Search (using picklists for Project,
   Site PI, Parameter, etc.)
2. Spatial Search (use cursor to define area of
   interest, or enter specific coordinates)
3. Temporal Search
(A toy version of these three filters is sketched below.)
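
To make the three search modes concrete, here is a toy Python sketch over an invented record format; Mercury's real schema and API are not shown on the slide, so the field names and example records here are assumptions for illustration only:

```python
# Toy metadata catalog with the three Mercury-style search modes:
# fielded, spatial (bounding box), and temporal.
from datetime import date

records = [
    {"project": "BigFoot", "site_pi": "Gower", "parameter": "LAI",
     "lat": 39.1, "lon": -96.6, "start": date(2000, 6, 1)},
    {"project": "FLUXNET", "site_pi": "Running", "parameter": "GPP",
     "lat": 45.2, "lon": -68.7, "start": date(2001, 7, 15)},
]

def fielded(recs, **criteria):
    # Keep records whose fields match every picklist criterion
    return [r for r in recs if all(r.get(k) == v for k, v in criteria.items())]

def spatial(recs, lat_min, lat_max, lon_min, lon_max):
    # Keep records inside the bounding box (area of interest)
    return [r for r in recs
            if lat_min <= r["lat"] <= lat_max and lon_min <= r["lon"] <= lon_max]

def temporal(recs, start, end):
    # Keep records whose start date falls in the requested window
    return [r for r in recs if start <= r["start"] <= end]

hits = temporal(spatial(fielded(records, parameter="LAI"),
                        35, 45, -100, -90),
                date(2000, 1, 1), date(2000, 12, 31))
print(hits)
```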
10
Strategic Plan 3: International Buy-in
  • Committee on Earth Observing Satellites (CEOS)
  • Composed of the world's space agencies
  • Coordinates civilian observations from space
  • Working Group for Calibration and Validation
    (WGCV)
  • Facilitates and encourages standards,
    efficiencies, best practices
  • Four Subgroups
  • Synthetic Aperture Radar
  • Microwave Sensors
  • Infrared and Visible Optical Sensors (IVOS)
  • Terrain Mapping
  • No mechanism for coordinating or advancing
    validation of higher-level land products

Calibration emphasis
11
Strategic Plan 3: International Buy-in
  • Committee on Earth Observing Satellites (CEOS)
  • Composed of the world's space agencies
  • Coordinates civilian observations from space
  • Working Group for Calibration and Validation
    (WGCV)
  • Facilitates and encourages standards,
    efficiencies, best practices
  • Five Subgroups
  • Synthetic Aperture Radar
  • Microwave Sensors
  • Infrared and Visible Optical Sensors (IVOS)
  • Terrain Mapping
  • Land Product Validation

Calibration emphasis
12
WGCV Subgroup on Land Product Validation
  • Chartered in 2000 with strong NASA advocacy
  • Founding focus: support of Global Observations
    of Forest Cover (GOFC/GOLD) needs (fire, land
    cover, and biophysical products)
  • Multinational endorsement and participation
  • particularly from Europe and Japan (ENVISAT,
    ADEOS-II/GLI/POLDER-2)

13
Case Study I: ASTER/Hook
  • MODIS and ASTER thermal infrared systems and
    products
  • Approach
  • Verification of sensor calibration at simple
    sites
  • Determination of lower- to higher-level product
    uncertainties at realistic sites
  • Data from ground, aircraft, other sensors, and
    lunar looks
  • Sites: 6 in the U.S., 3 in Australia (a
    ground-vs.-satellite comparison is sketched below)
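
A hedged sketch of the core computation in this kind of thermal-infrared comparison: matched ground and satellite temperature pairs reduced to bias and RMSE. The values are invented for illustration, not ASTER or MODIS results:

```python
# Compare in-situ surface temperatures against the satellite retrieval
# for matched pixels, summarizing the agreement as bias and RMSE.
import math

ground_K = [291.4, 302.8, 288.1, 310.5, 295.0]   # in-situ radiometer (K)
sensor_K = [292.0, 302.1, 288.9, 311.6, 294.2]   # satellite retrieval (K)

diffs = [s - g for s, g in zip(sensor_K, ground_K)]
bias = sum(diffs) / len(diffs)
rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
print(f"bias = {bias:+.2f} K, RMSE = {rmse:.2f} K")
```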

14
(No Transcript)
15
ASTER Spectral Calibration Accuracy with Time
16
Case Study II: BigFoot/FLUXNET
17
BigFoot Site-based Sampling and Modeling
Methodology
18
BigFoot Site-based Sampling and Modeling
Methodology
  • Parameters vs. data sources:

    Parameter    Sources
    Land cover   MODIS, field measure
    LAI          MODIS, field measure
    FPAR         MODIS, field measure
    NPP          MODIS, field measure
    GPP          MODIS, modeled, tower/eddy flux
    NEP          MODIS, modeled, tower/eddy flux
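
The same matrix, restated as a small Python lookup table (a transcription of the slide's table for illustration, not BigFoot code), which makes explicit the independent references each MODIS product is compared against:

```python
# BigFoot parameter/source matrix from the slide above.
SOURCES = {
    "Land cover": ["MODIS", "field measure"],
    "LAI":        ["MODIS", "field measure"],
    "FPAR":       ["MODIS", "field measure"],
    "NPP":        ["MODIS", "field measure"],
    "GPP":        ["MODIS", "modeled", "tower/eddy flux"],
    "NEP":        ["MODIS", "modeled", "tower/eddy flux"],
}

# List the non-MODIS references available for each parameter
for param, srcs in SOURCES.items():
    refs = [s for s in srcs if s != "MODIS"]
    print(f"{param}: compare MODIS against {', '.join(refs)}")
```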

19
(No Transcript)
20
Example Comparing Landsat and MODIS LAI Products
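
A comparison like this hinges on matching scales first. Here is a hedged numpy sketch of that step under toy assumptions: a synthetic 30 m "Landsat" LAI field, a 33-pixel block standing in for a ~1 km MODIS cell, and a noisy stand-in for the MODIS product (no real data are used):

```python
# Block-average fine-resolution LAI to the coarse grid, then compare
# the coarse product against the aggregated reference.
import numpy as np

np.random.seed(0)
landsat_lai = np.random.uniform(0.5, 5.0, (330, 330))   # toy ~30 m grid
block = 33                                              # 33 x 30 m ~ 1 km

# Aggregate 33x33 blocks of the fine grid to a 10x10 coarse grid
coarse = landsat_lai.reshape(10, block, 10, block).mean(axis=(1, 3))

modis_lai = coarse + np.random.normal(0, 0.3, coarse.shape)  # stand-in product
bias = float(np.mean(modis_lai - coarse))
rmse = float(np.sqrt(np.mean((modis_lai - coarse) ** 2)))
print(f"bias = {bias:+.2f}, RMSE = {rmse:.2f}")
```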
21
Generalizing BigFoot's Approach Over FLUXNET
(Figure: validation of the MODIS products MOD17 8-day GPP
and MOD17 annual NPP across FLUXNET sites; based on
Nemani, 2/28/00.)
22
Online Access to FLUXNET Data
FLUXNET Validation Web Site: http://daac.ornl.gov/FLUXNET/data.html
23
Case Study III: LAI Network/Myneni
  • Joint activity of CEOS LPV and MODLAND
  • MODIS LAI and FPAR
  • Open-invitation workshops from 1998 to 2000.
    Findings:
  • Several foreign groups sampling LAI, some
    addressing scaling
  • Strong interest in cross-comparisons, technical
    exchange and trouble-shooting
  • Regional networks could be developed for MODLAND
    LAI validation needs

24
(No Transcript)
25
LAI Product Evaluation
  • 12-18 month pathfinder
  • Leverage existing 2000-2001 field campaigns
  • Workshop: Rome, June 2001
  • MODLAND coordinating Landsat ETM+ acquisitions
    and MODLAND LAI subsets

26
LAI Product Evaluation Participating Sites
27
Validation What Works?
  • Dedicated science team liaisons for validation
  • Simplified data access (dedicated archives)
  • Strong communication between Product PIs and
    Validation PIs
  • Algorithm or product familiarity
  • Strong attention to error budgeting
  • Satellite constellations
  • International partnerships and coordination
  • Networks, networks, networks
  • Duration (mission succession), consistency

28
Validation To Be Resolved
  • Global Representativeness
  • How many points in time, space and biome?
  • Scaling
  • Infancy for many products
  • Communication of Results
  • Formal venues lacking
  • User Input
  • Better knowledge means greater cost
  • How much is enough?
  • Uncertainty Embedding
  • Product data layer with pixel-by-pixel
    uncertainties (a minimal sketch follows)
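
What uncertainty embedding could look like in practice, as a minimal sketch (the array values and the 20% tolerance are invented for illustration):

```python
# Carry a per-pixel uncertainty layer alongside the product layer itself,
# so downstream users can weight or mask pixels by their quality.
import numpy as np

lai = np.array([[2.1, 3.4], [0.8, 4.2]])        # product layer
sigma = np.array([[0.3, 0.5], [0.2, 0.9]])      # per-pixel 1-sigma layer

product = {"LAI": lai, "LAI_sigma": sigma}      # shipped together

# Mask pixels whose relative uncertainty exceeds a tolerance instead of
# treating all pixels as equally good
rel = product["LAI_sigma"] / product["LAI"]
usable = product["LAI"][rel <= 0.2]
print(usable)
```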

29
Under-represented Climate Combinations for
North America and Asia
30
Concerns and Recommendation
  • EOS underscored two relevant lessons
  • Sensor characteristics change throughout their
    lifetimes, including prelaunch
  • Algorithms change throughout their lifetimes
  • Reprocessing happens.
  • Throughout the NPP/NPOESS era (present to 2020),
    NASA is purchasing products from industry
  • Is NASA ready to conduct performance
    verification?
  • Could NASA credibly challenge a vendor on an
    underperforming high-level product?
  • Recommendation: a permanent NASA Office on
    Calibration and Validation

31
Status of ASTER Data Products