Transcript and Presenter's Notes

Title: HDF, EOSDIS, NASA ESE Data Standards


1
HDF, EOSDIS, NASA ESE Data Standards
  • Richard Ullman

2
Agenda
  • ESDIS Status wrt HDF
  • EOSDIS (American Customer Satisfaction Index)
  • NASA Earth Science Standards Endorsement Process

3
ESDIS Status
  • Launch of Aura (July 25) marks end of development
    phase of the EOSDIS Core System (ECS).
  • System is now in maintenance. Capability
    refinements are under the Synergy program.
  • Data centers are now running the Synergy 3
    release. They will be transitioning to Synergy 4
    over the next six months.
  • Maintenance of HDF for EOS includes two
    components:
  • Support of NCSA's HDF group through a
    cooperative agreement.
  • Support of HDF-EOS through the ECS maintenance
    contract.
  • Other ESDIS project-sponsored HDF-related work
    will be phased out near the end of calendar year
    2004.
  • http://hdfeos.gsfc.nasa.gov website updates
  • SESDA HDF data usability task
  • Coordination, outreach and test bed development
    for HDF integration through CEOS, OGC, ISO
    organizations.

4
HDF-EOS
  • A profile, convention, convenience API, etc., for
    NASA's Earth Observing System standard data
    products.
  • Defines structures for Point, Swath, Grid
    (Atmospheric Profile, Zonal Table); see the
    sketch after this list.
  • Defines a specific location for product metadata.
  • ODL-encoded metadata compliant with FGDC content
    standards.
  • Maintained by L3-Communications under
    subcontract to Raytheon's ECS Maintenance and
    Development contract.
  • Next release expected Dec. 2004
  • HDF5-1.6.3
  • SZIP 1.2
  • New inquiry functions
  • CEA (Cylindrical Equal Area) grid projection
  • Improved performance in read/write functions
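A rough illustration of the convenience API (not taken from the
presentation): the sketch below uses the HDF-EOS2 C library to define
a minimal Swath with shared dimensions, geolocation fields, and a
data field. The file name, swath name, dimensions, and field names
are invented for the example.

    /* Minimal HDF-EOS2 Swath definition sketch (illustrative only). */
    #include <stdio.h>
    #include "mfhdf.h"      /* DFACC_CREATE, DFNT_* number types      */
    #include "HdfEosDef.h"  /* SWopen, SWcreate, SWdefdim, SWdef*field */

    int main(void)
    {
        int32 fid  = SWopen("demo.hdf", DFACC_CREATE);  /* new HDF file */
        int32 swid = SWcreate(fid, "DemoSwath");        /* new swath    */

        /* Dimensions shared by geolocation and data fields */
        SWdefdim(swid, "Track", 200);
        SWdefdim(swid, "XTrack", 100);

        /* Geolocation fields tie each measurement to the Earth */
        SWdefgeofield(swid, "Latitude",  "Track,XTrack",
                      DFNT_FLOAT32, HDFE_NOMERGE);
        SWdefgeofield(swid, "Longitude", "Track,XTrack",
                      DFNT_FLOAT32, HDFE_NOMERGE);

        /* A science data field on the same dimensions */
        SWdefdatafield(swid, "Radiance", "Track,XTrack",
                       DFNT_FLOAT32, HDFE_NOMERGE);

        SWdetach(swid);   /* writes the swath structure metadata */
        SWclose(fid);
        printf("wrote demo.hdf\n");
        return 0;
    }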

5
HDF in NASA Earth Remote Sensing
  • HDF-EOS is the format for EOS Standard Products:
  • Landsat 7 (ETM+)
  • Terra (CERES, MISR, MODIS, ASTER, MOPITT)
  • Meteor-3M (SAGE III)
  • Aqua (AIRS, AMSU-A, AMSR-E, CERES, MODIS)
  • Aura (MLS, TES, HIRDLS, OMI)
  • HDF is used by other EOS missions:
  • OrbView-2 (SeaWiFS)
  • TRMM (CERES, VIRS, TMI, PR)
  • QuikSCAT (SeaWinds)
  • EO-1 (Hyperion, ALI)
  • ICESat (GLAS)
  • CALIPSO
  • Over 3 petabytes of EOSDIS archived data

6
HDF-EOS Lessons
  • Definition of a set of data structures as a
    profile is not sufficient to guarantee
    interoperability.
  • Also need definition of content, especially
    metadata - this is increasingly difficult the
    wider the disciplines covered.
  • See the Aura DSWG standards and NetCDF CF
    conventions as examples (a small CF sketch
    follows this list).
  • Also need conformance measures - no spec is so
    clear that it cannot be misinterpreted.
  • Even during life of mission, there must be
    allowance for technology refresh.
  • Technology advances affect user expectations.
  • Well understood concept for hardware -
    traditionally less recognized for science
    software and data products.
  • See OAIS
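For readers unfamiliar with content conventions such as NetCDF CF,
the sketch below (added for illustration, not from the presentation)
shows the kind of metadata CF prescribes: each variable carries units
and a standard_name, and the file declares the convention it follows.
The file name, variable, and values are invented; only the netCDF C
API calls and the CF attribute names are standard.

    /* Minimal CF-style metadata sketch using the netCDF C API. */
    #include <stdio.h>
    #include <string.h>
    #include <netcdf.h>

    int main(void)
    {
        int ncid, dimid, varid;
        nc_create("sst_demo.nc", NC_CLOBBER, &ncid);
        nc_def_dim(ncid, "time", 1, &dimid);
        nc_def_var(ncid, "sst", NC_FLOAT, 1, &dimid, &varid);

        /* CF content conventions: units and a controlled standard_name */
        nc_put_att_text(ncid, varid, "units", strlen("K"), "K");
        nc_put_att_text(ncid, varid, "standard_name",
                        strlen("sea_surface_temperature"),
                        "sea_surface_temperature");

        /* File-level declaration of the convention in use */
        nc_put_att_text(ncid, NC_GLOBAL, "Conventions",
                        strlen("CF-1.0"), "CF-1.0");

        nc_enddef(ncid);
        nc_close(ncid);
        printf("wrote sst_demo.nc\n");
        return 0;
    }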

7
Discussion topics today
  • Ask the experts
  • A growing number of software products depend upon
    the HDF libraries. Are there suggestions for how
    to better coordinate HDF library releases?
  • Questions from participants.
  • HDF-GEO?
  • At the last workshop, strong opinion was expressed
    that there should be some kind of bridge among
    HDF geographic and geophysical profiles.
  • Can we develop a better sense of what such an
    HDF-GEO might be?
  • Is this the list? HDF-EOS, NetCDF API, HDF-NPOESS
  • What are reasonable expectations for this effort?

8
From ESDSWG meeting last week: Why Use a
Standard?
  • Good documentation
  • Other projects have reviewed it and found it
    useful
  • Reusable software sometimes available
  • Potential users can see that the standard and
    software work
  • Not management pressure or peer pressure, just
    more practical

9
2004 EOSDIS Satisfaction Survey
10
2004 EOSDIS Satisfaction Survey
  • A measure of customer satisfaction
  • ESISS and ESSAAC have recommended that NASA focus
    on measuring the impact of our systems and
    services rather than just the output
  • In 2004, NASA used a comprehensive survey to
    determine the American Customer Satisfaction
    Index (ACSI) for EOSDIS products and services.
  • ACSI provides a normalized measure of customer
    satisfaction that allows benchmarking against
    similar companies and industries.
  • 2004 survey results show that customer
    satisfaction with EOSDIS compares very favorably
    with both industry and other government agencies.

11
Snapshot of the American Customer Satisfaction
Index (ACSI)
  • The #1 national indicator of customer
    satisfaction today
  • Compiled by the National Quality Research
    Institute at the University of Michigan using
    methodology licensed from the Claes Fornell
    International (CFI) Group
  • Measures 40 industries and 200 organizations
    covering 75% of the U.S. economy
  • Over 70 U.S. Federal Government agencies have
    used ACSI to measure more than 120
    programs/services
  • CFI's advanced methodology quantifiably measures
    and links satisfaction levels to performance and
    prioritizes actions for improvement

12
Survey Background
  • EOSDIS survey was performed by CFI Group through
    a contract with the Federal Consulting Group
    (Department of Treasury).
  • Survey questions developed by the DAAC User
    Services Working Group were tailored to fit the
    CFI methodology
  • ESDIS provided the CFI Group with 33,251 email
    addresses from users who had used NASA/EOSDIS
    products
  • CFI sent invitations to participate in an online
    survey to 9,999 randomly selected users
  • 1,056 responses were completed
  • 1,016 surveys were used in the analysis (250
    responses were needed for a statistically
    meaningful result).

13
EOSDIS Results
  • The Customer Satisfaction Index for NASA EOSDIS
    is 75 (NASA EOSDIS aggregate segment).
  • The Customer Satisfaction Index score is derived
    from customer responses to three questions in the
    survey:
  • How satisfied are you overall with the products
    and services provided by the Data Center (79)?
  • To what extent have the data, products and
    services provided by the Data Center fallen short
    of or exceeded your expectations (73)?
  • How well does the Data Center compare with an
    ideal provider of scientific data, products and
    services (71)?
  • This score is four points higher than the 2003
    American Customer Satisfaction Index for the
    Federal Government overall (71).
  • The confidence interval for the ACSI is +/-1.1 for
    the aggregate at the 95% confidence level.

14
Score Comparison: Current Location
                                  USA (n=478)   Outside the USA (n=577)
  ACSI                            74            76
  Customer Support                88            82
  Delivery                        85            83
  Product Selection and Order     72            73
  Product Search                  69            71
  Product Quality                 67            69
  Complaints                      34            31
15
Customer Support - Score 84, Impact 1.0
CFI considers EOSDIS to be World Class in the
area of customer support.
  Customer Support (overall)                          84
  Professionalism                                     87
  Technical knowledge                                 85
  Accuracy of information provided                    85
  Helpfulness in selecting/finding data or products   84
  Helpfulness in correcting a problem                 83
  Timeliness of response                              82
16
Product Quality - Score 68, Impact 0.9
17
Analysis of Results
  • Product quality is the lowest scoring component
    (68), and has a relatively high impact (0.9).
  • All attributes in this area received similar
    ratings
  • At 84, customer support scores well, and is also
    high impact (1.0).
  • There is a significant difference in customer
    support ratings given by customers within the
    U.S. (88) compared to those outside the U.S.
    (82).
  • The components Product Search and Product
    Selection and Order are highly correlated.
  • Recent customers are more satisfied, but are also
    reporting more problems.
  • The percentage of customer complaints is fairly
    high (32%) when compared to the federal government
    overall (12%).
  • Customers may not be calling to complain about a
    problem, but rather to seek assistance in solving
    the problem.
  • 90% of respondents who answered the customer
    complaint questions gave user services complaint
    handling a rating of 6 or above.

18
CFI's Recommendations for Improving ACSI
  • Focus on Product Quality
  • Review the type of data product documentation
    available with each product. Work to improve the
    clarity and thoroughness of the documentation.
  • Assess the various data formats and work to
    improve the usability of each.
  • Offer a wider variety of data formats.
  • Review the Product Search and Product Selection
    and Order scores to determine how best to help
    customers find the data they need.
  • Due to high correlation, improvements in one area
    will likely result in improvements in the other.
  • Simplify the search process and make data
    products more apparent.
  • Improve data product descriptions.

19
Product Format Ease of Use Comparison
                                  HDF-EOS   HDF    Geo-TIFF   Binary   ASCII
  Valid Responses                 270       190    61         53       44
  Mean Valid Score                6.76      7.20   7.48       7.02     7.30
  Median Valid Score              7         8      8          7        8
  Standard Deviation              2.47      2.34   2.03       2.76     2.54
  95% Confidence Interval (+/-)   0.29      0.33   0.51       0.74     0.75
  % of Users Assigning 8 or More  46.7      52.6   55.8       49.0     63.7
The relatively low scoring of HDF-EOS was supported by users' free-text
comments.
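The confidence-interval row can be reproduced from the reported
standard deviations and response counts with the usual normal
approximation, 1.96 * s / sqrt(n). The short C sketch below is added
for illustration (it is not part of the survey materials); for
HDF-EOS it gives +/-0.29, matching the table.

    /* Recompute the 95% confidence-interval half-widths in the table. */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        const char  *format[] = {"HDF-EOS", "HDF", "Geo-TIFF",
                                 "Binary", "ASCII"};
        const double sd[] = {2.47, 2.34, 2.03, 2.76, 2.54}; /* std. dev.  */
        const double n[]  = {270, 190, 61, 53, 44};         /* responses  */

        for (int i = 0; i < 5; i++) {
            double ci = 1.96 * sd[i] / sqrt(n[i]);  /* half-width of CI */
            printf("%-10s +/- %.2f\n", format[i], ci);
        }
        return 0;
    }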
20
NASA's Earth Science Data Systems Standards
Process
21
Insights
  • Interoperability does not require homogeneous
    systems, but rather coordination at the
    interfaces.
  • Management can judge success based upon program
    goals rather than dictate solutions.
  • Example: degree of interoperability rather than
    use of a particular data format.
  • Communities of practice have solutions.
  • Published practices that demonstrate benefit can
    grow from
  • successful practice in a specific community, to
  • broader community adoption, to
  • community-recognized standards

22
The ESDSWG Standards Process
  • Modeled on the Internet Engineering Task Force
    RFC process and tailored to meet NASA's
    circumstances. The standards process:
  • Registers community practice for NASA
  • NASA Earth science data management can rely on
    standards to achieve highest priority
    interoperability
  • Encourages consensus within communities
  • Science investigators are assured that standards
    contribute to science success in their
    discipline.
  • Grows use of common practices among related
    activities
  • Discipline communities benefit from the expertise
    gained by others
  • Documents data systems practices for use by
    external communities.
  • Lowers barriers to entry and use of NASA data.

23
Standards Process Group Strategy
  • Adopt standards at the interfaces, appropriate to
    given science and drawn from successful practice.
  • Find specifications with a potentially wide
    appeal
  • Draw attention to a much broader audience
  • Monitor use, promote what works well
  • Result: accelerate the evolution and adoption
  • Preferred source of RFC is community nomination.
  • Possible to direct creation of RFC in response to
    identified needs.
  • Consequence of endorsement:
  • Future NASA data systems component proposals will
    be judged partly on how well they interoperate
    using community-identified practices, or else on
    how well they justify why departure from community
    practice has greater benefit.

24
Three Step Standards Process
25
SPG Review
26
What's in the works
  • DAP 2: a standard used by many in the
    oceanographic community and the basis for the
    DODS and OPeNDAP servers. Submitted in June as a
    Community Standard (see the request sketch after
    this list).
  • A Request For Comments on implementation
    experience was distributed October 1; comments
    are due November 12.
  • Precipitation: the community is discussing
    potential science content standards being used to
    define Level 2 and Level 3 data.
  • A self-identified group of precipitation
    scientists has identified the need and is
    proposing a draft, to be discussed at IPWG in
    Monterey.
  • The community is establishing de facto standards
    in this area and that is the best way to deal
    with this.
  • FGDC Vegetation Index standard: in discussion
    with potential community members

27
Ideas from the last ES-DSWG
  • GCMD DIF
  • GeoTIFF
  • NetCDF CF
  • OGC suite

28
Community Leadership
  • Strong proposals will have
  • Leadership to support and use standard
  • Potential for impact
  • Potential for approval
  • Simple standard is better
  • Potential for spillover to other communities
  • Successful RFCs will have
  • At least two implementers
  • Demonstrated operational benefit
  • Leadership in generating the RFC
  • Community willing/able to review

29
SPG Contacts
  • Earth Science Data Systems Standards Process
    Group
  • http://spg.gsfc.nasa.gov/spg
  • SPG Chairs:
  • Richard Ullman richard.ullman@nasa.gov
  • Ming-Hsiang Tsou mtsou@mail.sdsu.edu