Title: HDF, EOSDIS, NASA ESE Data Standards
1. HDF, EOSDIS, NASA ESE Data Standards
2. Agenda
- ESDIS status with respect to HDF
- EOSDIS customer satisfaction survey (American Customer Satisfaction Index)
- NASA Earth Science Standards Endorsement Process
3. ESDIS Status
- Launch of Aura (July 15) marks the end of the development phase of the EOSDIS Core System (ECS).
- The system is now in maintenance; capability refinements are under the Synergy program.
- Data centers are now running the Synergy 3 release and will be transitioning to Synergy 4 over the next six months.
- Maintenance of HDF for EOS includes two components:
- Support of NCSA's HDF group through a cooperative agreement.
- Support of HDF-EOS through the ECS maintenance contract.
- Other ESDIS project-sponsored HDF-related work will be phased out near the end of calendar year 2004:
- http://hdfeos.gsfc.nasa.gov website updates
- SESDA HDF data usability task
- Coordination, outreach, and test-bed development for HDF integration through the CEOS, OGC, and ISO organizations.
4. HDF-EOS
- A profile, convention, convenience API, etc., for NASA's Earth Observing System (EOS) standard data products.
- Defines structures for Point, Swath, and Grid (also Atmospheric Profile and Zonal Table); see the swath sketch after this list.
- Defines a specific location for product metadata.
- ODL-encoded metadata compliant with FGDC content standards.
- Maintained by L3 Communications under subcontract to Raytheon's ECS Maintenance and Development contract.
- Next release expected Dec. 2004:
- HDF5 1.6.3
- SZIP 1.2
- New inquiry functions
- CEA (Cylindrical Equal Area) grid projection
- Improved performance in read/write functions
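To make the Swath structure concrete, here is a minimal sketch against the HDF-EOS2 C swath interface. The file name, swath name, dimension sizes, and field names are illustrative only (not taken from any real EOS product), and error checking is omitted for brevity.

```c
/* Minimal HDF-EOS2 sketch: define a Swath with geolocation and data fields.
 * Names and sizes are illustrative, not from any actual EOS product.        */
#include "hdf.h"        /* HDF4 basics: int32, DFACC_*, DFNT_* */
#include "HdfEosDef.h"  /* HDF-EOS swath (SW*) interface        */

int main(void)
{
    /* Create an HDF-EOS file and attach a new swath object. */
    int32 fid  = SWopen("example_swath.hdf", DFACC_CREATE);
    int32 swid = SWcreate(fid, "ExampleSwath");

    /* Swath dimensions: along-track and cross-track. */
    SWdefdim(swid, "Track", 400);
    SWdefdim(swid, "XTrack", 200);

    /* Geolocation fields tie every data element to a latitude/longitude. */
    SWdefgeofield(swid, "Latitude",  "Track,XTrack", DFNT_FLOAT32, HDFE_NOMERGE);
    SWdefgeofield(swid, "Longitude", "Track,XTrack", DFNT_FLOAT32, HDFE_NOMERGE);

    /* A science data field on the same dimensions; values would be written
     * later with SWwritefield().                                             */
    SWdefdatafield(swid, "Radiance", "Track,XTrack", DFNT_FLOAT32, HDFE_NOMERGE);

    SWdetach(swid);
    SWclose(fid);
    return 0;
}
```

Grid and Point objects follow the same open/create/define pattern through the corresponding GD* and PT* calls.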
5. HDF in NASA Earth Remote Sensing
- HDF-EOS is the format for EOS Standard Products:
- Landsat 7 (ETM+)
- Terra (CERES, MISR, MODIS, ASTER, MOPITT)
- Meteor-3M (SAGE III)
- Aqua (AIRS, AMSU-A, AMSR-E, CERES, MODIS)
- Aura (MLS, TES, HIRDLS, OMI)
- HDF is used by other EOS missions:
- OrbView-2 (SeaWiFS)
- TRMM (CERES, VIRS, TMI, PR)
- QuikSCAT (SeaWinds)
- EO-1 (Hyperion, ALI)
- ICESat (GLAS)
- CALIPSO
- Over 3 petabytes of EOSDIS archived data
6. HDF-EOS Lessons
- Definition of a set of data structures as a profile is not sufficient to guarantee interoperability.
- Also need definition of content, especially metadata; this becomes increasingly difficult the wider the disciplines covered.
- See the Aura DSWG standards and netCDF CF conventions as examples (a small content-standard sketch follows this list).
- Also need conformance measures: no spec is so clear that it cannot be misinterpreted.
- Even during the life of a mission, there must be allowance for technology refresh.
- Technology advances affect user expectations.
- This is a well-understood concept for hardware, but traditionally less recognized for science software and data products.
- See OAIS.
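As a concrete illustration of what "definition of content" means in the netCDF CF case, the sketch below uses the netCDF C API to attach CF-style attributes to a variable. The file name, variable, and dimension are hypothetical examples, not prescribed by CF; error checking is omitted.

```c
/* Sketch: content definition via CF-style attributes with the netCDF C API.
 * The file name, variable, and dimension length are hypothetical.           */
#include <string.h>
#include <netcdf.h>

int main(void)
{
    int ncid, time_dim, temp_var, dims[1];

    nc_create("cf_example.nc", NC_CLOBBER, &ncid);

    nc_def_dim(ncid, "time", NC_UNLIMITED, &time_dim);
    dims[0] = time_dim;
    nc_def_var(ncid, "temperature", NC_FLOAT, 1, dims, &temp_var);

    /* CF content standard: controlled attribute names and vocabularies give
     * the variable a community-agreed meaning, independent of the container. */
    nc_put_att_text(ncid, temp_var, "standard_name",
                    strlen("air_temperature"), "air_temperature");
    nc_put_att_text(ncid, temp_var, "units", strlen("K"), "K");
    nc_put_att_text(ncid, NC_GLOBAL, "Conventions",
                    strlen("CF-1.0"), "CF-1.0");

    nc_enddef(ncid);
    nc_close(ncid);
    return 0;
}
```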
7. Discussion topics today
- Ask the experts
- A growing number of software products depend upon the HDF libraries. Are there suggestions for how to better coordinate HDF library releases?
- Questions from participants.
- HDF-GEO?
- At the last workshop, strong opinion was expressed that there should be some kind of bridge among the HDF geographic and geophysical profiles.
- Can we develop a better sense of what such an HDF-GEO might be?
- Is this the list: HDF-EOS, NetCDF API, HDF-NPOESS?
- What are reasonable expectations for this effort?
8. From the ESDSWG meeting last week: Why use a standard?
- Good documentation
- Other projects have reviewed it and found it useful
- Reusable software is sometimes available
- Potential users can see that the standard and software work
- Not management pressure or peer pressure, just more practical
9. 2004 EOSDIS Satisfaction Survey
10. 2004 EOSDIS Satisfaction Survey
- A measure of customer satisfaction
- ESISS and ESSAAC have recommended that NASA focus on measuring the impact of our systems and services rather than just the output.
- In 2004, NASA used a comprehensive survey to determine the American Customer Satisfaction Index (ACSI) for EOSDIS products and services.
- ACSI provides a normalized measure of customer satisfaction that allows benchmarking against similar companies and industries.
- The 2004 survey results show that customer satisfaction with EOSDIS compares very favorably with both industry and other government agencies.
11. Snapshot of the American Customer Satisfaction Index (ACSI)
- The #1 national indicator of customer satisfaction today.
- Compiled by the National Quality Research Center at the University of Michigan using methodology licensed from the Claes Fornell International (CFI) Group.
- Measures 40 industries and 200 organizations covering 75% of the U.S. economy.
- Over 70 U.S. Federal Government agencies have used ACSI to measure more than 120 programs/services.
- CFI's advanced methodology quantifiably measures and links satisfaction levels to performance and prioritizes actions for improvement.
12. Survey Background
- The EOSDIS survey was performed by the CFI Group through a contract with the Federal Consulting Group (Department of the Treasury).
- Survey questions developed by the DAAC User Services Working Group were tailored to fit the CFI methodology.
- ESDIS provided the CFI Group with 33,251 email addresses of users who had used NASA/EOSDIS products.
- CFI sent invitations to participate in an online survey to 9,999 randomly selected users.
- 1,056 responses were completed.
- 1,016 surveys were used in the analysis (250 responses were needed for a statistically meaningful result).
13. EOSDIS Results
- The Customer Satisfaction Index for NASA EOSDIS is 75 (NASA EOSDIS aggregate segment).
- The Customer Satisfaction Index score is derived from customer responses to three questions in the survey:
- How satisfied are you overall with the products and services provided by the Data Center? (79)
- To what extent have the data, products, and services provided by the Data Center fallen short of or exceeded your expectations? (73)
- How well does the Data Center compare with an ideal provider of scientific data, products, and services? (71)
- This score is four points higher than the 2003 American Customer Satisfaction Index for the Federal Government overall (71).
- The confidence interval for the ACSI is +/-1.1 for the aggregate at the 95% confidence level.
14. Score Comparison by Current Location

Component                       USA (n=478)   Outside the USA (n=577)
ACSI                                 74                 76
Customer Support                     88                 82
Delivery                             85                 83
Product Selection and Order          72                 73
Product Search                       69                 71
Product Quality                      67                 69
Complaints                           34                 31
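Assuming the aggregate ACSI is approximately the response-weighted mean of the two location segments (an assumption; ACSI scoring is not stated here to be a simple average), the segment scores are consistent with the aggregate score of 75 reported on the previous slide:

$$\frac{478 \times 74 + 577 \times 76}{478 + 577} \approx 75.1$$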
15. Customer Support - Score 84, Impact 1.0
- CFI considers EOSDIS to be World Class in the area of customer support.
- Customer Support (overall): 84
- Professionalism: 87
- Technical knowledge: 85
- Accuracy of information provided: 85
- Helpfulness in selecting/finding data or products: 84
- Helpfulness in correcting a problem: 83
- Timeliness of response: 82
16. Product Quality - Score 68, Impact 0.9
17. Analysis of Results
- Product quality is the lowest-scoring component (68) and has a relatively high impact (0.9).
- All attributes in this area received similar ratings.
- At 84, customer support scores well and is also high impact (1.0).
- There is a significant difference in customer support ratings given by customers within the U.S. (88) compared to those outside the U.S. (82).
- The components Product Search and Product Selection and Order are highly correlated.
- Recent customers are more satisfied, but are also reporting more problems.
- The percentage of customer complaints is fairly high (32%) when compared to the federal government overall (12%).
- Customers may not be calling to complain about a problem, but rather to seek assistance in solving the problem.
- 90% of respondents who answered the customer complaint questions gave user services' complaint handling a rating of 6 or above.
18. CFI's Recommendations for Improving the ACSI
- Focus on product quality:
- Review the type of data product documentation available with each product; work to improve the clarity and thoroughness of the documentation.
- Assess the various data formats and work to improve the usability of each.
- Offer a wider variety of data formats.
- Review the Product Search and Product Selection and Order scores to determine how best to help customers find the data they need:
- Due to the high correlation, improvements in one area will likely result in improvements in the other.
- Simplify the search process; make data products more apparent.
- Improve data product descriptions.
19. Product Format Ease of Use Comparison

                                  HDF-EOS     HDF   Geo-TIFF   Binary   ASCII
Valid Responses                      270      190        61       53      44
Mean Valid Score                    6.76     7.20      7.48     7.02    7.30
Median Valid Score                     7        8         8        7       8
Standard Deviation                  2.47     2.34      2.03     2.76    2.54
95% Confidence Interval (+/-)       0.29     0.33      0.51     0.74    0.75
% of Users Assigning 8 or More      46.7     52.6      55.8     49.0    63.7
The relatively low scoring of HDF-EOS was supported by users' free-text comments.
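The confidence-interval row appears to follow the standard normal-approximation half-width for a mean, $1.96\,s/\sqrt{n}$ (an inference from the table values, not stated in the survey report); for the HDF-EOS column, for example:

$$1.96 \times \frac{2.47}{\sqrt{270}} \approx 0.29$$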
20. NASA's Earth Science Data Systems Standards Process
21. Insights
- Interoperability does not require homogeneous systems, but rather coordination at the interfaces.
- Management can judge success based upon program goals rather than dictate solutions.
- Example: degree of interoperability rather than use of a particular data format.
- Communities of practice have solutions.
- Published practices that demonstrate benefit can grow:
- successful practice in a specific community
- broader community adoption
- community-recognized standards
22. The ESDSWG Standards Process
- Modeled on the Internet Engineering Task Force RFC process and tailored to meet NASA's circumstances. The standards process provides:
- Registers community practice for NASA: NASA Earth science data management can rely on standards to achieve highest-priority interoperability.
- Encourages consensus within communities: science investigators are assured that standards contribute to science success in their discipline.
- Grows use of common practices among related activities: discipline communities benefit from the expertise gained by others.
- Documents data systems practices for use by external communities: lowers barriers to entry and use of NASA data.
23. Standards Process Group Strategy
- Adopt standards at the interfaces, appropriate to the given science and drawn from successful practice.
- Find specifications with potentially wide appeal.
- Draw attention to a much broader audience.
- Monitor use; promote what works well.
- Result: accelerate the evolution and adoption.
- The preferred source of an RFC is community nomination.
- It is possible to direct creation of an RFC in response to identified needs.
- Consequence of endorsement: future NASA data systems component proposals will be judged partly on how well they interoperate using community-identified practices, or else must justify why departure from the community practice has greater benefit.
24. Three-Step Standards Process
25. SPG Review
26. What's in the works
- DAP 2: a standard used by many in the oceanographic community and the basis for the DODS and OPeNDAP servers; submitted in June as a Community Standard (example request forms follow this list).
- A Request For Comments on implementation experience was distributed October 1; comments are due November 12.
- Precipitation: the community is discussing potential science content standards to be used to define level 2 and level 3 data.
- A self-identified group of precipitation scientists has identified the need and is proposing a draft; they are discussing it at IPWG in Monterey.
- The community is establishing de facto standards in this area, and that is the best way to deal with this.
- FGDC Vegetation Index standard: in discussion with potential community members.
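For reference, DAP 2 access is URL-based. In the sketch below the host and dataset name are hypothetical; the .dds/.das/.dods suffixes and the bracketed constraint expressions are standard DAP 2 conventions.

```
http://server.example.gov/dap/sst.nc.dds                   # dataset structure (DDS)
http://server.example.gov/dap/sst.nc.das                   # dataset attributes (DAS)
http://server.example.gov/dap/sst.nc.dods?sst[0:9][0:19]   # binary data, subset by constraint
http://server.example.gov/dap/sst.nc.ascii?sst[0:9][0:19]  # same subset as ASCII
```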
27. Ideas from the last ES-DSWG
- GCMD DIF
- GeoTIFF
- NetCDF CF
- OGC suite
28. Community Leadership
- Strong proposals will have
- Leadership to support and use standard
- Potential for impact
- Potential for approval
- Simple standard is better
- Potential for spillover to other communities
- Successful RFCs will have
- At least two implementers
- Demonstrated operational benefit
- Leadership in generating the RFC
- Community willing/able to review
29. SPG Contacts
- Earth Science Data Systems Standards Process Group
- http://spg.gsfc.nasa.gov/spg
- SPG Chairs:
- Richard Ullman, richard.ullman@nasa.gov
- Ming-Hsiang Tsou, mtsou@mail.sdsu.edu