1
NASA
Earth Observing System Data and Information Systems
Customer Satisfaction Results
November 1, 2006
2
Today's Discussion
  • Background
  • Overview & Key Results
  • Detailed Analysis
  • Summary

3
Background
4
Project Background: Objectives
  • Measure customer satisfaction with the NASA Earth Observing System Data and Information System at a national level and for each Data Center:
  • Alaska Satellite Facility (ASF)
  • Goddard Space Flight Center Earth Sciences
    Distributed Active Archive Center (GES DAAC)
  • Global Hydrology Resource Center (GHRC)
  • NASA Langley Atmospheric Science Data Center
    (LaRC) DAAC
  • Land Processes Distributed Active Archive Center
    (LP DAAC)
  • National Snow and Ice Data Center (NSIDC)
  • Oak Ridge National Laboratory Distributed Active
    Archive Center (ORNL DAAC)
  • Physical Oceanography Distributed Active Archive
    Center (PO DAAC)
  • Socioeconomic Data and Applications Center
    (SEDAC)
  • Assess satisfaction with NASA EOSDIS specifically in the following key areas:
  • Product Search
  • Product Selection and Order
  • Delivery
  • Product Quality and Documentation
  • Customer Support
  • Identify the key areas that NASA can leverage across the Data Centers to better serve its users

5
Project Background: Measurement timetable
6
Project Background: Data collection
  • Respondents
  • A total of 2,857 responses were received

7
Project Background: Respondent information
Q8. For which disciplines do you need or use Earth science data? (n=2,857)
Multi-select
8
Project Background: Respondent information
Demographics remain fairly consistent with 2005.
Multi-select
9
Overview & Key Results
10
NASA EOSDIS: Customer satisfaction results
[Chart: 2006 vs. 2005 scores for the following attributes]
ATTRIBUTES
  • Overall satisfaction with the products and services provided by the Data Center
  • How well the products and services provided by the Data Center meet expectations
  • How well the Data Center compares with an ideal provider of scientific data, products and services
11

NASA EOSDIS Benchmarks: Continues to score well
"Generally pleased with the system. Given the variety and volume of data it is hard to imagine doing much better. Your services are excellent, to the betterment of the world community as a whole."
12
NASA EOSDIS Model: Customer Support and Product Search/Selection most critical
[Model diagram — scores: Customer Satisfaction Index 74, Recommend 86, Future Use 88; impacts: 3.5, 2.9; sample size n=2,857]
Scores: The performance of each component on a 0 to 100 scale. Component scores are made up of the weighted average of the corresponding survey questions.
Impacts: The change in the target variable that results from a five-point change in a component score. For example, a 5-point gain in Product Search would yield a 0.9-point improvement in Satisfaction.
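The impact arithmetic above scales linearly; a minimal sketch, with impact values copied from these slides:

```python
# Impacts taken from this presentation: CSI points per 5-point
# gain in each component score.
IMPACTS = {
    "Product Search": 0.9,
    "Product Selection and Order": 0.7,
    "Delivery": 0.5,
    "Product Documentation": 0.4,
    "Customer Support": 1.6,
}

def satisfaction_change(component: str, gain: float) -> float:
    """Predicted change in the Customer Satisfaction Index for a
    `gain`-point improvement in one component score."""
    return IMPACTS[component] * (gain / 5.0)

# A 5-point gain in Product Search yields a 0.9-point CSI improvement.
print(satisfaction_change("Product Search", 5))  # 0.9
```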
13
NASA EOSDIS 2006 vs. 2005: Significant declines from 2005
Significant Difference vs. 2005
14
Areas of Opportunity for NASA EOSDIS: Focus on search, selection and ordering for improvement
Also important to keep an eye on Customer Support (82). It is a high-impact area; further declines will affect overall customer satisfaction.
15
Detailed Analysis
16
Score Comparison: Higher satisfaction persists outside of the USA
64% of respondents are outside of the USA in 2006 vs. 67% in 2005.
Respondents outside the USA continue to have a higher overall Satisfaction score with EOSDIS (79 outside vs. 75 USA in 2005), though the gap has lessened.
17
CSI by Data Centers: All Data Centers register declines in satisfaction
[Chart: CSI by Data Center, 2006 vs. 2005. Sample sizes (2006/2005): n=45/29, n=223/96, n=808/359, n=305/138, n=273/92, n=909/419, n=165/65, n=51/30, n=78/35]
Significant Difference vs. 2005
18
Product Search: Key driver of satisfaction
65% used EOS Data Gateway to search for data and products (63% in 2005)
"What I would like to see is simply more dynamic data search and order with plenty of (popular) pre-order preprocessing capabilities."
"Initially, I did not find the data search tools very intuitive and I did not find the explanation of how to carry out a search very helpful. However, I persevered and now find the search tool easy to use."
Impact: 0.9
Note: All score decreases are statistically significant.
19
Product Search Score Comparison: By method for most recent search
Q12. How did you search for the data products or services you were seeking? (n=2,857)
111 indicated "other"; 93 said direct interaction (did not rate product search questions)
[Chart: scores by search method, 2006 vs. 2005. Sample sizes (2006/2005): n=191/104, n=534/225, n=64/33, n=1,864/802]
20
Product Search Scores by Data Center
Significant Difference vs. 2005
21
Product Selection and Order: Also a top opportunity for improvement
Q16. Please think about your most recent request/order/download from the Data Center. Did you use a subsetting tool? (n=2,857) 30% said No, 59% said Yes, by geographic area, and 11% said Yes, by geophysical parameter.
"All data should be available free of cost for scientific research."
92% said that they are finding what they want in terms of type, format, time series, etc.
Impact: 0.7
Note: All score decreases are statistically significant.
22
Product Selection and Order Scores by Data Center
Significant Difference vs. 2005
23
Customer Support: While high scoring, keep a close eye on it
Q37. Did you request assistance from the Data Center's user services staff during your most recent search or order? (n=2,857) Yes: 18%, No: 82%
Only 59% were aware that the Data Center has a user services office for assistance with placing orders
86% were able to get help on first request. These respondents have a significantly higher CSI (79) than those who did not (59).
Impact: 1.6
Significant Difference vs. 2005
24
Delivery: Methods for receiving
65% said FTP was their preferred method in 2005; 22% said download from web.
How long did it take to receive your data products?
  • 22% immediate retrieval
  • 32% less than a day (44% in 2005)
  • 26% 1-2 days (30% in 2005)
  • 12% 3-7 days (15% in 2005)
  • 5% 8-14 days (6% in 2005)
  • 3% more than 14 days (5% in 2005)
25
Delivery
62% said their data came from MODIS; 30% said ASTER.
Impact: 0.5
Note: All score decreases are statistically significant.
26
Product Documentation: Visibility of documentation is important
Q33. Was the documentation (n=2,844) ... Delivered with the data (18% vs. 28% in '05), Available online (70% vs. 63% in '05), Not found (12% vs. 9% in '05).
CSI for those whose documentation was not found is 62 vs. those who got it delivered with the data (76) or online (75).
What documentation did you use or were you looking for?
  • Data product description: 68%
  • Product format: 58%
  • Science algorithm: 44%
  • Instrument specifications: 39%
  • Tools: 36%
  • Science applications: 33%
  • Production code: 12%
"More clear and comprehensive documentation about the data."
Impact: 0.4
Note: Questions reworded; not comparable with 2005
Multi-select
27
Product Quality: Preferences in line with actual for the most part
In 2005, 9% said products were provided in GeoTIFF, while 25% said it was their preferred format.
28
Product Quality
"Create tutorials that would make it easier for non-experts to use the data in different applications."
Impact: 0.1
Note: Attributes not included in model.
29
Summary
30
Summary
  • While lower than last year, NASA EOSDIS still performs on par with or better than government benchmarks
  • Product Search, Selection and Order continue to be the top opportunities for improvement
  • Utilize additional research (e.g., focus groups, usability studies) to understand and address challenges with finding data and search capabilities
  • Communicate reasons for charging for data
  • While Customer Support scores well, further declines will affect satisfaction. Monitor this area.
  • High impact for those who require it
  • Most Data Centers with large numbers of respondents show significant declines (e.g., PO DAAC, GES DAAC, LP DAAC, ORNL DAAC)
  • Focus improvement efforts here first

31
Appendix
32
The Math Behind the Numbers
A discussion for a later date, or following this presentation, for those who are interested.
33
A Note About Score Calculation
  • Attributes (questions on the survey) are typically answered on a 1-10 scale
  • Social science research shows 7-10 response categories are optimal
  • Customers are familiar with a 10-point scale
  • Before being reported, scores are transformed from a 1-10 to a 0-100 scale
  • The transformation is strictly algebraic (a linear rescaling)
  • The 0-100 scale simplifies reporting
  • Often no need to report many, if any, decimal places
  • 0-100 scale is useful as a management tool
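The exact transformation is not spelled out on the slide; assuming the conventional linear mapping (1 maps to 0, 10 maps to 100), it would look like:

```python
def to_100_scale(score: float) -> float:
    """Linearly rescale a 1-10 survey response onto the 0-100 reporting
    scale. Assumes the conventional mapping 1 -> 0 and 10 -> 100; the
    slide only states that the transformation is strictly algebraic."""
    return (score - 1) / 9 * 100

# The end points of the survey scale land exactly on the bounds
# of the reporting scale.
print(to_100_scale(1))   # 0.0
print(to_100_scale(10))  # 100.0
```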

34
Deriving Impacts
  • Remember high school algebra? The general formula for a line is
  • y = mx + b
  • The basic idea is that x is a cause and y is an effect, and m represents the slope of the line summarizing the relationship between x and y
  • CFI Group uses a sophisticated variation of the advanced statistical tool, Partial Least Squares (PLS) Regression, to determine impacts when many different causes (i.e., quality components) simultaneously affect an outcome (e.g., Customer Satisfaction)
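The slope idea generalizes to several simultaneous causes. As an illustration only, ordinary least squares can recover per-component slopes from synthetic data; this is a simplified stand-in for CFI Group's proprietary PLS approach, and the component names, slopes, and data below are all made up:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic component scores for 200 respondents (names illustrative:
# column 0 ~ Product Search, column 1 ~ Customer Support).
X = rng.uniform(50, 100, size=(200, 2))
true_slopes = np.array([0.18, 0.32])  # per-point slopes (made up)
y = X @ true_slopes + 20 + rng.normal(0, 1, size=200)  # satisfaction

# Fit y = m1*x1 + m2*x2 + b by least squares: the same "m" as in
# y = mx + b, generalized to several simultaneous causes.
A = np.column_stack([X, np.ones(len(X))])
m1, m2, b = np.linalg.lstsq(A, y, rcond=None)[0]

# An "impact" is the predicted change in y per 5-point component gain;
# with these synthetic slopes, roughly 0.9 and 1.6 points.
print(round(m1 * 5, 2), round(m2 * 5, 2))
```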

35
Who is CFI Group?
  • Founded in 1988; headquartered in Ann Arbor, Michigan
  • Principals are among the world's leading experts in measuring constituent/stakeholder satisfaction and how to improve it
  • 14 offices, 150 full-time consultants and researchers worldwide
  • The CFI System is the source for the American Customer Satisfaction Index (ACSI), a national measure of customer satisfaction compiled by the National Quality Research Center at the University of Michigan Business School

36
Unique Features of the American Customer Satisfaction Index (ACSI)
  • The only uniform measure of customer satisfaction in the U.S. economy, covering sectors accounting for about 66% of GDP
  • Measures the quality of economic output on a quarterly basis; complementary to productivity measures and indicative of consumer spending
  • Uses multiple-item indicators to assess drivers of satisfaction
  • Meets the objective of explaining desired outcomes
  • Allows for comparison across agencies
  • Illustrates how customer satisfaction is embedded in a system of cause-and-effect relationships

37
ACSI National, Sector and Industry Scores: Q3 2005 - Q2 2006
ACSI overall: 74.4
  • Accommodation & Food Services: 75.8 (Hotels 75, Limited-Service Restaurants 77)
  • Information: 68.6 (Newspapers 63, Motion Pictures 73, Network/Cable TV News 69, Computer Software 74, Fixed-Line Telephone Service 70, Wireless Telephone Service 66, Cellular Telephones 70, Cable & Satellite TV 63)
  • Utilities: 72.4 (Energy Utilities 72)
  • Finance & Insurance: 73.9 (Banks 75, Life Insurance 75, Health Insurance 68, Property & Casualty Insurance 78)
  • Public Administration/Government: 67.1 (Local Government 65.9, Federal Government 71.3)
  • E-Business: 76.5 (News & Information 73, Portals 76, Search Engines 79)
  • E-Commerce: 79.6 (Retail 81, Auctions 78, Brokerage 76, Travel 77)
  • Retail Trade: 72.4 (Supermarkets 74, Gasoline Stations 69, Department & Discount Stores 75, Specialty Retail Stores 74, Health & Personal Care Stores 76)
  • Transportation & Warehousing: 72.3 (Airlines 65, U.S. Postal Service 71, Express Delivery 83)
  • Health Care & Social Assistance: 74.1 (Hospitals 74)
  • Manufacturing/Durable Goods: 80.1 (Personal Computers 77, Electronics (TV/VCR/DVD) 80, Major Appliances 81, Automobiles & Light Vehicles 81)
  • Manufacturing/Nondurable Goods: 81.8 (Food Manufacturing 82, Pet Food 82, Soft Drinks 83, Breweries 82, Cigarettes 79, Apparel 81, Athletic Shoes 77, Personal Care & Cleaning Products 83)
Source: www.theacsi.org