Title: Shahid Habib, GSFC
1. Applications Implementation Working Group
Second Meeting at SSC, March 31-April 1, 2004
2. Goals -- Continuing
- Have everyone integrated
  - Centers/Projects
  - Program Executives
  - External Entities (grants, etc., as applicable)
- Have everyone communicating well
- Information is compiled and readily available (program plans, budgetary data, Evaluation and V&V reports, etc.)
- Schedule
- Program strengths are regularly surfaced and marketed
- Program weaknesses are highlighted and corrective actions are initiated in a timely manner
- A common basis is established, and we all know the defining requirements, constraints, implementing processes, and deliverables
3. AGENDA

Day 1, Morning Session (8:30 am - 12:00 pm)
- Welcome: Peterson
- HQ Inputs/Update: Frederick
- Goals of this Meeting: Habib et al.
- Open Discussion, State of the Applications Union: All
  - Update from last meeting (Actions/Issues/Concerns)
  - Budget/Environment
  - Latest on Program/Project Plans
  - Upcoming meetings/conferences
  - Applications Showcase
  - Other topics

Lunch: 12:00 pm - 1:00 pm

Day 1, Afternoon Session (1:00 pm - 4:30 pm)
- Upcoming Solicitations and Expectations: Frederick
- Evaluation, V&V, and Benchmarking Template and Reports/Deliverables (Issue 2): Zanoni/Stover
- ESE Knowledgebase: Policelli
- Plan for Master Schedule (Issue 4): Policelli/Habib
- AIWG Website: Hood
- IBPD Metric Update (Issue 2): Peterson

Day 2, Morning Session (8:00 am - 12:00 pm)
- GIO Update: Bambacus
- PART Discussion and Improvement Plan: Stover
- Working session on establishing a comprehensive process/requirements for conducting reviews of each Application: Habib/Stover

Lunch: 12:00 pm - 1:00 pm

Day 2, Afternoon Session (1:00 pm - 4:00 pm)
- REASON Overview: Peterson et al.
- U. of Missouri and Arizona Activities Overview (PPA): Kaupp/Hutchinson
- Benchmarking Tool Overview: Tralli
- Wrap up, Action Items, and Adjourn: Habib
4. ISSUES from February 3-4, 2004 Meeting
1. Center roles (and capabilities)
  - Program vs. project
  - Expertise and interfaces (who?)
  - Systems engineering (SSC as a resource)
  - Technical interface
  - Collaboration vs. competition
  - Better communications cross-center, to/from HQ, and otherwise
2. Defining processes to be used
  - Reports, other deliverables
  - Outcomes and products should be recognizable, understandable, and flexible
  - Need a catalog system, a central retrieval repository
  - Common processes for consistency, but flexible
  - Define minimum required documents
  - This discussion (about processes) includes implementation, not just documentation
  - Concern expressed about imposing inappropriate standards (e.g., 7120.5B)
  - Where do program planning analysis activities now go?
3. IBPD Metrics
  - Need a process for including inputs from the community in formulating metrics
  - Identify an owner for each metric
5. ACTIONS from February 3-4, 2004 Meeting
1. Create a central repository (preferably online) for Applications reports; investigate using the Earth Science Network site - SSC
2. Clarify new NASA policy with regard to center-to-center suballocations - Greg Stover
3. Distribute Knowledge Base CD to the working group - Fritz Policelli
4. Draft content/outline for Evaluation Reports, Verification and Validation Reports, and Benchmark Reports - SSC
5. HQ guidance for Applications acquisition/use of aircraft data - Steve Ambrose
6. Review "working list" of NASA contacts for GIO - Myra Bambacus
7. Define what level of science product validation is required for NASA products to be used in Applications projects - Shahid Habib/Marty Frederick
8. Provide a copy of or link to the latest IBPD to the working group - Craig Peterson
9. Provide the IBPD responsibilities matrix to the working group - Fritz Policelli
10. Provide an Issues summary to the working group - Tim Miller
11. Define/clarify the phrase "types of predictions" as used in the IBPD metrics - AIWG
12. Begin drafting FY06 metrics (and communicate schedule/due date) - AIWG
13. Set up the next working group telecon - Shahid Habib
14. Host the next working group workshop - SSC
6. Program Documentation Pedigree

[Diagram: documentation hierarchy from the Agency level down to the Centers, spanning strategic and programmatic documents.]
- Agency: OMB, NASA Strategic Plan, IBPD, PART
- Enterprise: ESE Strategic Plan; Earth Science Technology, Outreach, Data Distribution and Archive, Applications, Education, and Research Plans
- Division/Project: project plans for Cross Cut Solutions, Agriculture, Energy, Air Quality, Aviation, Health, and other National Applications
- Centers: Center Implementation Plans for GSFC, JPL, LaRC, and MSFC
7. Relationship Donut

[Diagram: "relationship donut" linking the centers (ARC, GSFC, JPL, LaRC, MSFC, SSC) with external contracts, earmarks, and grants.]
8. Program Alignment and Communication Flow

[Diagram: alignment and communication flow from the Program Executive/HQ level, through the Deputy Program Executive level, to the Project and Sub-Project levels (strong coupling), across the National Applications (Agriculture, Coastal, Aviation, HLS, Invasive, H2O, GIO, Ecological, PH, AQ, Carbon, Energy, Disaster). Center assignments (GSFC, SSC, LaRC, ARC) are indicated for some applications; others are still marked "?".]
9. Initial Systems Engineering Process

[Diagram: process flow from Define Requirements/Specifications and DSS Selection, through Investigate Alternative NASA Inputs, Design, Implement, and Verify/Validate, to Benchmark (baseline and assess performance) and an Enhanced DSS, with Refine feedback loops throughout. The steps are grouped into three phases: Evaluation, V&V, and Benchmarking.]
- Use of systems engineering principles leads to scalable, systemic, and sustainable solutions and processes, which in turn contribute to the success of the mission, goals, and objectives of each National Application.
- The Evaluation phase involves understanding the requirements for, and technical feasibility of, Earth science and remote sensing tools and methods for addressing DSS needs.
- The Verification and Validation (V&V) phase includes measuring the performance characteristics of data, information, technologies, and/or methods, and assessing the ability of these tools to meet the requirements of the DSS.
- In the Benchmarking phase, the adoption of NASA inputs within an operational DSS and the resulting impacts and outcomes are documented (see the sketch below).
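
As a rough illustration of how these three phases could be tracked for a single project, the sketch below is not from the slides; the class, field, and product names (DssEnhancement, "Example AQ DSS", "MODIS aerosol product") are hypothetical assumptions, and it simply models one NASA input moving through Evaluation, V&V, and Benchmarking with one report recorded per phase.

"""Minimal, illustrative sketch of the three-phase process; all names
are hypothetical and not taken from the slides."""
from dataclasses import dataclass, field
from enum import Enum, auto


class Phase(Enum):
    EVALUATION = auto()               # requirements and technical feasibility
    VERIFICATION_VALIDATION = auto()  # measure performance against DSS requirements
    BENCHMARKING = auto()             # document adoption, impacts, and outcomes


@dataclass
class DssEnhancement:
    """Tracks one NASA input being integrated into a partner DSS."""
    dss_name: str
    nasa_inputs: list[str]
    phase: Phase = Phase.EVALUATION
    findings: dict[Phase, str] = field(default_factory=dict)

    def record(self, note: str) -> None:
        """Attach a finding (e.g., a report reference) to the current phase."""
        self.findings[self.phase] = note

    def advance(self) -> None:
        """Move to the next phase; a 'Refine' loop would simply stay in place."""
        order = list(Phase)
        i = order.index(self.phase)
        if i + 1 < len(order):
            self.phase = order[i + 1]


# Hypothetical usage: an air-quality DSS receiving a satellite data product.
project = DssEnhancement("Example AQ DSS", ["MODIS aerosol product"])
project.record("Evaluation report: requirements and feasibility documented")
project.advance()   # Evaluation -> V&V
project.record("V&V report: performance measured against DSS requirements")
project.advance()   # V&V -> Benchmarking
project.record("Benchmark report: adoption, impacts, and outcomes documented")
print(project.phase, len(project.findings))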
10. DSS -- Status
11. Review Contents
- Program Plans -- Guidance and Requirements
- Project Plans -- Commitment
- Team Composition
- Budget and Schedule
- Partnership Agreements
- DSS Selection and Status
- Task/Project Requirements
- NASA data/products used (missions, models, etc.)
- Technical approach to solving the problem(s)
- Specific products produced
- Deliverables (IBPD and Projects)
- Risks
- Outreach (Publications, conferences, etc.)
- Benefits
12. Review Process
- Cyclic (every 6 months) for each Application (see the sketch after this list)
- Comprehensive
- Partners should be involved
- Evaluate progress to date
- Identify Tall Poles
- Recommendations for next steps
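
As a minimal sketch, not part of the slides, of how the six-month review cycle and the Review Contents checklist might be tracked, the example below assumes a 183-day approximation of "every 6 months" and a hypothetical application name and start date.

"""Illustrative review-cycle tracker; dates and the application name are
hypothetical assumptions."""
from datetime import date, timedelta

# Checklist drawn from the "Review Contents" slide above.
REVIEW_CONTENTS = [
    "Program Plans -- Guidance and Requirements",
    "Project Plans -- Commitment",
    "Team Composition",
    "Budget and Schedule",
    "Partnership Agreements",
    "DSS Selection and Status",
    "Task/Project Requirements",
    "NASA data/products used (missions, models, etc.)",
    "Technical approach to solving the problem(s)",
    "Specific products produced",
    "Deliverables (IBPD and Projects)",
    "Risks",
    "Outreach (publications, conferences, etc.)",
    "Benefits",
]

CYCLE = timedelta(days=183)  # rough approximation of "every 6 months"


def next_review(last_review: date) -> date:
    """Return the approximate date of the next semiannual review."""
    return last_review + CYCLE


def review_agenda(application: str, last_review: date) -> str:
    """Build a simple checklist-style agenda for one application's review."""
    lines = [f"{application} review, due {next_review(last_review):%Y-%m-%d}:"]
    lines += [f"  [ ] {item}" for item in REVIEW_CONTENTS]
    return "\n".join(lines)


# Hypothetical usage for one National Application.
print(review_agenda("Agriculture", date(2004, 4, 1)))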