Title: Working Group Outbrief Template
Slide 1: Working Group Outbrief Template
Modeling and Data Assimilation
Philip Ardanuy, Joe Shannon, Bob Chen, Maria Pirone, Eugenia Kalnay, and many others
Slide 2: Modeling and Data Assimilation
Information Creation (Supply Side):
- Collect Obs
- Process Products
- Archive/Manage Products
- Transmit Telemetry
- Conduct Primary Mission
- Push Products
Research/Modeling:
- Develop Algorithms
- Employ R-to-O
- Create Services (and Enable the Creation of Services)
- Make Better Informed Decisions
- Generate Predictions
- Validate Models
- Conduct Research
- Develop Models
- Public Domain/Market-Driven
- Stimulate Value-added Market
- Develop DST
- Employ R-to-O
- Stimulate Capacity Building
Information Exploitation (Demand Side):
- Access (Pull) Products
- Discover Products
- Exploit Decision Support Tools
- Visualize Products
- Impacts/Outcomes
Slide 3: High Level Value Stream Template
Value stream steps:
- Assess Decision-Maker and Societal Needs
- Develop Model
- Assimilate Data (observations, other data)
- Generate Predictions/Scenarios
- Exploit Decision Support Tools (Visualize results)
- Create Services (and Enable the Creation of Services)
Supporting activities:
- Validate models
- Identify quality improvements
- Continuously improve service type
- Research to Operational Transition
- Employ repeatable SE processes
Slide 4: IPO Chart Template
Value Stream Step: Assess Decision-Maker and Societal Needs
- Requirements
- Statement of need
- Priorities
- Desired P3I
1. Hold workshops/survey stakeholders
2. Collect requirements
3. Understand objectives
4. Determine feasibility
5. Account for budget and schedule constraints
6. Develop use cases
7. Perform market assessment
- ConOps
- Performance/product Feedback (ongoing)
- Use Cases
- Policy
- Potential data sources
- Current state of practice (legacy systems)
- Feasibility/cost
- Schedule
Copy as needed
Slide 5: IPO Chart Template
Value Stream Step: Develop Model
- Requirements
- Validated model
- Priorities
1. Understand constraints
2. Assess and leverage existing models
3. Develop/refine/tailor algorithm theoretical basis
4. Prototyping/test bedding
5. Peer review
6. Test data set development
7. V&V
8. User acceptance
9. Requirements updating/refinement
- Build intellectual capacity
- Reproducible results (with uncertainties and performance metrics)
- ConOps
- Infrastructure
- Use Cases
- Outreach to users (Model socialization)
- Potential data sources
- Schedule
- ICD, standards, formats (input data requirements)
- Feasibility/cost
- State of the science/technology
- Documentation (incl. requirements updating)
- ICD, standards, formats
- Updated use cases
Copy as needed
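The V&V, test-data-set, and "reproducible results with performance metrics" items in the Develop Model chart amount, at their simplest, to scoring model output against a reference test data set. A minimal sketch in Python (the `verify` helper, the metric choices, and the sample arrays are illustrative assumptions, not part of the template):

```python
import numpy as np

def verify(model_output, truth):
    """Score a model run against a reference test data set using
    two common performance metrics: mean error (bias) and RMSE."""
    err = np.asarray(model_output) - np.asarray(truth)
    return {"bias": float(err.mean()),
            "rmse": float(np.sqrt((err ** 2).mean()))}

# Hypothetical test data set: truth vs. a model run with a known offset
truth = np.array([1.0, 2.0, 3.0, 4.0])
model = truth + 0.5          # model biased high by a constant 0.5
metrics = verify(model, truth)
# for a constant offset, bias and RMSE both equal the offset (0.5)
```

Publishing such metrics alongside the model output is what makes the "reproducible results" output verifiable by users.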
Slide 6: IPO Chart Template
Value Stream Step: Assimilate Data (observations, other data)
- Validated model
- Assimilated data set (best estimate of state and evolution of the system)
- Reproducible results (with uncertainties and performance metrics)
- Develop uncertainties (statistics of model and data errors)
- Acquire appropriately formatted data
- Preprocess, screen, and QC data
- Run model/first guess/initial state
- Optimally combine model and data to get best estimate of the state of the system
- (Ability and willingness to accommodate and exploit new data and techniques)
- Infrastructure
- Uncertainties
- Metadata (data history/provenance, and quality)
- Outreach to users (Model socialization)
- Performance metrics
- ICD, standards, formats (input data requirements)
- Documentation (incl. requirements updating)
- Updated use cases (Scripts and user scenarios)
Copy as needed
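The "optimally combine model and data" step above is, in the simplest scalar case, an error-variance-weighted average of the model first guess and the observation, using exactly the error statistics developed in the first step. A toy sketch (the function and variable names are assumptions for illustration, not the template's notation):

```python
def analysis_update(x_b, sigma_b2, y, sigma_o2):
    """Minimum-variance combination of a model first guess x_b
    (error variance sigma_b2) with an observation y (error
    variance sigma_o2). Returns the analysis and its variance."""
    k = sigma_b2 / (sigma_b2 + sigma_o2)   # gain: trust the data more when model error is large
    x_a = x_b + k * (y - x_b)              # correct the first guess toward the observation
    sigma_a2 = (1.0 - k) * sigma_b2        # analysis error variance is reduced below both inputs
    return x_a, sigma_a2

# First guess 10.0 (variance 4.0), observation 12.0 (variance 1.0)
x_a, var_a = analysis_update(10.0, 4.0, 12.0, 1.0)
# gain 0.8: analysis 11.6, analysis variance 0.8
```

The same weighting idea, generalized to vectors and covariance matrices, underlies operational assimilation schemes; the key point for the chart is that the quality of the "best estimate" depends directly on the quality of the model and data error statistics.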
Slide 7: IPO Chart Template
Value Stream Step: Generate Predictions/Scenarios
- Assimilated data set (best estimate of state and evolution of the system)
- Prediction/scenarios of future system state
- Uncertainties
1. Run the model/ensembles from assimilated initial conditions
2. Conversion of output into user's desired format
3. Generate and understand performance metrics
4. Interpret the model output into/as a prediction
5. (Verify against truth, identify areas of deficiency for possible improvement)
- (Be prepared for periodic reanalysis when methods have substantially improved)
- Archiving of results and input data
- Publication and dissemination
- Metadata (data history/provenance, and quality)
- Uncertainties
- Performance metrics
- Model
- Updated use cases (Scripts and user scenarios)
- Notification
- Model input/output
- Parameters for prediction (boundary conditions, forcing conditions, exogenous variables)
- Long-term archive
Copy as needed
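Step 1 above, running the model as an ensemble from perturbed assimilated initial conditions, can be sketched with a stand-in model: the ensemble mean serves as the prediction and the ensemble spread as a crude uncertainty estimate. The exponential-growth "model", the 50-member size, and the 0.1 initial-error scale are purely illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_model(x0, steps=10, growth=1.05):
    """Stand-in forecast model: simple exponential growth of a
    scalar state over a fixed number of steps."""
    x = x0
    for _ in range(steps):
        x = growth * x
    return x

# Ensemble: perturb the assimilated initial condition by its
# (assumed) analysis error and run each member forward
x_analysis, init_err = 1.0, 0.1
members = [toy_model(x_analysis + init_err * rng.standard_normal())
           for _ in range(50)]

prediction = float(np.mean(members))   # ensemble mean forecast
uncertainty = float(np.std(members))   # spread as an uncertainty estimate
```

This is why the chart lists "Uncertainties" as an output in their own right: the spread is as much a product of the step as the prediction itself.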
Slide 8: IPO Chart Template
Value Stream Step: Exploit Decision Support Tools
- User Requirements
- Data from other sources
- Prediction/scenarios of future system state
- Recommended actions/other actions
1. Ingest model and other-source data
2. Data fusion
3. Execute user scenarios
4. Tailor output to problem at hand
5. Visualize results
6. Education and training
7. User advocacy and customer intimacy
8. Publication and timely dissemination to the public
9. Evaluate decision impacts and value
- Metadata (data history/provenance, and quality)
- Alerts/messages
- Uncertainty analysis
- Uncertainties
- Performance metrics
- Lessons learned
- Notification
- Visualizations
- Model input/output
- Long-term archive
Copy as needed
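The "tailor output to the problem at hand" and "alerts/messages" items above can be illustrated by turning a prediction and its uncertainty into a threshold-exceedance probability and a recommended action. This sketch assumes normally distributed forecast error; the function names, the user threshold, and the 0.5 alert level are illustrative, not part of the template:

```python
import math

def exceedance_probability(prediction, uncertainty, threshold):
    """P(true value > threshold) assuming the forecast error is
    Gaussian with standard deviation `uncertainty`."""
    z = (threshold - prediction) / uncertainty
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def recommend(prediction, uncertainty, threshold, p_alert=0.5):
    """Turn a probabilistic forecast into a recommended action:
    issue an alert when the exceedance probability is high enough."""
    p = exceedance_probability(prediction, uncertainty, threshold)
    return ("ALERT" if p >= p_alert else "no action"), p

# Forecast 3.2 +/- 0.4 against a user-defined threshold of 3.0
action, p = recommend(prediction=3.2, uncertainty=0.4, threshold=3.0)
# prediction above threshold -> exceedance probability > 0.5 -> "ALERT"
```

Carrying the uncertainty from the prediction step through to the decision is what lets the tool evaluate "decision impacts and value" (step 9) rather than just displaying numbers.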
Slide 9: IPO Chart Template
Value Stream Step: Create Services (and Enable the Creation of Services)
- Prediction/scenarios of future system state
- Metadata (data history/provenance, and quality)
- Uncertainties
1. Identify the need
2. Perform market assessment
3. Encourage and participate in prototyping
4. Form/promote formation of partnerships
5. Identify potential new uses and users
6. Create and maintain long-term archive
7. Advertise new capabilities
8. Market the services
- Advertisement of services
- Performance metrics
- Notification
- Government-provided services
- Model input/output
- Recommended actions/other actions
- Stimulate a market environment for value-added services
- Alerts/messages
- Other data and assets
- Market demands
- Lessons learned
- Visualizations
Copy as needed
Slide 10: List of Key Use Cases
Copy as needed
Slide 11: Use Case Template
Use Case: Reanalysis (periodic creation of long-term, self-consistent records)
- Description: As the science and methods of models and data assimilation progress, it becomes necessary to create a new reanalysis, reprocessing all the observations.
- Actors: NOAA, academia, other agencies (NASA, DOE, DOD), private companies.
- Preconditions: Archives of past information; substantial improvements in science, computer power, etc.
- Flow of Events for Main Scenario:
  1. Continue past reanalyses in real time to serve current users.
  2. Assess whether NWP science has advanced enough (5-10 years).
  3. Gather stakeholders (agencies, academia, users) to define requirements.
  4. Create and test a new data assimilation system.
  5. Reprocess all the observations with the new (frozen) reanalysis system; this could take up to 5 years.
  6. Continue processing of new observations.
  7. Repeat.
- Alternate Scenario: Continue using old reanalyses even though they are outdated, because they are still useful.
- Post Conditions: Disseminate widely and freely (this was one of the two things that made the NCEP reanalysis so successful; the second was the combination of reanalysis with real-time CDAS).
- Special Conditions: Ideally we should do reanalysis of the coupled Earth System (atmosphere, ocean, land); regional high-resolution reanalysis.
Slide 12: List of Relevant Standards
Slide 13: Standards | Relevance/Importance
Slide 14: List of Relevant Constraints
Slide 15: Constraints | Relevance/Importance
Slide 16: List of Relevant Issues
Slide 17: Issues | Relevance/Importance
Slide 18: List of Relevant Enablers
Slide 19: Enabler | Relevance/Importance
Slide 20: Domain Collaboration Diagram
- User Community
- Observational community
- Modeling community
- Cyberinfrastructure
Slide 21: Domain Collaboration Diagram
Slide 22: Domain Collaboration Diagram