Title: Prediction of Computational Quality for Aerospace Applications
1. Prediction of Computational Quality for Aerospace Applications
- Michael J. Hemsch, James M. Luckring, Joseph H. Morrison (NASA Langley Research Center)
- Elements of Predictability Workshop, November 13-14, 2003, Johns Hopkins University
2. Outline
- Breakdown of the problem (again), with a slight twist.
- The issue for most of aerospace is that non-computationalists are doing the applications computations.
- What are they doing now? What can we do to help?
3. Breakdown of tasks
Computation:
- Off-line: traceable operational definition of the process; verifying that the coding is correct
- Off-line: characterization of process variation using standard problems (measuring the computational process)
- Off-line: model-to-model and model-to-reality discrimination (systematic error characterization)
- QA checks against the above measurements during computation for the customer
- Solution verification
Experimentation:
- Off-line: calibration of instruments; traceability to standards
- Off-line: random error characterization using standard artifacts (measuring the measurement system)
- Off-line: discrimination testing of the measurement system (systematic error characterization)
- QA checks against the above measurements during customer testing
The deliverable of either column is the process output of interest.
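Solution verification is commonly carried out with systematic grid refinement and Richardson extrapolation. The sketch below is illustrative only (the slides do not prescribe a method); the drag values, refinement ratio, and function names are assumptions, and the formulas presume monotone convergence in the asymptotic range.

```python
import math

def observed_order(f1, f2, f3, r):
    """Observed order of accuracy from three systematically refined grids
    (f1 = finest solution, f3 = coarsest), assuming a constant refinement
    ratio r and monotone convergence."""
    return math.log((f3 - f2) / (f2 - f1)) / math.log(r)

def richardson_extrapolate(f1, f2, p, r):
    """Estimate the grid-converged value from the two finest solutions."""
    return f1 + (f1 - f2) / (r**p - 1)

# Hypothetical drag-coefficient values on coarse/medium/fine grids.
f3, f2, f1 = 0.02920, 0.02870, 0.02850
r = 2.0                                  # grid refinement ratio (assumed)
p = observed_order(f1, f2, f3, r)        # observed order of accuracy
f_exact = richardson_extrapolate(f1, f2, p, r)
print(f"observed order p = {p:.3f}, extrapolated value = {f_exact:.6f}")
```

The difference between `f1` and `f_exact` then serves as a quantitative discretization-error estimate for the QA checks listed above.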
4. The key question for applications
How is the applications person going to convince the decision maker that the computational process is good enough?
5. Our tentative answer
Based on observation of aero engineers trying to use CFD on real-life design problems, it is the quantitative explanatory force of any approach that creates acceptance.
6.
- How can quantitative "explanatory force" be provided?
- Breakdown into two questions:
  - How do I know that I am predicting the right physics at the right place in the inference space?
  - How accurate are my results if I do have the right physics at the right place in the inference space?
7. Airfoil Stall Classification
8. Boundaries Among Stall Types
9.
- The applications person needs a process that can be:
  - Controlled
  - Evaluated
  - Improved
  (i.e., a predictable process)
10. Creating a predictable process
Controllable input (assignable-cause variation): geometry, flight conditions, etc. -> Process -> Predicted coefficients, flow features, etc.
Uncontrolled input from the environment also enters the process: variation that we have to live with, e.g. numerics, parameter uncertainty, model-form uncertainty, users.
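The input/output view above can be mocked up as a small Monte Carlo sketch: the controllable inputs are held fixed by the engineer, one uncontrolled input (here a single hypothetical model-form parameter) is sampled, and the spread of the output is the variation we have to live with. The function, parameter names, and distributions are illustrative assumptions, not part of the slides.

```python
import random
import statistics

def process(alpha_deg, mach, model_param):
    """Stand-in for the computational process: maps controllable inputs
    (angle of attack, Mach number) plus one uncontrolled model parameter
    to a predicted lift coefficient. Purely illustrative."""
    return 0.11 * alpha_deg * (1.0 + 0.2 * mach) + model_param

random.seed(0)
# Controllable inputs (assignable cause): fixed for this prediction.
alpha, mach = 4.0, 0.75
# Uncontrolled input: model-form uncertainty, sampled from an assumed range.
samples = [process(alpha, mach, random.gauss(0.0, 0.01))
           for _ in range(10_000)]

cl_mean = statistics.mean(samples)
cl_sd = statistics.stdev(samples)
print(f"predicted CL = {cl_mean:.4f} +/- {cl_sd:.4f}")
```

Reporting the output as a center plus a spread, rather than a single number, is one concrete way to give the decision maker the quantitative explanatory force discussed earlier.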
11. Critical levels of attainment for a predictable process
- A defined set of steps
- Stable and replicable
- Measurable
- Improvable
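"Stable" and "measurable" suggest treating replicate computations the way statistical process control treats replicate measurements. A minimal sketch, assuming robust limits built from the median and a scaled median absolute deviation (the slides do not specify a statistic, and the drag values below are hypothetical):

```python
import statistics

def scatter_limits(values, k=3.0):
    """Robust center and k-sigma limits for replicate computations of the
    same quantity (e.g. drag counts from different codes/grids). Results
    outside the limits indicate variation that is not routine for the
    process."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    sigma = 1.4826 * mad   # scaling consistent with normal-data std dev
    return med, med - k * sigma, med + k * sigma

# Hypothetical drag predictions (counts) from several codes/grids.
drag = [287.0, 289.5, 286.2, 288.1, 301.0, 287.7]
center, lower, upper = scatter_limits(drag)
outliers = [v for v in drag if not lower <= v <= upper]
print(f"center = {center}, limits = ({lower:.2f}, {upper:.2f})")
print("flagged results:", outliers)
```

A flagged result is a prompt to look for an assignable cause (bad grid, wrong setup) before the result reaches a customer, which is exactly the "controlled, evaluated, improved" loop of slide 9.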
12. What it takes to have an impact ...
- Historically, practitioners have created their designs (and the disciplines they work in) with very little reference to researchers.
- Practitioners who are successfully using aero computations already know what it takes to convince a risk taker.
- If we want to have an impact on practitioners, we will have to build on what they are already doing.
13. What it takes to have an impact ...
- Good questions:
  - Are researchers going to be an integral part of the applications uncertainty quantification process, or are we going to be irrelevant?
  - What specific impact on practitioners do I want to have with a particular project?
  - What process/product improvement am I expecting from that project?
14. What it takes to have an impact ...
- We can greatly improve, systematize, and generalize the process that practitioners are successfully using right now.
- The key watchwords for applications are:
  - practicality, as in mission analysis and design
  - alacrity, as in "I want to use it right now."
  - impact, as in "Will my customer buy in?" and "Am I willing to bet my career (and my life) on my prediction?"
15. Actions
- Establish working groups like the AIAA Drag Prediction Workshop (DPW)
- Select a small number of focus problems
- Use those problems:
  - to demonstrate the prediction uncertainty strategies
  - to find out just how tough this problem really is
- For right now:
  - Run multiple codes, different grid types, multiple models, etc.
  - Work data sets that fully capture the physics of the application problem of interest.
  - Develop process best practices and find ways to control and evaluate them.
  - Develop experiments to determine our ability to predict uncertainty and to predict the domain boundaries where the physics changes.
16. Breakout Questions/Issues
- Defining predictability in the context of the application
- The logical or physical reasons for lack of predictability
- The possibility of isolating the reducible uncertainties with a view to dealing with them (either propagating them or reducing them)
- The role of experimental evidence in understanding and controlling predictability
- The possibility of gathering experimental evidence
- The role that modeling plays in limiting predictability
- Minimum requisite attributes of predictive models
- The role played by temporal and spatial scales, and possible mitigating actions and models