Scientific Treatment of Uncertainty in Environmental Models
- Kenneth H. Reckhow
- Nicholas School of the Environment and Earth Sciences, Duke University
EPA's Science Advisory Board issued general guidance on model review in 1989. The board recommended that a model's predictive capability could be enhanced through (1) obtaining external stakeholder input, (2) documenting the model's explicit and implicit assumptions, (3) performing sensitivity analyses, (4) testing model predictions against laboratory and field data, and (5) conducting peer reviews.
A critical issue associated with the use of
models in environmental decision making is the
analysis and communication of uncertainty.
Options for Assessing Model Prediction Uncertainty
Evaluation of Environmental Simulation Models Used to Support Decision Making
Model Testing Using Prediction-Observation Comparison
When possible, the standard practice is to split the available data into two data sets: use one data set for model calibration (determination of the model's parameters) and use the second data set to test (verify) the model.
The justification for this approach is that the second data set is thought to challenge the model with a new set of conditions, different from those leading to the calibration data.
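The split-data practice can be sketched in a few lines of Python. The linear loading model, the synthetic data, and all numeric values below are hypothetical illustrations, not values from any study discussed here:

```python
import random
import math

random.seed(42)

# Synthetic loading-response data (hypothetical: e.g., nutrient load vs. response).
x = [random.uniform(10, 100) for _ in range(40)]
y = [0.5 * xi + 5 + random.gauss(0, 3) for xi in x]

# Split into a calibration set and a testing (verification) set.
x_cal, y_cal = x[:20], y[:20]
x_ver, y_ver = x[20:], y[20:]

def fit_linear(xs, ys):
    """Calibration step: ordinary least-squares fit of y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(xs, ys)) / \
        sum((xi - mx) ** 2 for xi in xs)
    a = my - b * mx
    return a, b

a, b = fit_linear(x_cal, y_cal)

# Verification step: compare predictions against the held-out observations.
residuals = [yi - (a + b * xi) for xi, yi in zip(x_ver, y_ver)]
rmse = math.sqrt(sum(r * r for r in residuals) / len(residuals))
print(f"calibrated model: y = {a:.2f} + {b:.2f} x, verification RMSE = {rmse:.2f}")
```

The verification RMSE summarizes how well the calibrated model transfers to conditions it was not fitted on.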
If, by chance, the calibration data are
essentially identical to the testing
(verification) data, then the evaluation may be
trivial.
One way to assess the rigor of the split-data-set prediction-observation comparison is to statistically test the difference between the two data sets. This test should focus on the features of the model that are expected to change.
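A minimal sketch of such a test, using Welch's two-sample t statistic on synthetic "calibration-period" and "testing-period" samples (all values are made up for illustration):

```python
import math
import random

random.seed(1)

# Hypothetical calibration-period and testing-period observations.
cal = [random.gauss(20, 5) for _ in range(30)]   # e.g., conditions through 1999
test = [random.gauss(30, 5) for _ in range(30)]  # e.g., conditions in 2000

def welch_t(a, b):
    """Welch's two-sample t statistic for a difference in means."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

t = welch_t(cal, test)
# A large |t| indicates the testing data genuinely differ from the
# calibration data, so the split provides a real challenge to the model.
print(f"Welch t = {t:.2f}; rigorous split: {abs(t) > 2.0}")
```

In practice the comparison should target the specific model features expected to change, not just overall means.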
Error Propagation
Error terms are estimated for all uncertainties in a model and then propagated through the model, often using Monte Carlo simulation. The result is typically a probability distribution on model forecasts, reflecting all uncertainties.
Practical problems in estimating certain important error terms - such as parameter covariances and model/equation error - plus computational complexities for large models, have limited the use of this technique.
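Monte Carlo error propagation can be sketched as follows. The one-parameter loading model and the error variances are illustrative assumptions, not values from any of the models discussed:

```python
import random
import statistics

random.seed(7)

# Toy loading model: response = beta * load + model error.
# Parameter uncertainty and lumped model/equation error are both random.
load = 50.0
draws = []
for _ in range(10000):
    beta = random.gauss(0.5, 0.1)    # uncertain parameter (hypothetical)
    model_err = random.gauss(0, 4)   # lumped model/equation error (hypothetical)
    draws.append(beta * load + model_err)

# The Monte Carlo sample approximates the predictive distribution
# on the model forecast, reflecting all specified uncertainties.
mean = statistics.fmean(draws)
draws.sort()
lo, hi = draws[int(0.05 * len(draws))], draws[int(0.95 * len(draws))]
print(f"forecast mean = {mean:.1f}, 90% interval = ({lo:.1f}, {hi:.1f})")
```

With correlated parameters, the independent draws above would have to be replaced by draws from a joint distribution, which is exactly where the parameter-covariance estimation problem noted above arises.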
Model Verification
For model verification using split data sets, the prediction-observation comparison leads to a hypothesis test, rather than computation of a prediction error term.
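One common form of such a test is a paired t statistic on prediction-observation differences, with the null hypothesis that the model is unbiased. The predictions and observations below are made-up numbers for illustration:

```python
import math

# Hypothetical paired predictions and observations for the verification period.
pred = [22.0, 30.5, 18.2, 41.0, 27.3, 33.8, 25.1, 29.9]
obs  = [24.1, 28.7, 20.0, 44.2, 26.0, 35.1, 27.3, 31.0]

# Paired t statistic on the differences; under the null hypothesis
# the mean prediction-observation difference is zero.
d = [p - o for p, o in zip(pred, obs)]
n = len(d)
mean_d = sum(d) / n
sd_d = math.sqrt(sum((x - mean_d) ** 2 for x in d) / (n - 1))
t = mean_d / (sd_d / math.sqrt(n))
print(f"mean difference = {mean_d:.2f}, paired t = {t:.2f}")
```

Failing to reject the null is weaker evidence than a small estimated prediction error, which is one reason the hypothesis-test framing differs from error propagation.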
Verified models could be identified and
endorsed by the agency. In that case, the risk
associated with application of a particular model
is assessed by the agency and not by the model
user (unless the agency is the user).
Evaluating Models for Decision Support: An Example
Background
Under the Clean Water Act, surface water quality
standard violations require assessment of the
allowable pollutant input (the TMDL) to achieve
compliance with the standard.
In the Neuse Estuary in North Carolina, violation
of the chlorophyll a standard led to the
requirement that a TMDL be developed for nitrogen.
Three simulation models were applied to this task; this provided a valuable opportunity for model evaluation and comparison.
Neuse TMDL Models
- Neuse Estuary Bayesian Ecological Response Model (Neu-BERN): probability network model developed at Duke University
- Neuse Estuary Eutrophication Model (NEEM): 2-D dynamic simulation model (CE-QUAL-W2) developed at UNC-Charlotte
- Water Quality Analysis Simulation Program (WASP): 3-D dynamic simulation model applied by EPA Region IV and Tetra Tech
The Negative Effects of Excessive Nitrogen in an Estuary
Nitrogen stimulates the growth of algae.
Algae die and accumulate on the bottom, where they are consumed by bacteria.
Under calm wind conditions, density stratification occurs.
Oxygen is depleted in the bottom water.
Fish and shellfish may die or become weakened and vulnerable to disease.
p(Fish Health = Poor | N inputs = X)
Example of Risk Assessment
90% Risk of Exceedance
50% Risk of Exceedance
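Given a Monte Carlo predictive distribution, the risk of exceedance is simply the fraction of draws above the criterion. A sketch, in which the predictive distribution is an illustrative assumption (North Carolina's chlorophyll a standard is 40 µg/L):

```python
import random

random.seed(3)

CRITERION = 40.0  # chlorophyll a criterion, ug/L
# Hypothetical predictive distribution of chlorophyll a under a candidate TMDL.
draws = [random.gauss(35, 8) for _ in range(20000)]

# Risk of exceedance = estimated probability the criterion is violated.
risk = sum(d > CRITERION for d in draws) / len(draws)
print(f"risk of exceedance = {risk:.0%}")
```

A decision maker can then compare candidate nitrogen reductions by the exceedance risk each one implies, rather than by a single point forecast.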
Neuse Estuary Eutrophication Model
Model Evaluation Exercise
- Calibrate each model through 1999, then provide chlorophyll predictions for 2000.
- Results:
  - Section-to-section variability captured reasonably well.
  - Within-section variability not captured.
- Journal of Water Resources Planning and Management (2003)
- www2.ncsu.edu/ncsu/CIL/WRRI/kens_page.html
What did we learn from this exercise?
As a result, TMDL prediction uncertainty is high.
What should be done to improve TMDL assessment?
- TMDLs should be adaptive
- Modeling approaches need to be developed that
integrate model forecasts with post-implementation
monitoring (e.g., Bayesian analysis, Kalman
filter, data assimilation).
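The forecast-plus-monitoring integration can be illustrated with a conjugate normal-normal Bayesian update, treating the model forecast as the prior and post-implementation monitoring data as the likelihood. All numbers below are hypothetical:

```python
# Normal-normal conjugate Bayesian update (hypothetical values throughout).
prior_mean, prior_var = 38.0, 36.0       # model forecast of chlorophyll a (ug/L)
obs = [31.0, 34.0, 29.0, 33.0, 35.0]     # post-implementation monitoring data
obs_var = 25.0                           # assumed sampling variance

n = len(obs)
xbar = sum(obs) / n

# Posterior precision is the sum of prior and data precisions; the
# posterior mean is a precision-weighted average of forecast and data.
post_var = 1.0 / (1.0 / prior_var + n / obs_var)
post_mean = post_var * (prior_mean / prior_var + n * xbar / obs_var)
print(f"posterior mean = {post_mean:.1f}, posterior variance = {post_var:.1f}")
```

The posterior pulls the forecast toward the monitoring data and shrinks the variance, which is the mechanism by which an adaptive TMDL can tighten its prediction uncertainty over time.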
Adaptive Implementation: Bayesian Analysis
(Figure: distributions relative to the criterion concentration)
Evaluating Regulatory Environmental Models: Some Considerations
- Model testing data should be distinctly
different from model fitting data.
- Model evaluation data should rigorously test the
features of the model that are important in the
application of interest.
- Models rich in theory and process
characterization may not be the best choices for
regulatory support.
- A national registry of approved/verified models
may be appropriate in some situations, but not in
others, as this subsumes the decision on
acceptable risk.
- A Bayesian approach for combining quantitative
evidence (e.g., error statistics, peer reviews,
previous applications) in support of a model may
provide needed flexibility.