Scientific Treatment of Uncertainty in Environmental Models

1
Scientific Treatment of Uncertainty in
Environmental Models
  • Kenneth H. Reckhow
  • Nicholas School of the Environment and Earth
    Sciences
  • Duke University

2
EPA's Science Advisory Board issued general
guidance on model review in 1989. The board
recommended that a model's predictive capability
could be enhanced through (1) obtaining external
stakeholder input, (2) documenting the model's
explicit and implicit assumptions, (3) performing
sensitivity analyses, (4) testing model
predictions against laboratory and field data,
and (5) conducting peer reviews.
A critical issue associated with the use of
models in environmental decision making is the
analysis and communication of uncertainty.
3
Options for Assessing Model Prediction Uncertainty
4
Evaluation of Environmental Simulation
Models Used to Support Decision Making
Model Testing Using Prediction-Observation
Comparison
When possible, the standard practice is to split
the available data into two data sets: use one
data set for model calibration (determination of
the model's parameters) and use the second data
set to test (verify) the model.
The justification for this approach is that the
second data set is thought to challenge the model
with a new set of conditions, different from
those that produced the calibration data.
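As an illustration, here is a minimal split-sample sketch in Python, assuming a hypothetical linear chlorophyll-versus-nitrogen model fit to synthetic data; none of the values come from the Neuse work.

import numpy as np

rng = np.random.default_rng(1)
tn = rng.uniform(0.5, 2.0, 60)              # total nitrogen (mg/L), synthetic
chl = 20 * tn + 5 + rng.normal(0, 4, 60)    # chlorophyll a (ug/L), synthetic

# Split the available data: one set to calibrate, one to test (verify).
idx = rng.permutation(60)
cal, test = idx[:40], idx[40:]

# Calibrate: estimate the model's parameters from the calibration set only.
a, b = np.polyfit(tn[cal], chl[cal], 1)

# Verify: challenge the model with the held-out conditions.
pred = a * tn[test] + b
rmse = np.sqrt(np.mean((pred - chl[test]) ** 2))
print(f"verification RMSE: {rmse:.2f} ug/L chlorophyll a")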
5
Evaluation of Environmental Simulation
Models Used to Support Decision Making
Model Testing Using Prediction-Observation
Comparison
If, by chance, the calibration data are
essentially identical to the testing
(verification) data, then the evaluation may be
trivial.
One way to assess the rigor of the split-data-set
prediction-observation comparison is to
statistically test the difference between the two
data sets. The test should focus on the features
of the model that are expected to change.
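One concrete version of such a test, as a sketch only: a two-sample Kolmogorov-Smirnov test on a driver variable, with synthetic placeholder data standing in for the two periods.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
tn_cal = rng.uniform(0.5, 1.5, 40)   # calibration-period nitrogen (mg/L)
tn_ver = rng.uniform(1.0, 2.0, 20)   # verification-period nitrogen (mg/L)

stat, p = stats.ks_2samp(tn_cal, tn_ver)
print(f"KS statistic {stat:.2f}, p-value {p:.3f}")
# A small p-value indicates the two periods differ, so the
# prediction-observation comparison genuinely challenges the model.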
6
Evaluation of Environmental Simulation
Models Used to Support Decision Making
Error Propagation
Error terms are estimated for all uncertainties
in a model and then propagated through the model,
often using Monte Carlo simulation. The result is
typically a probability distribution on model
forecasts, reflecting all uncertainties.
Practical problems in estimating certain
important error terms, such as parameter
covariances and model/equation error, plus the
computational burden for large models, have
limited the use of this technique.
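A minimal Monte Carlo propagation sketch, with illustrative (assumed) distributions for two correlated parameters and an additive model/equation error term:

import numpy as np

rng = np.random.default_rng(3)
n = 10_000

# Jointly sample correlated parameters a, b from a multivariate normal.
mean = [20.0, 5.0]
cov = [[4.0, -1.0],
       [-1.0, 2.0]]
a, b = rng.multivariate_normal(mean, cov, n).T

tn = 1.2                          # nitrogen input scenario (mg/L)
eps = rng.normal(0.0, 3.0, n)     # model/equation error term

chl = a * tn + b + eps            # propagate through the model

# The result is a probability distribution on the forecast.
print(f"median {np.median(chl):.1f} ug/L, 90% interval "
      f"({np.percentile(chl, 5):.1f}, {np.percentile(chl, 95):.1f})")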
7
Evaluation of Environmental Simulation
Models Used to Support Decision Making
Model Verification
For model verification using split data sets, the
prediction-observation comparison leads to a
hypothesis test, rather than computation of a
prediction error term.
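For instance, the comparison can be framed as a test that the verification residuals are centered on zero; the observations and predictions below are synthetic placeholders.

import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
observed = rng.normal(25.0, 5.0, 20)             # held-out chlorophyll data
predicted = observed + rng.normal(1.0, 4.0, 20)  # model predictions

# One-sample t-test on the residuals: H0 says the model is unbiased.
t, p = stats.ttest_1samp(predicted - observed, 0.0)
print(f"t = {t:.2f}, p = {p:.3f}")
# Failing to reject H0 is the sense in which the model is "verified",
# rather than reporting a prediction error term.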
Verified models could be identified and
endorsed by the agency. In that case, the risk
associated with application of a particular model
is assessed by the agency and not by the model
user (unless the agency is the user).
8
Evaluating Models for Decision Support: An Example
Background
Under the Clean Water Act, surface water quality
standard violations require assessment of the
allowable pollutant input (the total maximum
daily load, or TMDL) needed to achieve compliance
with the standard.
In the Neuse Estuary in North Carolina, violation
of the chlorophyll a standard led to the
requirement that a TMDL be developed for nitrogen.
Three simulation models were applied to this
task; this provided a valuable opportunity for
model evaluation and comparison.
9
Neuse TMDL Models
  • Neuse Estuary Bayesian Ecological Response Model
    (Neu-BERN)
  • Probability network model developed at
    Duke University
  • Neuse Estuary Eutrophication Model (NEEM)
  • 2-D dynamic simulation model (CE-QUAL-W2)
    developed at UNC-Charlotte
  • Water Analysis Simulation Program (WASP)
  • 3-D dynamic simulation model applied by EPA
    Region IV and Tetra Tech

10
(No Transcript)
11
The Negative Effects of Excessive Nitrogen
in an Estuary
Nitrogen stimulates the growth of algae.
Algae die and accumulate on the bottom, where
they are consumed by bacteria.
Under calm wind conditions, density
stratification occurs.
Oxygen is depleted in the bottom water.
Fish and shellfish may die or become weakened and
vulnerable to disease.
12
(No Transcript)
13
p(Fish Health = Poor | N inputs = X)
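This quantity can be read as the output of a probability network. A toy calculation, marginalizing over one intermediate node with made-up conditional probabilities (these are not Neu-BERN values):

# p(algal bloom | N input scenario X), hypothetical
p_bloom = 0.6

# p(Fish Health = Poor | bloom state), hypothetical
p_poor_given_bloom = 0.5
p_poor_given_no_bloom = 0.1

# Law of total probability over the intermediate node.
p_poor = (p_bloom * p_poor_given_bloom
          + (1 - p_bloom) * p_poor_given_no_bloom)
print(f"p(Fish Health = Poor | N input = X) = {p_poor:.2f}")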
14
(No Transcript)
15
Example of Risk Assessment
90% Risk of Exceedance
16
Example of Risk Assessment
90% Risk of Exceedance
50% Risk of Exceedance
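How such an exceedance risk is read off a forecast distribution, sketched with an assumed lognormal forecast and an illustrative criterion:

import numpy as np

rng = np.random.default_rng(5)
chl_forecast = rng.lognormal(mean=3.7, sigma=0.4, size=10_000)  # ug/L

criterion = 40.0   # chlorophyll a criterion (ug/L), illustrative
risk = np.mean(chl_forecast > criterion)
print(f"risk of exceeding {criterion} ug/L: {risk:.0%}")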
17
Neuse Estuary Eutrophication Model
18
(No Transcript)
19
(No Transcript)
20
Model Evaluation Exercise
  • Calibrate each model through 1999, then provide
    chlorophyll predictions for 2000.
  • Results
  • Section-to-section variability captured
    reasonably well.
  • Within-section variability not captured.
  • Journal of Water Resources Planning and
    Management (2003)
  • www2.ncsu.edu/ncsu/CIL/WRRI/kens_page.html

21
(No Transcript)
22
(No Transcript)
23
What did we learn from this exercise?
As a result, TMDL prediction uncertainty is high.
24
What should be done to improve TMDL assessment?
  • TMDLs should be adaptive
  • Modeling approaches need to be developed that
    integrate model forecasts with post-implementation
    monitoring (e.g., Bayesian analysis, Kalman
    filter, data assimilation), as sketched below.
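A minimal sketch of that integration, using a conjugate normal Bayesian update in which the model forecast is the prior and post-implementation monitoring supplies the likelihood; all numbers are illustrative, not from the Neuse TMDL.

import numpy as np

# Model forecast of mean chlorophyll under the TMDL (the prior).
mu0, sd0 = 35.0, 10.0    # ug/L; wide, since model uncertainty is high

# Post-implementation monitoring data (synthetic).
rng = np.random.default_rng(6)
obs = rng.normal(28.0, 6.0, 12)
ybar = obs.mean()
se = obs.std(ddof=1) / np.sqrt(len(obs))

# Precision-weighted (conjugate normal) posterior.
w0, w1 = 1 / sd0**2, 1 / se**2
mu_post = (w0 * mu0 + w1 * ybar) / (w0 + w1)
sd_post = (w0 + w1) ** -0.5
print(f"posterior mean {mu_post:.1f} ug/L, sd {sd_post:.1f}")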

25
Adaptive Implementation: Bayesian Analysis
(Figure: predictive distributions compared with
the criterion concentration)
26
Evaluating Regulatory Environmental Models: Some
Considerations
  • Model testing data should be distinctly
    different from model fitting data.
  • Model evaluation data should rigorously test the
    features of the model that are important in the
    application of interest.
  • Models rich in theory and process
    characterization may not be the best choices for
    regulatory support.
  • A national registry of approved/verified models
    may be appropriate in some situations, but not in
    others, as this subsumes the decision on
    acceptable risk.
  • A Bayesian approach for combining quantitative
    evidence (e.g., error statistics, peer reviews,
    previous applications) in support of a model may
    provide needed flexibility; a sketch follows.
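One way such evidence could be combined, as a sketch only: treat each line of evidence as a likelihood ratio for "model adequate" versus "not adequate" and update prior odds. All of the numbers are hypothetical.

import math

prior_odds = 1.0    # agnostic prior: p(model adequate) = 0.5

# Hypothetical likelihood ratios, one per line of evidence.
likelihood_ratios = {
    "error statistics": 3.0,    # fit statistics favor adequacy 3:1
    "peer reviews": 2.0,        # favorable reviews favor adequacy 2:1
    "previous applications": 1.5,
}

posterior_odds = prior_odds * math.prod(likelihood_ratios.values())
p = posterior_odds / (1 + posterior_odds)
print(f"posterior probability the model is adequate: {p:.2f}")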