1. Sausages, Laws and Paradigm Shifts
Making the Transition from Point Estimates to Probabilistic Risk Assessments
- Workshop on Uncertainty Analysis and Management
- Johns Hopkins University, August 16-18, 1999
- Timothy M. Barry
- United States Environmental Protection Agency
2. The EPA/NAS Model ... Risk Assessment Paradigm
3. Marketing a Paradigm Shift ...
- Probabilistic risk assessment (via Monte Carlo Analysis) has been promoted on the basis of its benefits:
  - improved decision-making
  - more informative
  - transparent
  - more honest
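To make the contrast concrete, the following is a minimal sketch of how a Monte Carlo analysis replaces a single conservative point estimate with a distribution of possible risks. It is not an EPA procedure; the distributions, parameter values, and the simple intake equation are illustrative assumptions only.

```python
# Minimal Monte Carlo risk assessment sketch (hypothetical inputs throughout).
import numpy as np

rng = np.random.default_rng(seed=42)
n = 100_000  # number of Monte Carlo iterations

# Hypothetical exposure inputs; distribution choices are illustrative only.
concentration = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=n)     # mg/L
intake_rate   = rng.lognormal(mean=np.log(1.5), sigma=0.3, size=n)     # L/day
body_weight   = rng.normal(loc=70.0, scale=10.0, size=n).clip(min=30)  # kg
slope_factor  = 0.01  # (mg/kg-day)^-1, treated as fixed here

# Simple intake equation: dose = C * IR / BW; risk = dose * slope factor
dose = concentration * intake_rate / body_weight
risk = dose * slope_factor

# A conservative point estimate compounds upper-bound inputs for every parameter.
point_risk = (np.percentile(concentration, 95) *
              np.percentile(intake_rate, 95) /
              np.percentile(body_weight, 5)) * slope_factor

print(f"Point estimate (compounded upper bounds): {point_risk:.2e}")
print(f"Monte Carlo median risk:                  {np.median(risk):.2e}")
print(f"Monte Carlo 95th percentile risk:         {np.percentile(risk, 95):.2e}")
```

The probabilistic result shows where the conservative point estimate falls within the full range of plausible risks, which is the kind of added information the claimed benefits refer to.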
4. The best PRA is one that achieves three goals
- It should reveal, according to any criterion the decision-maker selects, which of the available decisions performs best.
- It should point to priorities for obtaining new information (if time and resources permit), so that the decision-maker can reduce uncertainty and increase the confidence in his choice.
- It should spur on the search for new decisions which may outperform any of the ones that the initial QUA compared.
5. Challenge to the Uncertainty Analyst
- The most important challenge facing the analyst is to effectively communicate the insights an uncertainty analysis provides to those who need them:
  - an appreciation of the overall degree of uncertainty about the conclusions
  - an understanding of the sources of uncertainty, and of which modeling assumptions are critical to the analysis and which are not
  - an understanding of the extent to which plausible alternative assumptions can affect the conclusions
6. However ...
It is the perception of many that the driving
force behind the push to use Monte Carlo
analysis is the widely-held and mostly unproven
notion that EPA risk assessments are conservative
in the extreme, leading to regulations that are
too severe, cost too much, and provide too
little benefit.
7. What does uncertainty mean to EPA decision-makers?
- Uncertainty about Adverse Effects
  - How confident are we that there are environmental effects?
  - What is the degree of consensus in the scientific community?
- Uncertainty about Exposures
  - Do/will significant exposures really occur?
  - What are the "error bands"?
8. More ...
- Uncertainty about the Strength of the Data
  - Where are the data gaps?
  - How significant are these gaps to the overall estimates of risk?
  - Surprise: is new, potentially significant information on the horizon?
  - Which direction would the risks move if the data gaps were filled?
- How sure are we about the effectiveness of remediation options in reducing exposures and risks?
9. The truth is ...
- "An entirely scientific risk assessment is a mirage. There is no single right way to do it. ... Risk assessment is not and cannot be a wholly scientific undertaking. Risk assessment often turns upon details that are inherently unknowable. In general, probabilistic and holistic risk assessments could lead to improved decision-making. Whether such assessments prove to be more defensible than the status quo is harder to say."
- Edmund Crouch et al., Report to the Commission on Risk Assessment and Risk Management (October 1995)
10. So far, our track record could be better ...
- Leads to better decisions? How would we know? What criteria would we use? What are the regulatory decision-makers' criteria?
- Scientific credibility, or just more confusing scientific debate?
11. Institutional Barriers
- Managers are having a hard time seeing the real value added
- Loss of intuitive connectivity
- Analysis paralysis
- Budget and staffing issues
- Undercuts legal defensibility
- Creates an imposing slate of choices that both risk assessors and risk managers are unaccustomed to making
12. More Barriers
- Can't tell a "good one" from a "bad one"
  - When experts don't agree?
  - Used as a scientific way to challenge EPA?
  - Overly confident about our uncertainty?
  - If the regulated community thinks it is good, then it must be bad?
  - Too easily manipulated?
  - Difficult to detect and judge the effects of manipulations
- When should it be used? When shouldn't it be used?
13. Even More Barriers
- Poorly framed assessment questions leading to poorly focused analyses
- When is it appropriate?
- Absence of technical guidance and good examples
- Training, staffing, and resources
- Difficult to document and review analyses as model complexity increases
14. More Technical Barriers
- Marginal data and models
- Questions about the use of default distributions
- Too much judgment
- Poor representation in the distribution tails and of rare events
- Separating variability and uncertainty (see the sketch below)
- Site-specific applications
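On the variability-and-uncertainty barrier, one commonly discussed technique is a two-dimensional ("nested") Monte Carlo that keeps uncertainty about parameters separate from person-to-person variability. The sketch below is a hypothetical illustration of that idea; the priors, distributions, and parameter values are made up for the example and do not represent any EPA guidance.

```python
# Two-dimensional (nested) Monte Carlo sketch: outer loop samples uncertain
# distribution parameters, inner loop samples variability across individuals.
# All numerical values are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(seed=7)
n_uncertainty = 200    # outer loop: alternative plausible parameter sets
n_variability = 5_000  # inner loop: individuals in the exposed population

p95_by_param_set = []
for _ in range(n_uncertainty):
    # Outer loop: sample uncertain parameters of the concentration distribution
    mean_log_conc = rng.normal(loc=np.log(2.0), scale=0.2)
    sigma_log_conc = rng.uniform(0.3, 0.7)

    # Inner loop: sample variability across individuals given those parameters
    conc = rng.lognormal(mean=mean_log_conc, sigma=sigma_log_conc, size=n_variability)
    dose = conc * 1.5 / 70.0  # fixed intake rate and body weight for brevity
    p95_by_param_set.append(np.percentile(dose * 0.01, 95))

p95 = np.array(p95_by_param_set)
# The spread of the population 95th percentile across parameter sets shows how
# much of the tail estimate is driven by uncertainty rather than variability.
print(f"95th-percentile risk, central estimate: {np.median(p95):.2e}")
print(f"95th-percentile risk, 5th-95th range:   {np.percentile(p95, 5):.2e} "
      f"to {np.percentile(p95, 95):.2e}")
```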
15. A Bureaucratic Response to a Changing Paradigm
- Striking but Predictable Features of Our Response
  - A measured pace
  - A need to control
  - Progress (as well as lack of progress) depends on a few motivated individuals
- Dimensions of Our Response
  - Workshops
  - Committees
  - Guidance documents
16. Adam Finkel, Risk Analysis, Vol. 14, No. 5, pp. 751-761
...will it take another 10 years or more to pass over the next major hurdle in the evolution of risk management methodology and practice - namely, the routine reliance on quantitative uncertainty analysis (QUA) as the lode-star of decision-making rather than as a nicety of risk characterization or as a risk analysis appendage useful only in hindsight? However long this advance takes, part of the blame for the delay will rest on the shoulders of practitioners of QUA, who have to date concentrated on getting scientific and regulatory decision-makers to acknowledge the magnitude of the uncertainties facing them and to understand how QUAs are conducted. ... In this we have risked making ourselves akin to mousetrap salesmen who beleaguer the consumer with engineering details before he even understands that, if the gadget works, the result will be a house free of mice. ... but the fact is that we haven't stressed nearly enough that it is useful, first and foremost. As a result, our potential consumers ... have understandable trouble envisioning the precarious position they are in without our better mousetrap, and worse yet, they tend to fixate on the downsides - the perceived cost and danger involved in buying the product.