Title: Patricia M. Dehmer
1. Strategic Planning and Performance Assessment in DOE's Office of Science
Patricia M. Dehmer, Associate Director of Science for Basic Energy Sciences
27 February 2002
2. "I think you should be more explicit here in step two."
3. SC Portfolio Components
FY 2001 Budget: $3,140.9M
- Program Direction: $139.9M
- Safeguards and Security: $34.4M
- Accelerator Improvement Projects: $28.4M
- Construction: $315.5M
- Research (Universities): $709.2M
- General Plant Projects / General Purpose Equipment: $50.7M
- Capital Equipment: $224.8M
- Research (Laboratories): $785.8M
- Major Scientific User Facilities: $852.2M
(Includes the funding for not-for-profits, other agencies, and private institutions.)
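As a quick arithmetic check, the nine portfolio components listed above sum exactly to the stated FY 2001 total; a few lines of Python make the reconciliation explicit (amounts in millions of dollars, taken directly from the slide):

```python
# SC FY 2001 portfolio components (in $M), as listed on the slide.
components = {
    "Program Direction": 139.9,
    "Safeguards and Security": 34.4,
    "Accelerator Improvement Projects": 28.4,
    "Construction": 315.5,
    "Research (Universities)": 709.2,
    "GPP/GPE": 50.7,
    "Capital Equipment": 224.8,
    "Research (Laboratories)": 785.8,
    "Major Scientific User Facilities": 852.2,
}

# The components should reconcile to the stated FY 2001 budget of $3,140.9M.
total = round(sum(components.values()), 1)
print(f"Total: ${total}M")  # Total: $3140.9M
```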
4. SC Research Areas
- Chemical Sciences
- Analytical Chemistry
- Atomic, Molecular, and Optical Sciences
- Chemical Kinetics
- Chemical Physics
- Catalysis
- Combustion Dynamics
- Electrochemistry
- Heavy Element Chemistry
- Interfacial Chemistry
- Organometallic Chemistry
- Photochemistry
- Photosynthetic Mechanisms
- Radiation Chemistry
- Separations Science
- Solar Energy Conversion
- Theory, Modeling, and Computer Simulation
- Thermophysical Properties
- Materials Sciences and Engineering
- Catalysis
- Ceramics
- Condensed Matter Physics
- Corrosion
- Electronic Properties of Materials
- Experimental Techniques and Instrument Development
- Fluid Dynamics and Heat Flow
- Intermetallic Alloys
- Magnetism and Magnetic Materials
- Materials Physics and Chemistry
- Mechanical, Physical, and Structural Properties
- Metallic Glasses
- Metallurgy, Metal Forming, Welding, and Joining
- Nano- and Microsystems Engineering
- Neutron and Photon Scattering
- Nondestructive Evaluation
- Photovoltaics
- Polymer Science
- Life Sciences
- Human Genome
- Structural Biology
- Microbial Genome
- Low Dose Radiation Research
- Functional Genomics
- Human Subjects in Research
- Structural Biology Facilities
- Genome Instrumentation
- Computational Structural Biology
- Medical Sciences
- Molecular Radiopharmaceutical Development
- Boron Neutron Capture Therapy
- Molecular Nuclear Medical Imaging
- Imaging Gene Expression
- Biomedical Engineering
- Environmental Sciences

Program acronyms:
- BES: Basic Energy Sciences
- HENP: High Energy and Nuclear Physics
- FES: Fusion Energy Sciences
- BER: Biological and Environmental Research
- ASCR: Advanced Scientific Computing Research
5. SC Major User Facilities (And the smaller ones, too!)

Basic Energy Sciences
- LIGHT SOURCES: Advanced Light Source (ALS), Advanced Photon Source (APS), National Synchrotron Light Source (NSLS), Stanford Synchrotron Radiation Laboratory (SSRL)
- NEUTRON SOURCES: High-Flux Isotope Reactor (HFIR), Intense Pulsed Neutron Source (IPNS), Los Alamos Neutron Science Center (LANSCE), Spallation Neutron Source (SNS)
- MICROCHARACTERIZATION: Center for Microanalysis of Materials (CMM), Electron Microscopy Center (EMC), National Center for Electron Microscopy (NCEM), Shared Research Equipment Program (SHaRE)
- SPECIALIZED SINGLE-PURPOSE CENTERS: Combustion Research Facility (CRF), Materials Preparation Center, Surface Modification and Characterization Center, James R. Macdonald Laboratory, Pulse Radiolysis Facility

Biological and Environmental Research
- William R. Wiley Environmental Molecular Sciences Laboratory, Joint Genome Institute (JGI), Laboratory of Comparative and Functional Genomics, Atmospheric Radiation Measurement (ARM) Facilities, AmeriFlux, Free-Air Carbon Dioxide Enrichment (FACE) Facilities, Natural and Accelerated Bioremediation Research Field Research Center

High Energy Physics
- Fermi National Accelerator Laboratory (Fermilab): Tevatron Antiproton-Proton Collider; 800 GeV Fixed Target Program
- Stanford Linear Accelerator Center (SLAC): Linear Collider (SLC); 91 GeV and 50 GeV Fixed Target Program; B-factory

Nuclear Physics
- LOW ENERGY HEAVY-ION BEAM FACILITIES: Argonne Tandem Linear Accelerator System (ATLAS), 88-Inch Cyclotron, Holifield Radioactive Ion Beam Facility
- NON-ACCELERATOR NUCLEAR PHYSICS FACILITIES: Sudbury Neutrino Observatory (SNO)
- ELECTRON BEAM ACCELERATORS: Thomas Jefferson National Accelerator Facility (TJNAF), Bates Linear Accelerator Facility
- COLLIDING BEAM HEAVY ION ACCELERATORS: Alternating Gradient Synchrotron (AGS), Relativistic Heavy Ion Collider (RHIC)

Fusion Energy Sciences
- Princeton Plasma Physics Laboratory (PPPL), Alcator C-Mod, DIII-D

Advanced Scientific Computing Research
- National Energy Research Scientific Computing Center (NERSC)
6. Planning, Assessment, and More
What must be in place to plan, manage, assess, and perform these activities?
7. High Energy Physics User Facilities
Tevatron (with D-Zero and CDF detectors) @ FNAL
B-factory (with BaBar detector) @ SLAC
Alternating Gradient Synchrotron @ BNL
8. Nuclear Physics User Facilities
Argonne Tandem Linac Accelerator System @ ANL
88-Inch Cyclotron @ LBNL
Relativistic Heavy Ion Collider and AGS @ BNL
Holifield Radioactive Ion Beam Facility @ ORNL
William H. Bates Linear Accelerator @ MIT
Continuous Electron Beam Accelerator Facility @ TJNAF
9. BES X-ray and Neutron Scattering Facilities
Advanced Photon Source
Intense Pulsed Neutron Source
Advanced Light Source
National Synchrotron Light Source
Spallation Neutron Source
High-Flux Isotope Reactor
Manuel Lujan Jr. Neutron Scattering Center
10. BER User Facilities
Structural Biology User Facilities
(synchrotron radiation) ANL, BNL, LBNL, SSRL
Structural Biology User Facilities
(neutron radiation) ORNL, LANL
Field Research Center for Natural and Accelerated
Bioremediation Research
Throughfall Displacement Experiment
Laboratory for Comparative and Functional
Genomics (under construction)
Atmospheric Radiation Measurement (ARM) Site
(plus Alaska and Tropical Western Pacific)
11. FES User Facilities
Alcator C-Mod at Massachusetts Institute of
Technology
DIII-D Tokamak at General Atomics
National Spherical Torus Experiment at Princeton
Plasma Physics Laboratory
12. ASCR User Facilities
NERSC: National Energy Research Scientific Computing Center
13. Long-Range Planning
- Six federally chartered Advisory Committees (ASCAC, BESAC, BERAC, FESAC, HEPAP, and NSAC) provide substantive long-range planning guidance to the six major program areas of the Office of Science.
- NRC, COSEPUP, the Washington Advisory Group, JASON, SEAB, focused workshops, etc. are also used for long-range planning. In addition, Interagency Working Groups provide both the Administration perspective and federal coordination.

Program  FY01 Funding
ASCR     $161.3M
BER      $514.1M
BES      $973.8M
FES      $241.9M
HEP      $695.9M
NP       $351.8M
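For context, the program funding lines above can be converted to budget shares with a short Python sketch. (Note: the six program lines total $2,938.8M, less than the full $3,140.9M FY 2001 budget, because crosscutting lines such as program direction and safeguards are not included here.)

```python
# FY01 funding by SC program (in $M), from the table above.
funding = {"ASCR": 161.3, "BER": 514.1, "BES": 973.8,
           "FES": 241.9, "HEP": 695.9, "NP": 351.8}

# Total of the six program lines only (not the full SC budget).
total = sum(funding.values())

# Print each program's share of the combined program funding, largest first.
for prog, amount in sorted(funding.items(), key=lambda kv: -kv[1]):
    print(f"{prog}: ${amount}M ({amount / total:.1%})")
```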
14. Long-Range Planning: Federally Chartered Advisory Committees Impact Entire Fields
- High Energy Physics Advisory Panel
  - Vision for the Future of High-Energy Physics (May 1994; Sid Drell, Chair)
  - Accelerator-Based Neutrino Oscillation Experiments (September 1995; Frank Sciulli, Chair)
  - Planning for the Future of High-Energy Physics (February 1998; Fred Gilman, Chair)
  - White Paper on Planning for U.S. High-Energy Physics (October 2000; Fred Gilman, Chair)
  - Long-Range Planning for U.S. High-Energy Physics (January 2002; Jonathan Bagger and Barry Barish, Co-chairs)
- Nuclear Science Advisory Committee
  - NSAC Long Range Plan Working Group (February 1996; Ernest Moniz, Chair)
  - DOE Medium Energy Program Review (September 1998; James Symons, Chair)
  - NSAC Long Range Plan Working Group (Report due 2002; James Symons, Chair)
  - DOE Low Energy Program Review (Report due 2002; Brad Filippone, Chair)
- Fusion Energy Sciences Advisory Committee
  - A Restructured Fusion Energy Sciences Program, FESAC (1996)
  - Report on the Nature and Level of U.S. Participation in Possible ITER Activities, FESAC (1998)
  - Realizing the Promise of Fusion Energy, Final Report of Task Force, SEAB (1999)
  - Fusion Science Assessment Committee, Interim Assessment, NRC (1999)
  - Priorities and Balance within the Fusion Energy Sciences Program, FESAC (1999)
15. Long-Range Planning: Federally Chartered Advisory Committees Impact Major Subfields
- Basic Energy Sciences Advisory Committee
  - Neutron Source Upgrades and the Specifications for the SNS (1996)
  - Research Reactor Upgrades (Robert Birgeneau, Chair)
  - Spallation Neutron Source Upgrades (Gabriel Aeppli, Chair)
  - Technical Specifications for the Next Generation Spallation Source (Thomas Russell, Chair)
  - Novel, Coherent Light Sources (January 1999; Steve Leone, Chair)
  - Roadmap for Complex and Collective Systems (August 1999)
  - Neutron Scattering (February 2000; Martin Blume)
  - Biomolecular Materials (February 2002; Mark Alper and Sam Stupp, Co-chairs)
  - Challenges in Catalysis (Report due 2002; Michael White, Chair)
  - Basic Research Needs for Energy Security (Report due 2002)
- Biological and Environmental Research Advisory Committee
  - Medical Sciences Instrumentation Research Program (11/95)
  - Nuclear Medicine Program (11/95)
  - Health Effects Research Program (11/95)
  - Global Change Research Program (2/96, 10/96, 10/98, 3/00, 3/01)
  - Human Genome Program: Joint Genome Institute (8/96); Priorities (3/98); Genomes to Life (6/00)
  - Structural Biology Research Program: macromolecular crystallographic use of synchrotrons, for the Interagency Structural Biology Working Group (7/98)
16. Assessment of Research: Review of Intramural and Extramural Research, Including Facilities
- 10 CFR 605 sets forth the procedures applicable to the award and administration of grants and cooperative agreements. Evaluation criteria: (1) scientific and/or technical merit or the educational benefits of the project; (2) appropriateness of the proposed method or approach; (3) competency of the applicant's personnel and adequacy of proposed resources; (4) reasonableness and appropriateness of the proposed budget; and (5) other appropriate factors, established and set forth by SC in a notice of availability or in a specific solicitation.
- At least 3 independent reviewers are used for initiation of an award and for renewals every 3-4 years. Mail review, site visits, and panel reviews are all acceptable.
- Reviewers include experts from universities, the private sector, government laboratories, other government agencies, other offices in DOE, and the international community, and include both technical experts (peers) and users of the results.
- All new proposals from both labs and universities are handled in this way.
- Retrospective reviews of labs, performed every 3-4 years, use the same criteria.
17. Assessment of Scientific User Facilities: Examples of the Involvement of Federally Chartered Advisory Committees
- Basic Energy Sciences Advisory Committee
  - DOE Synchrotron Radiation Sources and Science (November 1997; Robert Birgeneau, Chair, and Z.-X. Shen, Vice Chair)
  - Review of the High Flux Isotope Reactor Upgrade and User Program (October 1998; Jack Crow, Chair)
  - Review of the Advanced Light Source (February 2000; Yves Petroff, Chair)
  - Review of the Electron Beam Microcharacterization Centers (February 2000; John Stringer, Chair)
  - Review of IPNS/LANSCE (December 2000; Ward Plummer, Chair)
  - Committee of Visitors, Chemical Sciences Research Activities (February 2002; Carl Lineberger, Chair)
- Nuclear Science Advisory Committee
  - Steady State Operating Costs for CEBAF (July 1992; Derek Lowenstein, Chair)
  - RHIC Experimental Equipment Review (April 1995; Konrad Gelbke, Chair)
  - RHIC Operating Budget Review (August 1996; Richard Orr, Chair)
  - ISOL Taskforce (November 1999; Hermann Grunder, Chair)
  - RIA Costing Review (August 2001; Michael Harrison, Chair)
- Advanced Scientific Computing Advisory Committee
  - Review of Facilities (January 2002)
18. Synchrotron Light Sources: From the Province of Specialists in the 1980s to a Widely Used Tool in the 21st Century

The number of researchers using the synchrotron radiation light sources is expected to reach 11,000 annually when beamlines are fully instrumented.

Who funds the light sources? The Basic Energy Sciences program provides the complete support for the operations of these facilities. Furthermore, BES continues as the dominant supporter of research in the physical sciences, providing as much as 85% of all federal funds for beamlines, instruments, and PI support. Many other agencies, industries, and private sponsors provide support for instrumentation and research in specialized areas such as protein crystallography.

Facility operations and user statistics are collected annually. (Example: data for the synchrotron radiation light sources.)
19. Assessment of Construction Projects: Reviews of Technical Scope, Cost, Schedule Baselines, and Project Management
- SC's Construction Management Support Division conducts independent reviews of technical scope, cost, and schedule baselines and project management of SC construction projects and large experimental equipment. These reviews are known as "Lehman Reviews" after the Division Director, Dan Lehman.
- Lehman Reviews are widely known in DOE, other agencies, and abroad. Dan Lehman has briefed OMB and other agencies on the process, which is now being adopted elsewhere.
- A primary responsibility is conducting reviews of major projects. These are typically held twice yearly and may include 30-40 independent technical experts, who are divided into 6-8 subpanels to investigate all aspects of the project.
- Reviews can (and do!) result in substantive modifications to the project, work stoppage, and senior management changes.
20. The Spallation Neutron Source
21. Example of Review of Program Management: Committee of Visitors

Charge to the BES Committee of Visitors for chemistry programs:
1. For both the DOE laboratory projects and the university projects, assess the efficacy and quality of the processes used to (a) solicit, review, recommend, and document proposal actions and (b) monitor active projects and programs.
2. Within the boundaries defined by DOE missions and available funding, comment on how the award process has affected (a) the breadth and depth of portfolio elements, and (b) the national and international standing of the portfolio elements.
3. Comment on future directions proposed by the Division and BES management and on opportunities that might not have been presented.
4. Comment on how the process for these reviews might be improved.
22. Summary of Assessment Methods

FY 2001 Budget: $3,140.9M (Program Direction; Safeguards and Security; AIP; GPP/GPE; Construction; Capital Equipment; Research (Universities); Research (Laboratories); Major Scientific User Facilities)

- Research (Universities): 10 CFR 605; Advisory Committee reviews
- Research (Laboratories) and Major Scientific User Facilities: 10 CFR 605, adapted to labs; Advisory Committee reviews; annual reporting of operating and user statistics
- Laboratory M&O contractors undergo annual appraisal of overall performance, based on the results of all of the laboratory reviews.
23. SC's Panel on Performance Measurement and GPRA
- In November 2001, the SC Director charged BESAC with evaluating:
  - SC's current methods for performance measurement,
  - the appropriateness and comprehensiveness of the methods,
  - the effects on science programs, and
  - SC's integration of performance measures with the budget process, as required by the Government Performance and Results Act.
- The Panel comprised one member from each SC Advisory Committee and a few external participants with expertise in performance measurement.
- A report on findings and recommendations was made to the BESAC meeting on February 25, 2002.
24. Panel Overview

The research program of SC is an outstanding program, which has been remarkably successful in advancing basic research in the U.S., developing world-leading research in a number of important areas, and building both an important research infrastructure and a remarkable set of major user facilities. The processes being developed in the GPRA management plans should help make these contributions better understood by the stakeholders and assist SC in managing the existing program and developing the case for further advancements. All the parties involved in this exercise are in alignment with this view and are trying to develop procedures that will help improve this valuable program and avoid introducing processes that would harm it. Our discussion points are not intended to criticize any of the contributors to this exercise, but to help point out directions that seem to us to need attention. In particular, our concern is with the development and maintenance of a world-leading program in basic research within a mission-oriented agency.
25. The Panel recommends that
- SC complete its new Strategic Plan as soon as possible;
- SC's performance assessment methods continue to be followed;
- SC's performance measurement criteria be aligned with those developed by COSEPUP and with its ongoing studies on investment criteria for basic research, to allow a common basis for the different Federal agencies that support basic research programs;
- the discussions between SC and OMB as to appropriate criteria for the assessment of the progress of basic science programs be continued, to allow the development of appropriate metrics;
- criteria to assess the world leadership of SC's research be developed; and
- work-force issues, including the development of succession plans for the research staff and the education and training of a technically sophisticated personnel reservoir for the future of the nation, be incorporated into the GPRA goals of SC.
26. Strategic Objectives (SOs) and Program Strategic Performance Goals (PSPGs)

Those Panel members from the SC Advisory Committees considered that the set of these SOs and PSPGs for the parts of the programs with which they are familiar distorted the aims and accomplishments of SC research programs. With PSPGs that are only representative and not at all comprehensive, SC's programs are portrayed as significantly less than they truly are. The Panel was concerned that this could even be detrimental to programs, where their mis-portrayal could lead to unfortunate misunderstandings.

The budget submission then fails as an effective communication tool, which is one of its most important roles. The full budget submission is a much larger document, containing much more detail, of course; but the Panel believes that the opening Executive Summary should be consistent with the GPRA wording that an agency "may aggregate, disaggregate, or consolidate program activities, except that any aggregation or consolidation may not omit or minimize the significance of any program activity constituting a major function or operation for the agency."
27. Qualitative vs. Quantitative Measures

There is no doubt that the sort of comparative assessments that have to be made in a budgeting process are easier if the annual results of the programs can be expressed in objective quantitative terms, but it is clear from the description of the peer review process above that these assessments are generally qualitative. The Subpanel believes that much basic research is better assessed in qualitative terms. While this offers challenges to the concept of being "measurable," it should not lead to the imposition of quantitative goals. To do so would have significant negative effects on basic research and would certainly not be consistent with the principle that application of GPRA should "do no harm," a principle that is agreed to by all the participants in this exercise. SC should review this issue in its ongoing discussions with OMB.
28. Experience in Other Related Federal Agencies

Other Federal agencies also support basic research, notably NSF, NIH, NASA, and DOD. All of these agencies are different, and the Panel recognizes that this will lead to differences in the ways in which OMB will wish to see performance assessed. However, there will be some overlap in the character of specific basic research programs, and the Panel believes that it would be worthwhile for this aspect to be reviewed in SC's ongoing discussions with OMB, in relation to the development of appropriate goals and metrics.
29. What is Assessed (?) and Best Practices ( )
30. What is Assessed (?) and Best Practices ( )
31. High Energy Physics

Strategic Objectives:

SC1: Answer two key questions about the fundamental nature of matter and energy. Determine whether the Standard Model accurately predicts the mechanism that breaks the symmetry between natural forces and generates mass for all fundamental particles by 2010, or whether an alternate theory is required; and, on the same timescale, determine whether the absence of antimatter in the universe can be explained by known physics phenomena.

SC7: Provide major advanced scientific user facilities where scientific excellence is validated by external review; average operational downtime does not exceed 10% of schedule; construction and upgrades are within 10% of schedule and budget; and facility technology research and development programs meet their goals.

Progress toward accomplishing these Strategic Objectives will be measured by Program Strategic Performance Goals, Indicators, and Annual Targets, as follows:

Program Strategic Performance Goals:

SC1-1: Exploit U.S. leadership at the energy frontier by conducting an experimental research program that will establish the foundations for a new understanding of the physical universe. (Research and Technology subprogram and HEP Facilities subprogram)
Performance Indicator: Amount of data delivered and analyzed; number of significant scientific discoveries.
Performance Standards: As discussed in Corporate Context/Executive Summary.

Outlooks: Strategic Objectives, 5-10 years; Program Strategic Performance Goals, 3-5 years; Annual Performance Results and Targets, annual milestones.

FY 2001 Results:
- Completed first phase of upgrades to enable the Tevatron at Fermilab to run with much higher luminosity. Began commissioning of phase-one accelerator upgrades.
- Completed and commissioned upgrades of the CDF and D-Zero detectors at the Tevatron facility at Fermilab.

FY 2002 Targets:
- Deliver integrated luminosity as planned (80 pb⁻¹) to CDF and D-Zero at the Tevatron. Begin implementation of the second phase of accelerator upgrades; install four performance improvements to existing systems. (SC1-1)
- Collect data and begin analysis. (SC1-1)

FY 2003 Targets:
- Deliver integrated luminosity as planned (250 pb⁻¹) to CDF and D-Zero at the Tevatron. Complete and install two new accelerator systems. Design new device to improve yield in the antiproton target. (SC1-1)
- Take data with high efficiency (record over 60% of available data) and continue analysis. (SC1-1)

The challenge now is to assemble the details of GPRA, COSEPUP recommendations, and the Administration's investment criteria into the annual budget submissions. These budget documents transform GPRA, COSEPUP, and investment criteria from theory to practice.
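The SC7 availability criterion (operational downtime not exceeding 10% of schedule) reduces to a simple numerical check. The sketch below illustrates that check; the hour figures are purely hypothetical, invented for the example:

```python
# SC7-style facility availability check: operational downtime must not
# exceed 10% of scheduled operating time. Hour figures below are hypothetical.
def meets_sc7_availability(scheduled_hours: float, delivered_hours: float) -> bool:
    """True if downtime is at most 10% of the scheduled hours."""
    downtime = scheduled_hours - delivered_hours
    return downtime <= 0.10 * scheduled_hours

print(meets_sc7_availability(5000, 4600))  # 8% downtime -> True
print(meets_sc7_availability(5000, 4400))  # 12% downtime -> False
```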
32. "Not everything that can be counted counts, and not everything that counts can be counted."