Title: Seaborg Retrospective
1Seaborg: A Reliable Resource for Extraordinary Science
Richard Gerber, NERSC User Services, RAGerber_at_lbl.gov
ScicomP 14, May 21, 2008, Poughkeepsie, N.Y.
2Third Pillar of Science
"It is often said that science is based on two pillars, namely experiment and theory. In fact, high-end computation, especially through simulation, is the third pillar of science. It gives us the ability to simulate things which cannot be done either experimentally or for which no theory exists -- simulations of complex systems which are simply too hard or so far are not amenable to analytic approaches. So you need to think of high-end computations as a third pillar for discovery, and it gives the United States an advantage in industry, in science, and an instrument of discovery that I believe cannot be achieved elsewhere."
Dr. Raymond L. Orbach, Under Secretary for Science, U.S. Department of Energy, at the 2007 INCITE Awards Announcement, Council on Competitiveness, Washington, DC, January 8, 2007
3Working Together
- High Performance Computing is a complex activity and requires contributions from many parties: IBM, HPC centers, scientists (domain and CS), DOE programs and funding, taxpayers.
- All are integral players; the entire endeavor collapses without any one of them.
- Any and all successes are achievements for, and by, all involved.
5NERSC's Contribution
- NERSC tries to maximize the return on society's investment in High Performance Computing. Among the ways we do this:
- Provide maximal computing, storage, network, and services for available dollars
- Provide appropriate and useful resources
- Provide highly available, reliable resources
- Make systems easy to use productively at high performance
6Seaborg 2001-2008
From 2001 to 2008, NERSC's IBM SP Seaborg provided a unique resource: a pillar of open scientific computing. By any measure it was a spectacular success for science, DOE, NERSC, and IBM.
7Seaborg's Successes
- Seaborg's success stories are far too many to enumerate.
- What follows is an arbitrary sampling of scientific advances enabled by Seaborg.
8Glenn Seaborg
- Research scientist, discoverer of countless atomic isotopes and 10 elements, including plutonium and the element that now bears his name, seaborgium.
- Section head in the top-secret Manhattan Project.
- Chancellor of the University of California at Berkeley.
- Chairman of the U.S. Atomic Energy Commission under presidents Kennedy, Johnson, and Nixon.
- Co-founder and Chairman of the Lawrence Hall of Science.
- Associate Director-at-Large of Lawrence Berkeley National Laboratory, the nation's first national research facility.
- Member of President Reagan's National Commission on Excellence in Education.
- Awarded the Nobel Prize in Chemistry in 1951.
9Seaborg Factoids
- Installed in 2001:
- 3,008 compute processors
- 188 compute nodes
- 4.5 TB main memory
- #2 on the June 2001 Top500 list
- Expanded in 2003:
- 6,144 compute processors
- 384 compute nodes
- 7.8 TB main memory
- 44 TB disk space
- In its prime, Seaborg was the largest unclassified system in the world.
10Usage and Users
- From 2001 to 2008, Seaborg provided 271 million processor-hours to science
- Equivalent to 31,000 years of single-processor computation
- 3,000 unique users
- Users from all fields of science
- Users from DOE labs, universities, industry, and other federal agencies
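The single-processor equivalence claimed above is simple arithmetic; a quick sketch, using only the figures from the slide:

```python
# 271 million processor-hours expressed as years of single-processor computing.
HOURS_PER_YEAR = 24 * 365  # 8,760 hours in a non-leap year

processor_hours = 271_000_000
single_cpu_years = processor_hours / HOURS_PER_YEAR
print(f"{single_cpu_years:,.0f} years")  # about 31,000 years
```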
11User Survey: Seaborg Uptime
Scientists recognized and appreciated that Seaborg was a reliable resource, as shown in the ratings for Seaborg uptime on the annual user surveys.
[Chart: Seaborg uptime ratings from the 2002-2006 annual user surveys]
12Scientists Used Seaborg to Learn How to Scale
Codes
13Seaborg Proved You Can Get High Performance at
Scale
- Electromagnetic Wave-Plasma Interactions (Donald Batchelor, ORNL): 1.026 Gflop/s per processor (68% of peak), 2 Tflop/s on 1,936 processors, 1.86 TB of memory.
- Cosmic Microwave Background Data Analysis (Julian Borrill, LBNL and UC Berkeley): 750 Mflop/s per processor (50% of peak), 3.02 Tflop/s on 4,096 processors, 3.1 TB of memory.
- Terascale Simulations of Supernovae (Anthony Mezzacappa, ORNL): 654 Mflop/s per processor (43% of peak), 1.34 Tflop/s on 2,048 processors, 783 GB of memory.
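These percent-of-peak figures follow from Seaborg's POWER3-II processors, whose theoretical peak was 1.5 Gflop/s each (375 MHz, 4 floating-point operations per cycle). A quick sketch of the arithmetic with the slide's numbers hard-coded; small rounding differences from the slide's quoted figures are expected:

```python
PEAK_GFLOPS = 1.5  # POWER3-II peak: 375 MHz x 4 flops per cycle

runs = [
    # (project, Gflop/s per processor, processor count)
    ("Wave-Plasma Interactions", 1.026, 1936),
    ("CMB Data Analysis",        0.750, 4096),
    ("Supernova Simulations",    0.654, 2048),
]

for name, per_proc, nprocs in runs:
    pct_of_peak = 100 * per_proc / PEAK_GFLOPS
    aggregate_tflops = per_proc * nprocs / 1000
    print(f"{name}: {pct_of_peak:.0f}% of peak, {aggregate_tflops:.2f} Tflop/s aggregate")
```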
14Seaborg Science
- The number of publications in refereed journals is the measure of scientific productivity.
- To date we estimate that approximately 8,000 refereed scientific papers have been published based on calculations performed on Seaborg.
15Seaborg Science Covers
The best and most exciting publications are
highlighted on scientific journal covers.
16DOE Climate Change Prediction Program
W. Washington, P. Gent, J. Hack, J. Kiehl, G. Meehl, and P. Rasch, National Center for Atmospheric Research; B. Semtner, Naval Postgraduate School; J. Weatherly, U.S. Army Cold Regions Research and Engineering Laboratory.
A 1,000-year simulation of a powerful new Community Climate System Model (CCSM2) ran for more than 200 uninterrupted days. Each simulated year required 4.5 wall-clock hours. The 1,000-year simulation demonstrated the ability of CCSM2 to produce a long-term, stable representation of the earth's climate.
2002
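The throughput figures above can be cross-checked: 1,000 simulated years at 4.5 wall-clock hours each is roughly 187 days of pure compute, consistent with the more than 200 elapsed days once restarts and other overheads are included. A minimal sketch:

```python
# Wall-clock budget of the CCSM2 1,000-year run, from the slide's figures.
sim_years = 1_000
hours_per_sim_year = 4.5

total_hours = sim_years * hours_per_sim_year  # 4,500 wall-clock hours
total_days = total_hours / 24                 # 187.5 days of pure compute
print(f"{total_days:.1f} days")
```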
17: 2007 Nobel Peace Prize
- "The Norwegian Nobel Committee has decided that
the Nobel Peace Prize for 2007 is to be shared,
in two equal parts, between the Intergovernmental
Panel on Climate Change (IPCC) and Albert Arnold
(Al) Gore Jr. for their efforts to build up and
disseminate greater knowledge about man-made
climate change, and to lay the foundations for
the measures that are needed to counteract such
change."
18Seaborg's Role
- Seaborg was a critical part of the CCSM development and production simulations for the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) carried out by Warren Washington's group at NCAR.
- Seaborg was responsible for the IPCC A2 scenario, the most extreme of the four scenarios considered in the AR4.
- The CCSM data submission to the IPCC process was the largest of any modeling group in the world, and the 2007 IPCC AR4 report proved to be a turning point for public perception of climate change, both in the US and internationally.
19Black Hole Merger Simulations
This was the first time ever that a spiraling merger of this type was accurately simulated. The Cactus code performs a direct evolution of Einstein's equations, which has vast CPU and memory requirements. This simulation used 1.5 TB of memory and more than 2 TB of disk space for each run.
E. Seidel, M. Alcubierre, G. Allen, B. Brügmann,
P. Diener, D. Pollney, T. Radke, and R.
Takahashi, Max Planck Institute for Gravitational
Physics J. Shalf, Lawrence Berkeley National
Laboratory.
Visualization of binary black hole merger
These runs used 1,024 processors for 48
wall-clock hours at a stretch.
2002
20Origin of Gamma Ray Bursts
Stan Woosley of UC Santa Cruz ran the first 3D
simulations of the collapsar model he developed
to explain the origin of gamma ray bursts. 135
different theories on the origin of GRBs had
previously been published.
In March 2003, data from NASA's High-Energy Transient Explorer, which detected an extremely bright gamma ray burst, validated the collapsar model.
21Scientists Compute Death Throes of White Dwarf
Star in 3D
University of Chicago scientists demonstrated how
to incinerate a white dwarf star in unprecedented
detail at the Paths to Exploding Stars
conference on March 22, 2007, in Santa Barbara,
Calif.
The Flash Center team for the first time naturally detonated a white dwarf in a more realistic three-dimensional simulation. The simulation confirmed what the team already suspected from previous tests: that these stars detonate in a supersonic process resembling diesel-engine combustion.
http://www.lbl.gov/CS/Archive/news032207.html
22Laser Wakefield Acceleration: Channeling the Best Beams Ever
Researchers at LBL took a giant step toward realizing the promise of laser wakefield acceleration, whose goal is producing compact, energetic beams for studying materials and for medical applications. Design and understanding were guided by calculations performed on Seaborg.
http://www.lbl.gov/Science-Articles/Archive/AFRD-laser-wakefield.html
2004
23Tuning the Nanoworld
Scientists at Lawrence Berkeley National Laboratory have found new ways of combining quantum dots and segmented nanorods into multiply branching forms and have applied new ways to calculate the electronic properties of these nanostructures. This understanding will aid in the development of designer nanostructures for applications such as quantum computing and artificial photosynthesis.
NERSC calculations yield atom-by-atom electronic maps of a tetrapod with one leg of cadmium selenide and three of cadmium telluride. On the left, green marks the conduction band's lowest energy state, which is physically separated in the structure from the valence band's highest energy state, shown in green on the right.
http://www.lbl.gov/Science-Articles/Archive/MSD-tuning-the-nanoworld.html
2004
24Modeling a Cosmic Bomb
The first evidence of polarization in a normal Type Ia supernova was found in SN 2001el. The otherwise normal spectrum showed an unusual glitch, which led to the best-ever supercomputer models of the shapes of these exploding stars.
Peter Nugent and Dan Kasen of Berkeley Lab modeled four geometries, each requiring some 20,000 processing hours. They generated synthetic spectra to see which most closely matched the observations.
http://www.lbl.gov/Publications/Currents/Archive/Oct-31-2003.html#story3
2006
25Cosmic Microwave Background Simulations of Planck
Satellite Data
In 2008 the European Space Agency, with
substantial NASA participation, will launch the
Planck satellite on a mission to map the cosmic
microwave background (CMB), the remnant radiation
believed to be an echo of the Big Bang that
started the Universe.
Planck is designed to map CMB temperature and
polarization fluctuations with unprecedented
resolution and sensitivity, but the enormous
volume of data this will generate poses a major
computational challenge. Can such a mountain of
data be efficiently processed?
2004
26Cosmic Microwave Background Simulations of Planck
Satellite Data
Yes! Julian Borrill and collaborators simulated analysis of a full year's worth of Planck data:
- 75 billion observations mapped to 150 million pixels
- 6,000 CPUs for 2 hours
- 4 TB memory
- 2.5 TB disk space on GPFS
2004
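The slide's figures imply a substantial data reduction; dividing observations by map pixels gives the average number of time-ordered samples per pixel. A sketch using only the numbers above:

```python
observations = 75e9  # time-ordered samples in one simulated year of Planck data
map_pixels = 150e6   # pixels in the output sky map

samples_per_pixel = observations / map_pixels
cpu_hours = 6_000 * 2  # 6,000 CPUs for 2 hours

print(f"{samples_per_pixel:.0f} samples per pixel, {cpu_hours:,} CPU-hours")
```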
27The Holy Grail of Combustion Science: Turbulence-Chemistry Interactions
Understanding turbulence-chemistry interactions is the holy grail of combustion science. These effects can cause the flame to burn faster or slower and to create more or less pollution.
The first 3D simulation of a laboratory scale
turbulent flame from first principles was
featured on the cover of the July 19, 2005
Proceedings of the National Academy of Sciences.
John Bell et al., LBNL
28Fusion Energy Reactor Design
Research funded by the Office of Fusion Energy
Sciences reached a milestone in 2001 with the
first simulation of turbulent transport in a
full-sized reactor plasma by researchers at the
Princeton Plasma Physics Laboratory. This
breakthrough full-torus simulation, which
produced important and previously inaccessible
new results, used 1 billion particles, 125
million spatial grid points, and 7,000 time
steps. It was made feasible by a new generation of software and hardware: better physics models and efficient numerical algorithms, along with NERSC's new 5 teraflop/s IBM SP.
2001
29In Summary
"Seaborg has been quite simply outstanding in the high degree of reliability."
Robert Harkness, UC San Diego, INCITE 8: High-resolution hydrodynamical cosmological simulations of the structure of the high-redshift intergalactic medium.
5,472-CPU runs for 24 hours at a time, 10 TB of disk space.