1
Ensuring Our Nation's Energy Security
Computational Challenges and Directions in the Office of Science
Science for DOE and the Nation
www.science.doe.gov
Fred Johnson, Advanced Scientific Computing Research
SOS7, March 2003
2
Outline
  • Background: Computational science in the Office of Science
  • SciDAC
  • Ultrascale Scientific Computing Capability
  • FY04: Next Generation Computer Architecture
  • FY05: The future revisited

3
The Office of Science
  • Supports basic research that underpins DOE missions.
  • Constructs and operates large scientific facilities for the U.S. scientific community:
    • Accelerators, synchrotron light sources, neutron sources, etc.
  • Five Offices:
    • Basic Energy Sciences
    • Biological and Environmental Research
    • Fusion Energy Sciences
    • High Energy and Nuclear Physics
    • Advanced Scientific Computing Research

4
Computational Science is Critical to the Office
of Science Mission
  • Scientific problems of strategic importance typically:
  • Involve physical scales that range over 5-50 orders of magnitude
  • Couple scientific disciplines, e.g., chemistry and fluid dynamics to understand combustion
  • Must be addressed by teams of mathematicians, computer scientists, and application scientists, and
  • Utilize facilities that generate millions of gigabytes of data shared among scientists throughout the world.

The Scale of the Problem
Two layers of Fe-Mn-Co containing 2,176 atoms correspond to a wafer approximately fifty nanometers (50 x 10^-9 m) on a side and five nanometers (5 x 10^-9 m) thick. A simulation of the properties of this configuration was performed on the IBM SP at NERSC. The simulation ran for 100 hours at a rate of 2.46 teraflops (one teraflop is one trillion floating-point operations per second). To explore material imperfections, the simulation would need to be at least 10 times more compute-intensive.
5
Outline
  • Background Computational science in the Office
    of Science
  • SciDAC
  • Ultrascale Scientific Computing Capability
  • FY04 Next Generation Computer Architecture
  • FY05 The future revisited

6
Scientific Discovery Through Advanced Computation
(SciDAC)
  • SciDAC brings the power of terascale computing
    and information technologies to several
    scientific areas -- breakthroughs through
    simulation.
  • SciDAC is building community simulation models
    through collaborations among application
    scientists, mathematicians and computer
    scientists -- research tools for plasma physics,
    climate prediction, combustion, etc.
  • State-of-the-art electronic collaboration tools facilitate access to these tools by the broader scientific community, bringing simulation to a level of parity with theory and observation in the scientific enterprise.

7
Introduction
  • SciDAC is a pilot program for a new way of doing
    science
  • spans the entire Office of Science (ASCR, BES,
    BER, FES, HENP)

  • $37M (ASCR), $2M (BES), $8M (BER), $3M (FES), $7M (HENP)
  • involves all DOE labs and many universities
  • builds on 50 years of DOE leadership in
    computation and mathematical software (EISPACK,
    LINPACK, LAPACK, BLAS, etc.)

8
Addressing the Performance Gap through Software
  • Peak performance is skyrocketing
  • In the 1990s, peak performance increased 100x; in the 2000s, it will increase 1000x
  • But ...
  • Efficiency for many science applications declined from 40-50% on the vector supercomputers of the 1990s to as little as 5-10% on the parallel supercomputers of today
  • Need research on ...
  • Mathematical methods and algorithms that achieve high performance on a single processor and scale to thousands of processors
  • More efficient programming models for massively parallel supercomputers

[Chart: Peak Performance vs. Real Performance in teraflops (log scale, 0.1 to 1,000), 1996-2004, showing a widening Performance Gap]
9
It's Not Only Hardware!
Updated version of a chart appearing in Grand Challenges: High Performance Computing and Communications, OSTP Committee on Physical, Mathematical, and Engineering Sciences, 1992.
10
SciDAC Goals
  • an INTEGRATED program to
  • (1) create a new generation of scientific
    simulation codes that take full advantage of the
    extraordinary capabilities of terascale computers
  • (2) create the mathematical and computing systems
    software to enable scientific simulation codes to
    effectively and efficiently use terascale
    computers
  • (3) create a collaboratory software environment
    to enable geographically distributed scientists
    to work effectively together as a TEAM and to
    facilitate remote access, through appropriate
    hardware and middleware infrastructure, to both
    facilities and data
  • with the ultimate goal of advancing fundamental
    research in science central to the DOE mission

11
CSE is Team-Oriented
  • successful CSE usually requires teams with
    members and/or expertise from at least
    mathematics, computer science, and (several)
    application areas
  • language and culture differences
  • usual reward structures focus on the individual
  • incompatible with traditional academia
  • SciDAC will help break down barriers and lead by example; DOE labs are a critical asset

12
The Computer Scientist's View
Must have Fortran!
Must have cycles!
Must study climate!
Must move data!
13
The Applications Scientist's View
[Diagram: Complexity, as seen by the Computer Scientist vs. the Applications Scientist]
14
Future SciDAC Issues
  • additional computing and network resources
  • initial SciDAC focus is on software, but new
    hardware will be needed within the next two years
  • both capability and capacity computing needs are
    evolving rapidly
  • limited architectural options available in the
    U.S. today
  • topical computing may be a cost-effective way of
    providing extra computing resources
  • math and CS research will play a key role
  • expansion of SciDAC program
  • many important SC research areas (e.g., materials/nanoscience, functional genomics/proteomics) are not yet included in SciDAC (NSRCs, GTL)

15
Outline
  • Background: Computational science in the Office of Science
  • SciDAC
  • Ultrascale Scientific Computing Capability
  • FY04: Next Generation Computer Architecture
  • FY05: The future revisited

16
Motivation: UltraScale Simulation Computing Capability
  • Mission need: Energy production, novel materials, climate science, biological systems
  • Systems too complex for direct calculation; descriptive laws absent.
  • Involve physical scales up to 50 orders of magnitude
  • Several scientific disciplines, e.g., combustion, materials science
  • Experimental data may be costly to develop, insufficient, inadequate, or unavailable, and
  • Large data files (millions of gigabytes) shared among scientists throughout the world.
  • History of Accomplishments:
  • MPI, math libraries, first dedicated high-performance computing center, SciDAC

17
ASCAC Statement
"Without robust response to Earth Simulator, U.S. is open to losing its leadership in defining and advancing frontiers of computational science as new approach to science. This area is critical to both our national security and economic vitality." (Advanced Scientific Computing Advisory Committee, May 21, 2002)
18
Simulation Capability Needs: FY2004-05 Timeframe
19
Key Ideas
  • Deliver a full suite of leadership-class computers for science with broad applicability.
  • Establish a model for computational sciences
    (SciDAC and base programs) that couples
    applications scientists, mathematicians, and
    computational and computer scientists with
    computer designers, engineers, and semiconductor
    researchers.
  • Develop partnerships with domestic computer
    vendors to ensure that leadership class computers
    are designed, developed, and produced with
    science needs as an explicit design criterion.
  • Partner with other agencies.
  • Partner with industry on applications.

20
FY 2004 Request to OMB: USSCC
  • UltraScale Scientific Computing Capability
  • Supporting R&D - 30
  • Research with Domestic Vendors: Develop ultrascale hardware and software capabilities for advancing science, focusing on faster interconnects and switches.
  • Continue 2 partnerships begun in FY2003
  • Initiate additional partnerships (up to 3) in FY2004, based on competitive review
  • Operating Systems, Software Environments, and
    Tools
  • Address issues to ensure scalability of operating
    systems to meet science needs
  • Develop enhanced numerical libraries for
    scientific simulations
  • Develop tools to analyze application performance
    on ultrascale computer systems
  • University-based Computer Architecture Research: Explore future generations of computer architectures for ultrascale science simulation.

21
FY 2004 Request to OMB: USSCC
  • UltraScale Scientific Computing Capability
  • Computing and Network Facilities - 70
  • Computer architecture evaluation partnerships: Evaluate computer architectures at levels sufficient to ensure that computer hardware and systems software are balanced for science and likely to scale successfully
  • Continue partnership established in FY2002 between ORNL and Cray, Inc.
  • Initiate one new partnership, comprising scientists and engineers from a domestic computer vendor together with computer scientists and applications scientists supported by the Office of Science.
  • Award partnership from a competition among invited vendors
  • Begin installation of first ultrascale computing system for science

22
Outline
  • Background: Computational science in the Office of Science
  • SciDAC
  • Ultrascale Scientific Computing Capability
  • FY04: Next Generation Computer Architecture
  • FY05: The future revisited

23
Next Generation Computer Architecture
  • Goal: Identify and address major hardware and software architectural bottlenecks to the performance of existing and planned DOE science applications
  • Main Activities:
  • Architecture impacts on application performance
  • OS/runtime research
  • Evaluation testbeds

24
Outline
  • Background: Computational science in the Office of Science
  • SciDAC
  • Ultrascale Scientific Computing Capability
  • FY04: Next Generation Computer Architecture
  • FY05: The future revisited

25
How full is the glass?
  • Support and enthusiasm within the Office of
    Science
  • Office of Science Strategic Plan
  • Interagency cooperation/coordination
  • NSA: SV2
  • DOD: IHEC
  • DARPA: HPCS
  • NNSA: program reviews, open source, NAS study, Red Storm
  • DARPA/DOD/SC USSCC meeting
  • OSTP/NNSA/DOD/SC NGCA meeting
  • OSTP support
  • International coordination
  • Hawaii meeting
  • ES benchmarking

26
Agency Coordination Overview Matrix
(columns: Research Coordination / Development Coordination / Strategy Coordination)
  • NNSA — Research: $17M research funded at NNSA laboratories; Development: Red Storm development; Strategy: formal coordination documents
  • DOD (DUSD Science and Technology) — IHEC study
  • DARPA — Research: HPCS review team; Strategy: HPCS evaluation system plan
  • NSA — Research: UPC; Development: Cray SV2/X1 development
  • All Agencies — HECCWG
27
NNSA Details
  • Research Coordination: $17M research funded at NNSA laboratories — lightweight kernel, common component architecture, performance engineering
  • Development Coordination: Red Storm development — quarterly review meetings, ASCI Q review, ASCI PSE review, SciDAC reviews
  • Strategy Coordination: formal coordination documents — jointly funded NAS study, open source software thrust, platform evaluation
28
NSA Details
  • Research Coordination: UPC (Lauren Smith), programming models (Bill Carlson), benchmarking (Candy Culhane)
  • Development Coordination: Cray SV2/X1 development, Cray Black Widow development (quarterly review meetings)
29
DOD and DARPA Details
  • DOD (DUSD Science and Technology) — IHEC study; agreement on SC role in IHEC
  • DARPA — Research Coordination: HPCS review team (Phase I and Phase II review; Cray, IBM, HP, SUN, and SGI projects); Strategy Coordination: HPCS evaluation system plan, agreement on SC role as HPCS early evaluator at scale
30
DARPA High Productivity Computing Systems Program (HPCS)
  • Goal:
  • Provide a new generation of economically viable high productivity computing systems for the national security and industrial user community (2007-2010)
  • Impact:
  • Performance (efficiency): improve critical national security applications by a factor of 10X to 40X
  • Productivity (time-to-solution)
  • Portability (transparency): insulate research and operational application software from the system
  • Robustness (reliability): apply all known techniques to protect against outside attacks, hardware faults, and programming errors

HPCS Program Focus Areas
  • Applications:
  • Intelligence/surveillance, reconnaissance, cryptanalysis, weapons analysis, airborne contaminant modeling, and biotechnology

Fill the critical technology and capability gap: today (late-80s HPC technology) ... to ... future (quantum/bio computing)
31
Computing Metric Evolution
32
Memory System Performance Limitations: Why applications with limited memory reuse perform inefficiently today
  • STREAM ADD: computes A + B for long vectors A and B (historical data available)
  • New microprocessor generations reset performance to at most 6% of peak
  • Performance degrades to 1-3% of peak as clock speed increases within a generation
  • Goal: benchmarks that relate application performance to memory reuse and other factors

33
Phase I HPCS Industry Teams
  • Cray, Incorporated
  • International Business Machines Corporation (IBM)
  • Silicon Graphics, Inc. (SGI)
  • Sun Microsystems, Inc.
  • Hewlett-Packard Company

34
The Future of Supercomputing
  • National Academy CSTB study
  • Co-funded by ASCR and NNSA
  • 18-month duration
  • Co-chairs: Susan Graham, Marc Snir
  • Kick-off meeting: 3/6/03
  • The committee will assess the status of supercomputing in the United States, including the characteristics of relevant systems and architecture research in government, industry, and academia, and the characteristics of the relevant market.
  • http://www.cstb.org/project_supercomputing.html

35
High End Computing Revitalization Task Force
  • OSTP interagency thrust
  • HEC an administration priority for FY05
  • Task Force to address:
  • HEC core technology R&D
  • Federal HEC capability, capacity, and accessibility
  • Issues related to Federal procurement of HEC systems
  • It is expected that the Task Force recommendations will be considered in preparing the President's budget for FY2005 and beyond.
  • Kick-off meeting: March 10, 2003
  • Co-chairs: John Grosh (DOD) and Alan Laub (DOE)

36
Links
SciDAC: http://www.osti.doe.gov/scidac
Genomes to Life: http://www.doegenomestolife.org/
Nanoscale Science, Engineering, and Technology Research: http://www.sc.doe.gov/production/bes/NNI.htm and http://www.science.doe.gov/bes/Theory_and_Modeling_in_Nanoscience.pdf
UltraScale Simulation Planning: http://www.ultrasim.info/