Transcript and Presenter's Notes

Title: DoD High Performance Computing Modernization Program

1
DoD High Performance Computing Modernization
Program
A Government/Academic/Industry Collaboration
Bill Zilliox
Computer Sciences Corporation
Aeronautical Systems Center
Wright-Patterson Air Force Base
2
Agenda
  • Background
  • Program Elements
  • ASC Major Shared Resource Center
  • PET Program @ ASC

3
HPC Modernization Program
  • Established by Congress in 1992
  • Goals
  • Provide best commercially available high-end HPC
    capability
  • Acquire and develop joint need HPC applications,
    software tools and programming environments
  • Educate and train engineers and scientists to
    effectively use advanced computational
    environments
  • Link users and computer sites via high capacity
    networks
  • Promote collaborative relationships among the DoD
    HPC community, the national HPC community and
    Minority Serving Institutes

4
Agenda
  • Background
  • Program Elements
  • ASC Major Shared Resource Center
  • PET Program

5
HPCMP Facets
Hardware
  • 4 Major Shared Resource Centers (MSRCs)
  • 17 Distributed Centers (DCs)
Software
  • Programming Environment and Training (PET)
  • Common HPC Software Support Initiative (CHSSI)
Network
  • Defense Research and Engineering Network (DREN)
World Class Government and Industry Customers
6
Shared Resource Centers
  • Major Centers
  • Entire DoD modeling & simulation community
  • Large HPC systems
  • Ranked 16, 18, 20 and 32 on the Top 100 Sites list (HPCwire)
  • Full range of resources and support services
  • Batch Processing
  • Distributed Centers
  • Targeted modeling & simulation communities
  • Mission specific, often T&E
  • Modest HPC systems
  • Limited support services
  • Real time processing, embedded applications

7
Major Shared Resource Centers
  • Aeronautical Systems Center
  • Army Research Laboratory
  • Army Engineer Research & Development Center
  • Naval Oceanographic Office
HPC Platforms, Data Storage and Visualization for DoD Researchers
8
Distributed Centers
  • Air Armament Center, Eglin AFB, FL
  • AF Flight Test Center, Edwards AFB, CA
  • AFRL Information Directorate, Rome, NY
  • AFRL Sensors Directorate, WPAFB, OH
  • Army High Performance Computing Research Center, Minneapolis, MN
  • Arnold Engineering Development Center, Arnold AFB, TN
  • Arctic Region Supercomputing Center, Fairbanks, AK
  • Joint National Integration Center, Schriever AFB, CO
  • Maui High Performance Computing Center, Kihei, HI
  • Naval Air Warfare Center Aircraft Division, Patuxent River, MD
  • Naval Air Warfare Center Weapons Division, China Lake, CA
  • Naval Research Laboratory, Washington, DC
  • Redstone Technical Test Center, Redstone Arsenal, AL
  • Space & Missile Defense Command, Huntsville, AL
  • Space and Naval Warfare Systems Center, San Diego, CA
  • Tank-Automotive Research, Development & Engineering Center, Warren, MI
  • White Sands Missile Range, White Sands, NM
9
Defense Research and Engineering Network
  • Secure, digital data transfer over commercial
    telecommunications infrastructure (Virtual
    Private Network)
  • Certified for sensitive unclassified information
    and classified encrypted traffic
  • Robust, high speed service linking over seventy
    geographically dispersed user sites with Shared
    Resource Centers
  • Bandwidth up to OC-12, plans for OC-192 (line rates noted below)
  • Various network protocols
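
For reference (standard SONET line rates, not figures from the slide), an OC-n circuit carries n x 51.84 Mb/s, so the service levels above work out to roughly
\[
\mathrm{OC}\text{-}12 = 12 \times 51.84\ \mathrm{Mb/s} \approx 622\ \mathrm{Mb/s},
\qquad
\mathrm{OC}\text{-}192 = 192 \times 51.84\ \mathrm{Mb/s} \approx 9.95\ \mathrm{Gb/s}.
\]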

10
DREN Nodes and SDPs
11
Software Applications Support
  • Computational Technology Areas (CTA)
  • Computational Chemistry & Materials Science (CCM)
  • Computational Electromagnetics & Acoustics (CEA)
  • Computational Electronics & Nanoelectronics (CEN)
  • Computational Fluid Dynamics (CFD)
  • Computational Structural Mechanics (CSM)
  • Climate/Weather/Ocean Modeling & Simulation (CWO)
  • Environmental Quality Modeling & Simulation (EQM)
  • Forces Modeling & Simulation/C4I (FMS)
  • Integrated Modeling & Test (IMT)
  • Signal/Image Processing (SIP)

A framework for community development, technology planning and modeling & simulation support
12
CTA Structure
  • CTA Leader
  • noted government researcher
  • manages CHSSI projects
  • leads user advisory panel
  • PET Lead
  • noted academic/industry researcher
  • manages related PET activities
  • leads PET CTA team
  • on-site at SRCs
  • faculty, students at university
  • The CTA Community

13
HPC User Community
  • 4300 scientists and engineers
  • 100 DoD laboratories, test centers, plus
    universities and industrial sites
  • Annual requirements survey used to
  • Plan capacity increase
  • Determine project allocations
  • 25% of total resources allocated to Challenge Projects

14
Software Applications Support
  • Common HPC Software Support Initiative (CHSSI)
  • CTA-focused application software development
    program
  • technical codes which exploit scalable computing
    systems
  • proposals evaluated annually
  • awarded based on relevance and potential impact
  • comprise a large fraction of computational
    workload
  • Programming Environment and Training (PET)
  • CTA-focused assistance initiative
  • enhance the productivity of the user community
  • increase the effective use of HPC resources
  • extend the range of technical solutions
  • promote collaboration and technology transfer

15
PET Program
  • Contract Awarded June 2001
  • Follow-on to previous five-year effort
  • Three years with five one-year options
  • Awarded to
  • MOS University Consortium
  • High Performance Technologies, Inc.
  • Expanded Technology Focus
  • 10 CTAs, plus cross-community functions
  • Collaborative Distance Learning Technology
  • On-line Knowledge Center
  • Enabling Technologies
  • Computing Environment
  • Education, Outreach & Training

16
PET Team Members
Legend: HPTi Team / MOS University Consortium
National Center for Supercomputing Applications
Arctic Region Supercomputing Center
Pittsburgh Supercomputing Center
Carnegie-Mellon University
Central State University/Wright Technology Network
University of Michigan
Ohio Supercomputer Center/The Ohio State
University
Morgan State University
Northwest Alliance for Computational Science and
Engineering
Science Applications International Corporation
High Performance Technologies, Inc.
University of Southern California
East Carolina University
University of Tennessee at Knoxville
Clark Atlanta University
University of Texas at El Paso
Albuquerque High Performance Computing Center
Jackson State University
Florida International University
University of Texas at Austin
Mississippi State University
Computer Sciences Corporation
Indiana University
University of Hawaii
17
PET Program
  • Work Scope
  • Core support tasks: training, code analysis,
    application evaluation, benchmarks, technology
    tracking, problem analysis, modeling assistance,
    seminars, outreach
  • Annual Project Cycle
  • February - white papers submitted
  • April - full proposals requested
  • July - selection based on CTA relevance,
    potential benefits, UAP feedback
  • October - projects commence, usually 12 month
    duration
  • Technology Master Plan, CTA Roadmaps (updated
    annually)

18
Agenda
  • Background
  • Program Elements
  • ASC Major Shared Resource Center
  • PET Program @ ASC

19
Hardware
  • Primary HPC Platforms
  • IBM SP P3
  • 528 Processors (375 MHz)
  • 520 GB Memory
  • 2.4 TB workspace
  • RS/6000 (375 MHz)
  • Compaq AlphaServer ES45
  • 836 Processors
  • 836 GB Memory
  • 8 TB workspace
  • Alpha (1 GHz)
  • StorageTek Archive System (500 TB)
  • Interconnected via ATM & Gb Ethernet
  • Access Control and Security

Peak Capacity 2.5 TeraFLOPS!
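
A rough consistency check on that peak figure, assuming 4 floating-point results per cycle for the 375 MHz Power3 processors and 2 per cycle for the 1 GHz Alphas (per-cycle rates are assumptions, not taken from the slide):
\[
528 \times 0.375\ \mathrm{GHz} \times 4 \approx 0.79\ \mathrm{TFLOPS},
\qquad
836 \times 1\ \mathrm{GHz} \times 2 \approx 1.67\ \mathrm{TFLOPS},
\]
for a combined peak of about 2.46 TFLOPS, consistent with the quoted 2.5 TeraFLOPS.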
20
Software
  • Analysis Codes
  • ABAQUS, ANSYS, CTH, LSDYNA3D, MSC/NASTRAN, and more
  • AMSOL, Chemkin, CRYSTAL, GAMESS, Gaussian, MOLPRO, and more
  • Cobalt, Fluent, GASP, ICEM CFD, OVERFLOW, WIND, and more
  • I-DEAS, MATHEMATICA, MATLAB, and more
  • Programming Languages
  • C, C++, FORTRAN, LAPACK, MPI, Perl (a minimal MPI example follows this list)
  • Plus
  • Utilities
  • Visualization Tools
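
To give a flavor of the MPI programming environment listed above, here is a minimal, generic MPI program in C (an illustrative sketch, not an ASC-specific code):

    /* hello_mpi.c: each process reports its rank and the total process count. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size;
        MPI_Init(&argc, &argv);                /* start the MPI runtime           */
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* this process's rank (0..size-1) */
        MPI_Comm_size(MPI_COMM_WORLD, &size);  /* total number of processes       */
        printf("Hello from rank %d of %d\n", rank, size);
        MPI_Finalize();                        /* shut down the MPI runtime       */
        return 0;
    }

Typically compiled with an MPI wrapper (e.g. mpicc hello_mpi.c -o hello_mpi) and launched with mpirun -np 4 ./hello_mpi, though the exact commands vary by installation.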

21
Customer Services
  • Customer Assistance
  • HELP Desk
  • Account Center
  • Multimedia Specialist
  • Technical Support
  • Visualization Lab
  • Application Managers
  • Platform Specialists
  • Users' Guides, Documentation, Journal
  • Training Center
  • Web-based courses
  • CD-ROM distribution
  • Classroom with 20 workstations, network
    connectivity, A/V support

22
Agenda
  • Background
  • Program Elements
  • ASC Major Shared Resource Center
  • PET Program @ ASC

23
ASC PET Component
  • ASC PET team leads or has on-site expertise in
    most Computational Technology Areas
  • Computational Chemistry & Materials Science (CCM)
  • Computational Electromagnetics & Acoustics (CEA)
  • Computational Electronics & Nanoelectronics (CEN)
  • Computational Fluid Dynamics (CFD)
  • Computational Structural Mechanics (CSM)
  • Climate/Weather/Ocean Modeling & Simulation (CWO)
  • Environmental Quality Modeling & Simulation (EQM)
  • Forces Modeling & Simulation/C4I (FMS)
  • Integrated Modeling & Test (IMT)
  • Signal/Image Processing (SIP)

24
ASC PET Component
  • Program Leadership for
  • Enabling Technologies (Dr. Robert Moorhead)
  • Forces Modeling & Simulation (Dr. Dave Pratt)
  • Integrated Modeling & Test (Dr. Tilt Thompkins)
  • Signal/Image Processing (Dr. Stan Ahalt)

25
Strategic Goals: Enabling Technologies
  • Priorities
  • Robust, highly functional visualization systems
  • Data mining and knowledge discovery algorithms & libraries
  • Problem solving environments
  • Mesh generation tools & automation
  • Concentrate on
  • Visualization systems that
  • exploit heterogeneous distributed computational
    resources
  • ingest enormous data sets
  • are based on features, not raw data
  • Robust data mining algorithms & tools
  • Mesh generation tools and automatic mesh
    generators
  • Continuous training in new tools & methodologies

26
Technology Roadmap
  • Enabling Technologies

Strategic Focus Areas (roadmap chart spanning FY02 through FY06), with associated projects:
  1. Distributed Vis (ET-DV): ET017, ET005
  2. Large Scale Vis (ET-LSV): ET011/012, ET017
  3. Feature-based Vis (ET-FV): ET011/012
  4. Data Mining & Knowledge Discovery (ET-DM): ET005
  5. Problem Solving Environments (ET-PSE): ET016, ET022
  6. Grid Generation (ET-MG): ET019, ET003/004
27
FY02 Projects Enabling Technologies
  • Mathematical & Algorithmic Issues in Multiphysics Couplings
  • Scalable Data Intensive Computing Environment for
    Composite Modeling
  • Data Exploration & Analysis for Large Scale Simulations
  • Data Management Support for Feature-based
    Exploration of Large Data Sets
  • Gateway Infrastructure Enhancements
  • EnVis and BMRT: Distributed High Performance Batch Mode Visualization
  • Development of an Integrated Simulation
    Environment
  • GGTK: A Geometry Toolkit

28
ASC PET Component
  • Experienced On-site Team
  • CCM: Dr. Jean Blaudeau, HPTi
  • CEN: Dr. Paul Sotirelis, HPTi
  • CFD: Dr. Hugh Thornburg, MSU
  • CSM: Dr. Ron Hinrichsen, NCSA
  • SIP: Dr. John Nehrbass, OSU
  • FMS/IMT: Roger Panton, WTN
  • IMT: Jerry Forner, WTN
  • Training/OKC: Mason Colbert, OSC

29
Large-scale Visualization of F-18 CFD Data
Impact
  • Users
  • AFRL/Air Vehicles Directorate
  • Cobalt CHSSI team
  • PET Contribution
  • Large data set: 6-million-cell unstructured mesh with 15 GB of solution data
  • Off-screen rendering, multi-threaded algorithms
    for speed
  • Visualization for time-series data
  • DoD Relevance
  • Solved Unsteady Aerodynamics of Aircraft
    Maneuvering at High Angle of Attack

SciVis
30
Development of Materials for Laser Modulation
Impact
  • Users
  • AFRL/Materials and Manufacturing Directorate
  • PET Contribution
  • Perform excited-state calculations to determine transition energies (e.g., phosphorescence); the governing relation is sketched below
  • DoD Relevance
  • Development of novel nonlinear dyes for photonic
    applications
  • Materials to protect eyes from lasers.
  • Work presented at the ACS National Meeting, August 2001
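
As background on the excited-state calculations noted above (a generic relation, not specific to this project): a phosphorescence transition energy is the gap between the lowest triplet excited state and the singlet ground state, which also fixes the emission wavelength:
\[
\Delta E_{\mathrm{phos}} = E(T_1) - E(S_0),
\qquad
\lambda = \frac{hc}{\Delta E_{\mathrm{phos}}} \approx \frac{1240\ \mathrm{eV\,nm}}{\Delta E_{\mathrm{phos}}\,[\mathrm{eV}]}.
\]
For example, a computed gap of 2.5 eV corresponds to emission near 500 nm.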

CCM
31
C-17 Optimization Modeling
Impact
  • User
  • ASC/Engineering Directorate (weapons systems
    logistics analysis)
  • PET Contribution
  • Provide a multi-level, multi-objective optimization framework (a generic formulation is sketched below)
  • Develop, validate and apply the optimization
    system to established AF logistics model
  • DoD Relevance
  • Reduce analysis time from six weeks to one day
  • Applicable to other system optimization efforts.
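
For context, a common way to pose a multi-objective problem of this kind is weighted-sum scalarization (a textbook formulation offered only as illustration; the slide does not describe the project's actual method):
\[
\min_{x \in X}\ \sum_{i=1}^{k} w_i\, f_i(x),
\qquad w_i \ge 0,\ \ \sum_{i=1}^{k} w_i = 1,
\]
where the \(f_i\) are the competing objectives and the \(w_i\) encode their relative importance.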

C-17 Globemaster III
FMS
32
Evolved Expendable Launch Vehicle (EELV)
Impact
  • Users
  • Space & Missile Systems Center, Los Angeles AFB
  • The Boeing Company
  • Rocketdyne Propulsion and Power
  • Rockwell Science Center
  • PET Contribution
  • Data management and processing tools
  • Assistance with HPC processes and procedures
  • Resolved code porting and visualization issues
  • DoD Relevance
  • Risk mitigation
  • Reduced development costs/times
  • Accurate and timely analysis without excessive
    design conservatism

CFD
33
Summer Intern Program
Impact
  • Summer Interns 2002 - the fifth year of this
    program
  • PET Contribution
  • Provide on-site training for deserving
    undergraduate and graduate computer science and
    engineering students
  • Mentors guide student research on relevant
    problems.
  • Students present end-of-term reports
  • DoD Relevance
  • Create a pipeline of HPC-literate computational
    scientists for DoD

TC
34
Recent Training/Seminars
  • MPI (16-17 Jan)
  • Introductory MATLAB (24-25 Jan)
  • Accelrys Materials Studio (5 Feb)
  • Accelrys Mesoscale (6 Feb)
  • ENVIS Training @ Vicksburg, MS (8 Feb)
  • SWITCH (7-8 March)
  • Electronic Battlefield Environment Seminar (7-8
    March)
  • GASP Workshop (27 March)

35
Programming Environment and Training
  • Building collaborative relationships with
  • Academic partners
  • Industrial participants
  • DoD research community
  • Engaging Minority Serving Institutions (MSIs)
  • Framework for technology transfer to/from
    established HPC centers
  • Training tomorrow's DoD workforce to leverage
    advances in HPC technology

36
For Additional Information
  • HPC Modernization Program
  • www.hpcmo.hpc.mil/
  • ASC MSRC
  • www.asc.hpc.mil
  • PET Program
  • www.hpcmo.hpc.mil/Htdocs/PET/
  • Any other information
  • william.zilliox@wpafb.af.mil
  • (937) 904-5148