1
Building a Broad Base for CI
NSF EPSCoR National Workshop on Cyberinfrastructure
Nashville, Tennessee
Frank Williams, Arctic Region Supercomputing Center
May 11, 2006
2
  • Who we are
  • High performance computing center
  • University owned and operated
  • DoD funded through HPCMP
  • Only open systems DoD Center
  • What we do
  • Provide HPC resources and support
  • Computing, Data Storage, Visualization,
    Networking
  • Conduct research locally and through
    collaborations

3
Research Institutes at the University of Alaska
Fairbanks
Institute of Northern Engineering
Institute of Marine Science
Geophysical Institute
International Arctic Research Center
Institute of Arctic Biology
4
UAF Research Facilities
Robert G. White Large Animal Research Station
Alaska SAR Facility
Poker Flat Research Range
Museum of the North
5
Large Compute Systems
  • Iceberg
  • IBM P690/p655
  • Two p690 servers, each with 32 processors
  • 256 GB memory per server
  • 92 p655 servers each with 8 processors
  • 16 GB memory per server
  • 5 TFLOPS
  • 25 terabytes disk storage
  • Federation interconnect
  • Klondike
  • Cray X1
  • 128 multi-streaming processors (MSPs)
  • 512 single-streaming processors
  • 512 GB memory
  • 1.6 TFLOPS
  • 21 TB disk

  • Iceflyer
  • IBM Regatta
  • 32 1.7 GHz Power4 CPUs in a 24-way SMP node, a
    7-way interactive node, and 1 test node
  • 256 GB memory
  • IBM Power5
  • Four 8-CPU IBM p575 nodes with 1.9 GHz Power5
    processors
  • 16 GB memory each
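For scale, the Iceberg node counts above sum to 2 × 32 + 92 × 8 = 800 processors and 2 × 256 GB + 92 × 16 GB = 1,984 GB (roughly 2 TB) of aggregate memory.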
6
Dedicated/Adaptive Systems
  • Nelchina.arsc.edu (Cray XD1)
  • 12 nodes with two AMD Opteron 250 processors
    (single core); 2 cores per node
  • 6 nodes with two AMD Opteron 250 processors
    (single core) and one Field Programmable Gate
    Array (FPGA) that acts as a coprocessor to the
    Opteron processors; 2 cores per node
  • 18 nodes with two AMD Opteron 280 processors
    (dual core); 4 cores per node
  • Cerebro.arsc.edu (SunPC Rocks cluster)
  • Six 1U Sun PCs, each with dual Opteron 250
    processors
  • 4 GB RAM
  • One 146 GB 10k RPM U320 SCSI disk
  • Two Sun 3511 Fibre Channel disk arrays (4.8 TB
    total)
  • Gigabit Ethernet interconnect
  • Rocks Cluster 3.3.0 software based on Red Hat
    Enterprise Linux
  • Intended use: small compute cluster
  • IBM Image Generator
  • 7 nodes, each with two Xeon processors at 3.06 GHz
  • 4 GB memory
  • 146 GB local disk
  • High-end video (Quadro FX3500G)
  • Gigabit Ethernet networking
  • All 7 cluster nodes can dual-boot Linux (Fedora
    Core 3) or Windows (XP Professional)
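Counting cores across these systems: Nelchina provides 12 × 2 + 6 × 2 + 18 × 4 = 108 Opteron cores (plus 6 FPGAs acting as coprocessors), and Cerebro adds 6 × 2 = 12 cores.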

7
Storage
  • STK robotic tape library holds 6500 tapes with a
    total theoretical capacity of 3.25 petabytes
  • T10000 tape drives hold 500 GB per tape at 120 MB/s
  • T9940B tape drives hold 200 GB per tape at 30 MB/s
  • Dedicated storage systems provide stable,
    reliable storage as HPC platforms come and go
  • Twin Sun Fire 6800 systems, one dedicated to the
    DoD HPC users, one dedicated to the non-DoD
    users. Each has clustered Sun T2000 systems for
    redundant paths and higher-speed transfers.
  • Each Sun Fire runs SAM-QFS, which provides
    high-performance disk storage transparently
    backed by offline tape storage.
  • Large SATA disk arrays, which keep the most
    recently accessed files highly available for fast
    recall.
  • Consistent configuration and usage practices make
    transitioning between HPC platforms easier for
    users.
  • The STK robotic tape library, with over 3 petabytes
    of theoretical capacity, provides automated access
    to tapes for archival and backup storage of user
    data.
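As a quick check on the capacity quoted above: 6,500 tapes × 500 GB per tape = 3,250,000 GB = 3.25 PB. At 120 MB/s, streaming one full T10000 tape takes roughly 500,000 MB ÷ 120 MB/s ≈ 4,200 seconds, a bit over an hour.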

8
Discovery Lab
  • Fakespace's MD Flex reconfigurable, large-screen
    virtual environment.
  • Performance Center for CAVE/CAVE collaboration
  • First Fridays
  • Art Gallery
  • Alaska Panoramas
  • Virtual Machine Room Tour
  • Tsunami
  • Space Weather
  • Permafrost


9
Access Grid Collaborative Environment
  • Education: shared classes with other universities
    (University of Montana, University of New Mexico,
    Montana State)
  • Advanced Cellular BioChemistry
  • Research: collaborative art and music events
  • Meeting facilitation
  • Multicast: 3 to 15 Mb/s depending on the number of
    remote sites.

10
Virtual Tsunami Center
  • Research: Zygmunt Kowalik
  • Operational: Roger Hansen
  • Computational Portal: Barbara Horner-Miller
  • Visualizations of data produced by the Tsunami
    Portal Project give a better view of how tsunamis
    affect local communities; the portal gives
    researchers access to multiple codes for modeling
    the generation, propagation, and run-up of
    tsunamis on multiple computer platforms.

11
Tsunami Portal
Virtual Tsunami Center
  • Cherri Pancake, Oregon State University
  • Barbara Horner-Miller, ARSC
  • Tom Logan, ARSC
  • UAF Tsunami - developed by researchers at UAF
    (Elena Suliemani, Zygmunt Kowalik)
  • Solves the non-linear shallow water equations using
    velocity variables
  • COMCOT - developed by researchers at Cornell
    (Xiaoming Wang, Philip Lui)
  • Solves the non-linear shallow water equations using
    flux variables
  • Tsunami_CLAW - developed by researchers at U.
    Washington (David George, Randy LeVeque)
  • Solves the non-linear shallow water equations using
    momenta, with adaptive mesh refinement techniques
    (a common conservative form of these equations is
    sketched after this list)
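For reference, a minimal sketch of the one-dimensional non-linear shallow water equations in conservative (flux) form, with h the water depth, u the depth-averaged velocity, g gravity, and b the bathymetry; this is the standard textbook form, not the specific formulation of any one code listed above:

  \partial_t h + \partial_x (h u) = 0
  \partial_t (h u) + \partial_x \left( h u^2 + \tfrac{1}{2} g h^2 \right) = - g h \, \partial_x b

The velocity, flux, and momentum variants mentioned above differ mainly in which quantities (u, the fluxes, or hu) are carried as prognostic variables.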

12
Coastal Oceanography
  • Kate Hedstrom and Mark Johnson, leads
  • In cooperation with the Alaska Ocean Observing
    System, Dr. Kate Hedstrom (ARSC) is using
    community models to develop understanding of
    ocean/ice circulation and ecosystem models, with
    higher spatial resolution and more natural
    boundary conditions.

Sea ice concentration ROMS model
Sea ice concentration SSM/I
13
Climate/Earth Systems
  • John Walsh, Chief Scientist, IARC
  • Jing Zhang, research associate
  • John Walsh (President's Professor of Global Climate
    Change) and his colleagues have established
    ongoing investigations of the role of arctic
    ice-ocean-atmosphere processes in the global
    climate.

14
Ionosphere
  • Brenton Watkins and Sergei Maurits, leaders
  • Dr. Sergei Maurits (ARSC) and Professor Brenton
    Watkins (GI, Physics) are advancing their
    world-class Eulerian Parallel Polar Ionosphere
    Model (UAF EPPIM) to significantly improve the
    model's capability to simulate the ionospheric
    environment and to predict real-time
    scintillation, bending, and abnormal propagation
    of electromagnetic signals in polar regions.

15
Geospatial Information Science and Remote Sensing
  • Buck Sharpton and Greg Newby, leaders
  • Oceanography, Weather and Climate, Data
  • Volcanology
  • Work with Alaska Volcano Observatory
  • Puff model: multi-scale, multi-input
    visualization
  • GINA
  • Swathviewer and storage for LANDSAT and MODIS
    satellite imagery archives
  • Virtual Globes
  • Work with UAF Institute of Northern Engineering
  • Earthslot and Google Earth

16
AEIC Seismic Network
Alaska Earthquake Information Center
  • 12,000 earthquakes/ year

400 seismographs in Alaska
17
Observations to Information
  • Geophysical Institute

18
ARSC: OC12 to DREN
UA: OC3 to I2 (Internet2), DS3 to the Internet
T1 connectivity to 16 rural sites across Alaska
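For context, the nominal line rates of these circuit types are approximately 622 Mb/s for OC12, 155 Mb/s for OC3, 45 Mb/s for DS3, and 1.5 Mb/s for a T1.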
19
Alaska Fiber Backbone
Connectivity to the Lower 48, with a spur to Juneau.
Two primary fiber carriers: GCI and WCI Cable.
(Map labels: Deadhorse, ACS, Fairbanks, GCI, WCI Cable)
20
Cyberinfrastructure Training, Education, Advancement
and Mentoring for our 21st Century Workforce
(CI-TEAM) (Tier 4)
Petascale acquisition (Tier 1)
High Performance Computing System Acquisition:
Towards a Petascale Computing Environment for
Science and Engineering (Tier 2)
  • States' and universities' own investment (Tier 3)

21
NSF's CI Investment / Return
  • Tier 1: $200M; 1 center; 100 research staff,
    100 professionals, 10 projects
  • Tier 2: $30M/yr; 2 centers/yr for 4 years;
    30 per center; 240 research staff, 160 professionals
  • Tier 3: $113M over 3 yrs; 50 centers; 10 per center,
    500 research staff; 20 per center, 1,000 professionals
  • CI-TEAM: $10^7 at $10^4 per intern; 1,000
    undergraduate students
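Under one reading of these figures, the totals are internally consistent: Tier 2 gives 2 centers/yr × 4 years = 8 centers and 8 × 30 = 240 research staff; Tier 3 gives 50 × 10 = 500 research staff and 50 × 20 = 1,000 professionals; and CI-TEAM gives $10^7 ÷ ($10^4 per intern) = 1,000 undergraduate interns.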

22
ARSC Summer Interns
23
ARSC Undergraduate Research Challenge 2006
  • Dr. Anton Kulchitsky, Arctic Region
    Supercomputing Center
  • Improvement of Electric Potential Function in
    the UAF EPPIM
  • Interns: Mark Wellons, The College of Wooster
  • John Wright, Mercer University
  • Dr. Scott Deal, UAF Music Department
  • Developing Sound in Virtual Reality
    Environments
  • Intern: Sean Waite, Lycoming College
  • Dr. Jenny Hutchings, International Arctic
    Research Center
  • Sea Ice Deformation in the Weddell Sea
  • Intern: Jennifer Hafer-Zdral, Reed College
  • Dr. Greg Newby, Arctic Region Supercomputing
    Center
  • Programming with Field Programmable Gate Arrays
  • Interns: Keven Woo, University of California Long
    Beach
  • Kylie McCormick, Mount Holyoke College

  • Dr. Boris Bracio, UAF Electrical and Computer
    Engineering
  • Biofeedback and Psychology Studies Using Virtual
    Reality
  • Interns: Carlos Natividad, University of Texas at
    El Paso; Kailah Davis, University of the Virgin
    Islands
  • Dr. Javier Fochesatto, International Arctic
    Research Center
  • Design of an Aerodynamic Lens using Fluent
  • Intern: Stacey Schmidt, University of Wisconsin
    Platteville
  • Dr. Jing Zhang, UAF Mechanical Engineering
  • Finite Element Simulations of Compaction of Ice
    Particles
  • Intern: Matthew Poland, University of California
    Long Beach
24
ARSC Student Assistants
25
ARSC Post Doctoral Fellows
Anton Kulchitsky: Development of the Polar Eulerian
Ionospheric Model
Daniel Pringle: Modeling Sea Ice Transport Properties
Georgina Gibson: Ecosystem Modeling in the Southeast
Bering Sea
John Bailey: Developing Automated Alarms of Volcanic
Eruption Activity from Remote Sensing Data
Peter Webley: Evaluation of a Volcanic Ash Tracking
Model
John Chappelow: How Cyclical Variations in Orbital
Elements Induce Climatic Change, with an Emphasis on
the Study of Mars
26
ARSC Faculty Camp
  • A three-week summer program that brings UA faculty
    and other collaborators together with ARSC
    specialists, providing an opportunity for attendees
    to focus on developing skills they can apply to
    their research.
  • Formal lectures to meet identified needs of
    attendees.
  • Faculty work closely with ARSC specialists and
    joint faculty.
  • Independent work time for attendees.
  • Allows time to develop skills, learn about ARSC
    resources and establish lasting collaborations.

27
Training Outreach to UAF Campus
  • ARSC Short Courses
  • Introduction to ARSC, Kate Hedstrom 3/29/05
  • Introduction to Unix, Lee Higbie 4/05/05
  • Introduction to IDL, Sergei Maurits 4/26/05
  • Unix Basics, Lee Higbie (for Interns) 6/16 -
    6/17/05
  • Introduction to IDL, Sergei Maurits (for
    Interns) 6/23/05
  • ROMS, Kate Hedstrom 7/20/05
  • Emacs Series, Anton Kulchitsky 1/20, 1/27,
    2/03/06
  • Introduction to MATLAB 3/2/06
  • Overview of Online 3D Terrain Visualization and
    GIS tools, Matt Nolan 3/1/06
  • Workshop on Online 3D Terrain Visualization and
    GIS tools, Matt Nolan 3/27, 3/29, 3/31/06
  • HPC Vendor Courses
  • Barry Bolding, IBM ACTC 2/16-2/18/05
  • James Schwarzmeier, CRAY 3/30-4/01/05
  • Simone Sbaraglia, IBM ACTC 4/18-4/20/05

  • Software Vendor Courses
  • Matlab Fundamentals 8/12/05
  • Final Cut Pro 10/11-10/13/05
  • Matlab Fundamentals/Advanced Matlab Programming
    11/03-11/05/05
  • Integrating Matlab with External Applications
    3/3-3/4/06
  • Statistical Methods in Matlab 4/28/06
  • Introduction to IDL 4/28/06
  • Final Cut Pro 4/12-4/14/06
  • Core Skills in Computational Science, Fall 2005,
    PHYS 693
28
(No Transcript)
29
Internet2 Abilene Backbone Map
30
ARSC Security Infrastructure
  • ARSC staff and users set the culture
  • ISSO (Information Systems Security Officer) for
    each distinct architecture/system type
  • PASO works with UA Statewide and the UAF Police
    Department
  • ISSM (Information Systems Security Manager):
    overall ARSC security management
  • DAA (Designated Approving Authority): overall ARSC
    management

31
A DoD HPCMP Allocated Distributed Center
  • Provides significant resources for the HPCMP
  • Knits a cultural interface between the university
    and DoD
  • Benefits ARSC and UA users
  • Benefits DoD

32
(No Transcript)