HPWREN and International Collaborations

Transcript and Presenter's Notes
1
PRAGMA Grid: A Multi-Application Routine-Use Global Grid
Cindy Zheng, PRAGMA Grid Coordinator
Pacific Rim Application and Grid Middleware Assembly
University of California, San Diego
San Diego Supercomputer Center
http://www.pragma-grid.net
http://goc.pragma-grid.net
2
Overview
  • PRAGMA
  • Goals, Characteristics, Working groups, Workshops
  • PRAGMA Grid testbed
  • Goals, Characteristics, Resources
  • Applications
  • Application middleware
  • Infrastructure middleware
  • Benefit summary
  • Multi-Grid interoperation
  • Goals, Resources
  • Applications
  • Infrastructure testing matrix
  • Lessons learned
  • Forward

3
PRAGMA (http://www.pragma-grid.net)
  • 2002 -
  • Goals
  • Open international organization
  • Grid applications, practical issues
  • Build international scientific collaborations
  • Members and community
  • 28 institutional members, 11 countries
  • >38 institutions in >14 countries are actively involved
  • Characteristics
  • No central funding, but mutual interests
  • Build friendship, trust, help, community
  • Do, act
  • Working groups
  • Bio, data, resources, telescience, geosciences, ...
  • Semi-annual Workshops

4
The PRAGMA Steering Committee
http://www.pragma-grid.net/steering_committee.htm
5
PRAGMA Grid Testbed and Routine-Basis Experiments
  • 2004 -
  • Goal: make the grid easier for scientists to use, by improving
  • Middleware interoperability
  • Global grid usability and productivity
  • Method
  • Application-driven: let applications drive
  • More organized testbed operation
  • Full-scale and integrated testing/research
  • Long application runs
  • Learn issues; develop/research/test solutions
  • Manner
  • Grass-roots
  • Voluntary contribution of resources and work
  • Long-term, persistent
  • Inclusive: don't have to be a PRAGMA member or in the Pacific Rim
  • General science grid

6
PRAGMA Grid Resources
7
PRAGMA Grid Testbed
JLU China
UZurich Switzerland
BU USA
CNIC GUCAS China
KISTI Korea
SDSC USA
NCSA USA
AIST OSAKAU TITECH Japan
KU NECTEC Thailand
UMC USA
ASCC NCHC Taiwan
CICESE Mexico
UoHyd India
IOIT-HCM Vietnam
UNAM Mexico
MIMOS USM Malaysia
QUT Australia
BII IHPC NGO Singapore
UChile Chile
MU Australia
8
PRAGMA Grid Software Layers
9
Applications (http://goc.pragma-grid.net)
  • Real science, multiple applications (11)
  • TDDFT quantum-chemistry, AIST, Japan
  • Savannah climate model, MU, Australia
  • MM5 climate model, CICESE, Mexico
  • QM-MD, FMO quantum-mechanics, AIST, Japan
  • iGAP genomics, UCSD, USA
  • HPM genomics, IOIT-HCM, Vietnam
  • mpiBlast genomics, ASCC, Taiwan
  • Gamess-APBS organic chemistry, UZurich,
    Switzerland
  • Siesta molecular simulation, UZurich,
    Switzerland
  • Amber molecular simulation, USM, Malaysia
  • Learn
  • How to grid-enable, run
  • Application needs, issues

10
Grid Application Middleware
  • Why grid application middleware
  • Enable applications to run on grid(s)
  • Make easier
  • Example grid application middleware
    development/testing
  • Ninf-G (AIST, Japan)
  • Nimrod/G (Monash University, Australia)
  • Mpich-Gx (KISTI, Korea)

11
Ninf-G (http://ninf.apgrid.org)
  • Developed by AIST, Japan
  • Supports the GridRPC model, which will become a GGF standard
  • Integrated into NMI release 8 (first non-US software in NMI)
  • A Ninf roll for Rocks 4.x is also available
  • 3 applications ran in the PRAGMA grid and 1 ran on the GIN testbed (multi-grid)
  • TDDFT
  • QM/MD
  • FMO
  • Achieved long runs (1 week to 50 days)
  • Improved fault tolerance (papers), e.g. hang handling
  • Simplified deployment procedures
  • Sped up development cycles
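
The GridRPC model behind these runs is, at heart, a remote procedure call whose failures (including hangs) must be survived over week-long executions. Below is a minimal, self-contained Python sketch of that idea, not the actual Ninf-G C API; the function names and the thread-pool "remote" worker are illustrative assumptions.

```python
import concurrent.futures

def call_with_retry(fn, args, timeout_s, retries, executor):
    """GridRPC-style call: submit work to a (possibly remote) worker,
    treat a timeout as a hang, and retry on another attempt."""
    for attempt in range(retries + 1):
        future = executor.submit(fn, *args)
        try:
            return future.result(timeout=timeout_s)
        except concurrent.futures.TimeoutError:
            future.cancel()  # give up on the hung call and try again
    raise RuntimeError("all %d attempts hung" % (retries + 1))

# Toy task standing in for a TDDFT/QM-MD kernel.
def simulate(step):
    return step * step

with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    results = [call_with_retry(simulate, (s,), 5.0, 2, pool) for s in range(4)]
print(results)  # [0, 1, 4, 9]
```

The key design point mirrored here is that the client, not the remote side, decides when a call has hung, which is what makes multi-week runs across unreliable sites feasible.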

12
Nimrod/G (http://ninf.apgrid.org)
  • Developed by Monash University, Australia
  • Supports large-scale parameter sweeps on Grid infrastructure
  • Easy user interface - portal
  • 3 applications ran in the PRAGMA grid and 1 will run in the GIN testbed (multi-grid)
  • Savannah climate simulation (MU)
  • GAMESS/APBS (UZurich)
  • Siesta (UZurich)
  • Achieved long runs (90 different scenarios of 6 weeks each)
  • Improved fault tolerance (innovative time_step)
  • Speed-up enhancements

13
Mpich-Gx (http://www.moredream.org/mpich.htm)
  • Developed by KISTI, Korea
  • Grid-enabled MPI, supporting
  • Private IP
  • Fault tolerance
  • Application ran on KGrid
  • MM5
  • Climate simulation
  • CICESE, Mexico

14
Infrastructure Middleware
  • Why grid infrastructure middleware
  • Provide grid services
  • Make grid easier to use and manage
  • Example grid infrastructure middleware
  • Grid file system
  • Gfarm (AIST, Japan)
  • Grid monitoring system
  • SCMSWeb (Kasetsart University, Thailand)
  • Grid accounting system
  • MOGAS (Nanyang Technological University,
    Singapore)

15
Gfarm Grid Virtual File System (http://datafarm.apgrid.org/)
  • Developed by AIST, Japan
  • High-performance, scalable grid file system
  • Supports Linux, Solaris; also scp, gridftp, SMB
  • Meta-server, file replication, Gfarm-FUSE
  • Eases user/application setup, file sharing (CSA), fault tolerance
  • 6 sites, 3786 GBytes, 1527 MB/sec (70 I/O nodes)
  • Tested with iGAP with a large number of files; performance up >10x
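
The replication idea in the slide — a meta-server knows which I/O nodes hold copies of a file, and reads prefer a nearby copy — can be sketched as below. The catalog layout and hostnames are illustrative assumptions, not Gfarm's real metadata schema.

```python
# Illustrative replica catalog: file path -> hosts holding a copy
# (mirrors the role of Gfarm's meta-server, not its actual schema).
catalog = {
    "/gfarm/igap/genome.db": ["node3.sdsc.edu", "node1.aist.jp", "node2.ncsa.edu"],
}

def pick_replica(path, local_host, catalog):
    """Prefer a replica on the local host; otherwise take the first listed."""
    replicas = catalog[path]
    return local_host if local_host in replicas else replicas[0]

# A job on node1.aist.jp reads its local copy; other hosts fall back.
print(pick_replica("/gfarm/igap/genome.db", "node1.aist.jp", catalog))
print(pick_replica("/gfarm/igap/genome.db", "node9.unam.mx", catalog))
```

Serving reads from local replicas is also where the aggregate figures come from: the 1527 MB/sec is the sum across 70 I/O nodes, not a single-stream rate.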

16
SCMSWeb (http://www.opensce.org/components/SCMSWeb)
  • Developed by Kasetsart University, Thailand
  • Web-based monitoring system for clusters and grids
  • System usage, functional/performance metrics, job/queue status
  • Easy user interface, rapid support
  • Testing in the PRAGMA grid gathers user feedback and site help
  • Sped up development, enhancement, platform support expansion
  • Improved fault tolerance, functionality, user interface
  • Popularized the software
  • GIN, common schema
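
The "common schema" point is what makes cross-grid monitoring possible: each site's native metric format is mapped into one shared record shape before display. The sketch below shows that normalization step; the two input formats and all field names are illustrative assumptions, not SCMSWeb's actual schema.

```python
def to_common_schema(site_format, raw):
    """Map a site-specific metric record into one shared schema."""
    if site_format == "style_a":    # e.g. a site reporting load as a percentage
        return {"host": raw["hostname"],
                "cpu_load": raw["cpu_pct"] / 100.0,
                "jobs_running": raw["running"]}
    if site_format == "style_b":    # e.g. a site reporting a 1-minute load average
        return {"host": raw["node"],
                "cpu_load": raw["load1"],
                "jobs_running": raw["jobs"]["r"]}
    raise ValueError("unknown site format: " + site_format)

a = to_common_schema("style_a", {"hostname": "c0.ku.ac.th", "cpu_pct": 85, "running": 12})
b = to_common_schema("style_b", {"node": "n1.sdsc.edu", "load1": 0.4, "jobs": {"r": 3}})
print(a["cpu_load"], b["jobs_running"])  # 0.85 3
```

Once every grid emits the same record shape, a single testing matrix or status page can cover PRAGMA, TeraGrid, and the other GIN participants.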

17
Multi-Organisation Grid Accounting System (http://ntu-cg.ntu.edu.sg/pragma)
  • Developed by NTU, Singapore
  • Resource usage by project/individual/organization
  • Daily, weekly, monthly, yearly
  • Pie charts and detailed job logs
  • Metering and charging tool, easily customizable pricing system
  • Database and data analysis tools
  • Runs on 17 sites in the PRAGMA testbed
  • Improved interface to various Globus and local schedulers
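
The core of such accounting is grouping job logs by an attribute (organization, user) and summing usage over a period. A minimal sketch, assuming a job-log layout that is illustrative rather than MOGAS's real database schema:

```python
from collections import defaultdict
from datetime import date

# Illustrative job log entries (field names are assumptions, not MOGAS's schema).
jobs = [
    {"org": "NTU",  "user": "alice", "day": date(2006, 5, 1), "cpu_hours": 12.0},
    {"org": "SDSC", "user": "bob",   "day": date(2006, 5, 1), "cpu_hours": 30.5},
    {"org": "NTU",  "user": "carol", "day": date(2006, 5, 2), "cpu_hours": 7.5},
]

def usage_by(jobs, key):
    """Total CPU hours grouped by an attribute (org, user, ...)."""
    totals = defaultdict(float)
    for j in jobs:
        totals[j[key]] += j["cpu_hours"]
    return dict(totals)

print(usage_by(jobs, "org"))   # {'NTU': 19.5, 'SDSC': 30.5}
print(usage_by(jobs, "user"))
```

The same grouping, run per day/week/month, yields the daily-to-yearly reports and the pie charts the slide mentions; a pricing system is just a rate applied to these totals.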

18
PRAGMA Grid Brings Together
  • People
  • Hands-on learning to make the grid work
  • Software development
  • Heterogeneity and reality check
  • Check software integration and interfaces both horizontally and vertically
  • User feedback steers a better direction
  • Popularize grid software
  • Collaborations
  • Naregi-CA (AIST, Japan) and GAMA (SDSC, USA) integration
  • Rocks (SDSC, USA) and SCE (KU, Thailand), Ninf-G (AIST, Japan), Gfarm (AIST), KRocks (KISTI, Korea)
  • PRAGMA and NLANR
  • PRAGMA and GEON
  • PRAGMA testbed and sensor networks (NCHC, Taiwan; Binghamton University, USA)

19
Grid Interoperation Now (GIN) (http://goc.pragma-grid.net/gin/default.html)
  • GIN testbed (started Feb. 2006)
  • PRAGMA, TeraGrid, EGEE, OSG, NorduGrid
  • Applications
  • TDDFT/Ninf-G
  • Lead: Yoshio Tanaka, Yusuke Tanimura (AIST, Japan)
  • Deployed and run
  • PRAGMA - AIST, NCSA, SDSC
  • TeraGrid - ANL
  • Working on deployment to EGEE, OSG and NorduGrid
  • Savannah Study
  • Lead: Colin Enticott (MU, Australia)
  • Infrastructure testing matrix
  • Cindy Zheng (SDSC, USA) and Somsak Sriprayoonsakul (KU, Thailand)
  • Use SCMSWeb

20
Lessons Learned
  • Differences among grids
  • Organization structures
  • Authentication (GSI, VOMS)
  • Job submission (GRAM, Gridftp)
  • Software stacks
  • Takes a lot of learning to understand
  • Takes a lot of work to interoperate
  • Resolved some problems (GSI-VOMS, GT2-GT4)
  • Stimulated new initiatives (Cross grid
    monitoring)
  • Learned some from each other (Community Software
    Area)
  • Many more still yet to work on (File sharing,
    easy user access, direction and standards)

21
Forward
  • More resources
  • More computational resources
  • GUCAS, China; UMC, USA
  • Add data resources (geo, astro, bio, ...)
  • Add sensor network resources
  • NCHC, Taiwan; BU, USA
  • More applications
  • Geoscience (Mian Liu, Huai Zhang)
  • More and better grid middleware
  • Credential management systems (SDSC, USA; Naregi, Japan)
  • Portals (NCHC, Taiwan; QUT, Australia)
  • Meta-schedulers (AIST, Japan; IHPC, Singapore)
  • More grid interoperation
  • More grids
  • More applications
  • More collaboration on grid technology research and development

22
One Possible Collaboration with AIST
23
Collaborations with GSCAS and CNIC
  • GSCAS (Shi Yaolin, Huai Zhang), U Missouri (Mian
    Liu), UCSD (Chaitan Baru, Cindy Zheng), CNIC (Kai
    Nan)
  • Develop PRAGMA/iGEON Node at GSCAS
  • Develop initial parallel finite element codes
  • Run geodynamics models on this cluster and then on the PRAGMA testbed
  • Ensure user-friendly Web interfaces to access and execute finite element codes developed in China on the PRAGMA grid as well as TeraGrid

24
Collaborations with U Hyderabad
  • Collaborators: A. Agarwal, K.V. Subbarao (U Hyderabad) and Chaitan Baru (UCSD)
  • Establish GEON node at U Hyderabad
  • Experiment with sharing data
  • Register new datasets

25
Other Collaborations
  • Exchanges
  • Calit2
  • Students (PRIME)
  • GEON willing to support this
  • OptIPuter

26
New Paradigm: Global Team Science
  • Kangwon U (B. Kim) - maintain Soyang; public policy
  • U. Wisconsin (T. Kratz) - maintain Trout Bog; lake metabolism
  • NCHC (F.P. Lin) - maintain YYL; parallelize codes
  • UCSD (F. Vernon, S. Peltier, T. Fountain, P. Arzberger) - ROADNet, Telescience; Moore Fnd, PRAGMA
  • NIGLAS (B.Q. Qin) - maintain Taihu; physical limnology
  • U. Waikato (D. Hamilton) - models
27
Thank You
Everyone is welcome to join us, either to build the grid and/or to run applications.
zhengc@sdsc.edu
http://goc.pragma-grid.net