Title: The UK eScience Grid and other real Grids
1 The UK eScience Grid (and other real Grids)
NIEeS Summer School 2003
2 The Grid in the UK
Pilot projects in particle physics, astronomy,
medicine, bioinformatics, environmental
sciences...
Contributing to international Grid software
development efforts
10 regional eScience Centres
3 Some UK Grid resources
- Daresbury - loki - 64-proc Alpha cluster
- Manchester - green - 512-proc SGI Origin 3800
- Imperial - saturn - large SMP Sun
- Southampton - iridis - 400-proc Intel Linux cluster
- Rutherford Appleton Lab - hrothgar - 32-proc Intel Linux
- Cambridge - herschel - 32-proc Intel Linux cluster
- ...
- Coming soon: 4x >64-CPU JISC clusters, HPCx
4 Applications on the UK Grid
Ion diffusion through radiation-damaged crystal structures
(Mark Calleja, Earth Sciences, Cambridge)
- Monte Carlo simulation: lots of independent runs
- small input/output
- more CPU -> higher temperatures, better stats
- access to 100 CPUs on the UK Grid
- Condor-G client tool for farming out jobs (sketched below)
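A minimal sketch of this farming pattern, assuming a hypothetical gatekeeper host and executable name (the submit-file keywords are standard Condor-G; everything site-specific here is made up):

    # farm_jobs.py - sketch: farm 100 independent Monte Carlo runs out via Condor-G.
    import subprocess

    N_RUNS = 100
    # Hypothetical Globus gatekeeper; a real one would be a UK Grid site.
    GATEKEEPER = "grid-compute.example.ac.uk/jobmanager-pbs"

    submit = f"""\
    universe        = globus
    globusscheduler = {GATEKEEPER}
    executable      = mc_diffusion
    arguments       = --seed $(Process)
    output          = run.$(Process).out
    error           = run.$(Process).err
    log             = farm.log
    queue {N_RUNS}
    """

    with open("farm.submit", "w") as f:
        f.write(submit)

    # condor_submit queues all runs locally; Condor-G forwards each one to the
    # remote gatekeeper as a Globus job.
    subprocess.run(["condor_submit", "farm.submit"], check=True)

Each queued job gets its own $(Process) number, so the runs differ only in their random seed, which suits independent Monte Carlo work well.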
5 Applications on the UK Grid
GEODISE - Grid Enabled Optimisation and Design Search for
Engineering (Simon Cox, Andy Keane, Hakki Eres, Southampton)
- Genetic algorithm to find the best design for satellite truss beams (sketched below)
- Java plugins to MATLAB for remote job submission to the Grid
- Used CPU at Belfast, Cambridge, RAL, London, Oxford, Southampton
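The slides do not show GEODISE's own code; below is a generic sketch of the genetic-algorithm loop being described, with a made-up analytic fitness function standing in for the expensive truss-beam evaluation that GEODISE farms out to the Grid:

    # ga_sketch.py - generic genetic-algorithm loop of the kind GEODISE uses.
    # The fitness function is a stand-in: in GEODISE each evaluation would be
    # a remote engineering analysis job submitted to the Grid.
    import random

    POP, GENS, DIM = 20, 50, 4

    def fitness(design):
        # Placeholder objective: pretend the optimal parameter vector is all ones.
        return -sum((x - 1.0) ** 2 for x in design)

    def crossover(a, b):
        return [random.choice(pair) for pair in zip(a, b)]

    def mutate(design):
        return [x + random.gauss(0.0, 0.1) for x in design]

    population = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(POP)]
    for gen in range(GENS):
        # On a Grid, this scoring step is the embarrassingly parallel part:
        # each candidate design can be evaluated on a different CPU.
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[: POP // 2]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(POP - len(parents))]
        population = parents + children

    print("best design found:", max(population, key=fitness))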
6 Applications on the UK Grid
RealityGrid (Stephen Pickles, Robin Pinning - Manchester)
- Fluid dynamics of complex mixtures, e.g. oil, water and solid particles (mud)
- Used CPU at London, Cambridge
- Remote visualisation using an SGI Onyx in Manchester (from a laptop in Sheffield)
- Computational steering (see the sketch below)
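Computational steering means adjusting a simulation's parameters while it is still running, rather than stopping and resubmitting. A toy sketch of the idea (the control-file mechanism and parameter name are invented for illustration; RealityGrid's actual steering library has a richer API):

    # steer_sketch.py - toy computational-steering loop.
    # The running simulation polls a small control file every few steps;
    # editing that file (locally or over the network) steers it live.
    import json, os

    CONTROL_FILE = "steer.json"   # hypothetical, e.g. containing {"viscosity": 0.05}
    viscosity = 0.1               # initial parameter value

    for step in range(100000):
        # ... advance the fluid simulation one timestep using `viscosity` ...
        if step % 100 == 0 and os.path.exists(CONTROL_FILE):
            with open(CONTROL_FILE) as f:
                viscosity = json.load(f).get("viscosity", viscosity)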
7 Applications on the UK Grid
GENIE - Grid Enabled Integrated Earth system model
(Steven Newhouse, Murtaza Gulamali - Imperial)
- Ocean-atmosphere modelling
- How does moisture transport from the atmosphere affect ocean circulation?
- 1000 independent 4,000-year runs (3 days real time!) on 200 CPUs (see the estimate below)
- Flocked Condor pools at London and Southampton
- Coupled modelling
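A rough consistency check on those numbers, assuming even load balance across the flocked pools: 1000 runs on 200 CPUs is 5 runs per CPU, so a 3-day wall-clock campaign implies roughly 72 / 5, about 14 hours, per 4,000-year run.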
8 Two years to get this far...
July 2001 - Regional eScience Centres funded
October 2001 - First meeting of the Grid Engineering Taskforce (biweekly meetings using Access Grid)
August 2002 - Level 1 Grid operational (simple job submission possible between sites)
April 2003 - Level 2 Grid applications (security, monitoring, accounting)
July 2003 - Level 3 Grid - more users, more robust
9 The European DataGrid
- Tiered structure: Tier 0 = CERN
- Lots of their own Grid software
- Applications: particle physics, earth observation, bioinformatics
- http://www.eu-datagrid.org/
10 NASA Information Power Grid
- First production-quality Grid
- Linking NASA and academic supercomputing at 10 sites
- Applications: computational fluid dynamics, meteorological data mining, Grid benchmarking
- http://www.ipg.nasa.gov/
11 TeraGrid
- Linking supercomputers through a high-speed network
- 4x 10 Gbps links between SDSC, Caltech, Argonne and NCSA
- Call for proposals out for applications and users
- http://www.teragrid.org/
12 Asia-Pacific Grid
- No central source of funding
- Informal, bottom-up approach
- Lots of experiments on benchmarking and bio apps
- http://www.apgrid.org/
13 What does it take to build a Grid?
- Resources - CPU, network, storage
- People - sysadmins, application developers, Grid experts
- Grid Middleware - Globus, Condor, Unicore (see the sketch after this list)
- Security - so you want to use my computer?
- Maintenance - ongoing monitoring, upgrades and co-ordination of this between multiple sites
- Applications and users!
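To make the middleware bullet concrete: with a Globus 2-era client installed and a Grid certificate in place, the classic first test is running "hostname" on a remote machine. A minimal sketch, assuming a hypothetical gatekeeper (grid-proxy-init and globus-job-run are real Globus Toolkit commands; the host name is made up):

    # grid_hello.py - the classic first Grid test, driven from Python.
    import subprocess

    GATEKEEPER = "grid-compute.example.ac.uk"  # hypothetical gatekeeper host

    # Create a short-lived proxy credential from your Grid certificate
    # (this prompts for the certificate passphrase).
    subprocess.run(["grid-proxy-init"], check=True)

    # Run a trivial command on the remote resource; if this prints the
    # remote machine's host name, authentication and submission both work.
    subprocess.run(["globus-job-run", GATEKEEPER, "/bin/hostname"], check=True)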
14 How you can get involved...
- NIEeS
- National eScience Centre (Edinburgh)
- http://www.nesc.ac.uk/
- NERC PhD studentships
- Your local eScience Centre
- Adopt an application!
15 Questions?