Title: The UK Grid Report
1. The UK Grid Report
- Paul Jeffreys
- Rutherford Appleton Laboratory and PPARC
- Historical context
- Particle Physics initiatives in the UK
- UK Particle Physics computational grid initiatives
- Grid co-ordination across the UK
- UK Particle Physics grid organisation
- (Networking activities)
- Summary
2. Historical Perspective
- 1Q 1999
- Cashmore Panel looks into UK PP programme beyond 2005; recognised that substantial resources are needed for computing
- Based loosely on grid model
- Bid under preparation for LHC computing infrastructure in UK (to Joint Infrastructure Fund), but unable to reach agreement on details
- 3Q 1999
- Government announces that Spending Review brought forward by one year; submit Feb/Mar 2000, announcement Autumn 2000, covering 2001-3
- JIF LHC bid steered towards case for prototype grid in the UK
- Funding Council (PPARC, Jim Sadlier representative at workshop) recognises opportunity to bid against SR for computational grid, as a cross-disciplinary exercise
- Resources required for Particle Physics grid defined, based on analysis made for JIF bid
- 4Q 1999
- Join CERN co-ordinated European Grid project; crucially important, as it is impossible for UK to operate independently
- Every institute in UK signs up for JIF bid and it is submitted (vitally important, as it demonstrates the whole community is behind the initiative)
3. Particle Physics Initiatives
- BaBar experiment bid to Joint Research Equipment Initiative fund
- Prepared 1998, early form of grid model
- Awarded 800k (Jan 99), of which 650k funded kit worth 2M
- Forced to go out for re-tender; SUN won
- Equipment for 10 UK sites, RAL is Regional Centre
- One of three Regional Centres world-wide
- Mutual support between centres
- RAL received
- E4500 processor (6 CPUs), 4 GB memory and 4.5 TB disk (out of 12 TB total)
- Staff post to develop central database system (just started)
- Universities have not had effort to develop local facilities yet
- Effort is again a real issue
- LHCb JREI bid for MAP
- Liverpool bid
- Awarded 270k (Jan99), less than requested
- Purchase 300 PCs to operate LINUX farm
- 10^7 events simulated per week; just keep Ntuples (see the throughput sketch after this list)
- No opportunity for full interactive analysis
- Running since November 1999
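To put the MAP figures above in context, here is a back-of-envelope throughput sketch in Python; it simply divides the quoted 10^7 events per week across the 300 PCs, assuming (purely for illustration) that every node contributes equally and runs continuously.

```python
# Rough throughput check for the MAP Linux farm figures quoted above:
# ~10^7 simulated events per week spread over ~300 PCs.  Illustrative only;
# assumes every node contributes equally and runs around the clock.

EVENTS_PER_WEEK = 1e7
NODES = 300
SECONDS_PER_WEEK = 7 * 24 * 3600

events_per_node = EVENTS_PER_WEEK / NODES
seconds_per_event = SECONDS_PER_WEEK / events_per_node

print(f"Events per node per week: {events_per_node:,.0f}")
print(f"Implied time budget per event on one node: ~{seconds_per_event:.0f} s")
```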
4. Particle Physics Initiatives
- CDF/D0 experiments at Fermilab (1999)
- CDF: successful Joint Infrastructure Fund bid
- Submitted Dec 98, successful in Dec 99
- At FNAL for UK physicists: 10 TB disk, 4 SMP workstations
- At RAL: 5 TB disk, 5 TB tape, SMP workstation
- (Split possibly to be reconsidered)
- (Some provision for MINOS)
- 4 Mbps transatlantic connection: first funded connection!
- A little at 4 universities
- D0: successful JREI bid, Regional Centre at Lancaster
- Lancaster bid (access to Manchester/IC) submitted May 99
- COMPAQ industrial partner
- 60 GIPS of PC power, 24 TB Exabyte(!) store (5x less than disk), 1 TB
- 1 FTE for 2 years
- All facilities at Lancaster, none at FNAL; tape, not disk!
- Likely to be successful
5. Particle Physics Initiatives
- BaBar: JIF
- Submitted April 99, total value 3M
- Line to SLAC at 4 Mbps: 1.2M
- Computers to exploit equipment bought under JREI
- Big SMP at RAL (E6500, 8 CPU): 0.3M
- Six Linux simulation farms at Birmingham, Bristol, Manchester, QMW, Brunel and RHBNC: 0.4M
- Held over; expect to hear in spring 2000
- Summary
- Real funds for particle physics computing (new pots of money!)
- No national co-ordination possible (in fact the opposite is encouraged!)
- Good experience already in the Regional Centre for BaBar and Liverpool MAP, and in planning for other centres
- UK Particle Physics grid offers opportunity to give some coherence across national computing provision
- Same comment as Richard Mount: vital that these are not perceived as isolated activities to win funds, although there is a clear overhead in co-ordinating
6. LHC JIF bid for prototype grid
- LHC experiment bid to JIF
- Prototype Tier 1 centre, located at Rutherford Appleton Laboratory
- Agreed by community that prototype Tier 1 should be at RAL
- Will be notified in Autumn 2000 (possibly an indication earlier)
- Funds available across 2001, 2002 and 2003
- Total bid for 6M
- 3100 PC99, 125 TB disk, 0.3 PB tape, 50 Mbps network to Tier 0 (see the transfer-time sketch after this list)
- 10 staff at centre, 10 based in universities (staff are the crucial component)
- Staff need to be sold as software!!
- Prototype Tier 2 centres (Edinburgh, Liverpool)
- Look for close industrial participation
- (Collaboration with other scientific disciplines discussed below)
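As a rough sanity check on the quoted 50 Mbps Tier 1 to Tier 0 link, the sketch below estimates how long it would take to move the full 125 TB disk store at that rate; it is illustrative only, assuming 100% sustained utilisation, no protocol overhead and decimal terabytes.

```python
# Illustrative transfer-time estimate for the 50 Mbps Tier 1 to Tier 0 link
# in the JIF bid.  Assumes 100% sustained utilisation, no protocol overhead,
# and decimal TB (10^12 bytes).

LINK_MBPS = 50
DISK_TB = 125

bits_to_move = DISK_TB * 1e12 * 8
seconds = bits_to_move / (LINK_MBPS * 1e6)
days = seconds / 86400

print(f"{DISK_TB} TB over {LINK_MBPS} Mbps ~ {days:.0f} days of continuous transfer")
```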
7. Submission to be made to SR
- UK aiming for single Tier 1 centre to support all 4 LHC experiments
- Analysis indicates community not large enough to support >1
- Tier 1 assumed to support 100 physicists in 4 working groups
- Start-up (see the resource summary sketch after this list):
- Processors: 14400 PC99 equivalents
- Disk: 296 TB
- Tape: 2.0 PB
- Networking (to Tier 0): Gbps
- Effort: increasing to 35 SY (10 of which in universities)
- 9 SY experiment-specific software and applications
- 9 SY middleware
- 17 SY lower level, networking, 24x7 operation
- Not hard-wired; we recognise that middleware is the major challenge
- Crucial point: no new funds for grid specifically until April 2001
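For reference, the start-up profile above can be collected into a small bookkeeping sketch; the figures are copied straight from the slide, and the only check performed is that the effort breakdown (9 + 9 + 17 SY) sums to the stated 35 SY.

```python
# Hedged summary of the SR start-up resource request quoted above, plus a
# check that the effort breakdown matches the stated total.  Bookkeeping only.

startup_resources = {
    "processors_pc99_equiv": 14400,
    "disk_tb": 296,
    "tape_pb": 2.0,
    "network_to_tier0": "Gbps",
}

effort_sy = {
    "experiment-specific software and applications": 9,
    "middleware": 9,
    "lower level, networking, 24x7 operation": 17,
}

total_sy = sum(effort_sy.values())
assert total_sy == 35, "breakdown should match the stated 35 SY"
print(startup_resources)
print(f"Total effort: {total_sy} SY (10 of which in universities)")
```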
8. Cross-disciplinary Developments
- Recognised widely (OST) that national computational grid developments must be science driven: E-science
- Hence SR bid initiated and co-ordinated by Research Councils
- SR bid titled "Information Technology and the UK Science and Engineering Base"
- Meeting held 17 January, essentially to develop a research IT green paper
- Particle Physics
- BioScience
- Environmental
- Physical Sciences
- Very marked contrast between PP and the rest: we have a well specified problem on a well defined timescale, which has to be addressed, and a coherent community
- Almost embarrassingly good response for PP!
- Meeting encourages
- PP activity on grid as presented; sees PP grid as prototype for others (avoids question of combining and diffusing activities)
- Bio-science community
- Environmental community, but recognised as more diffuse (in UK)
9. Cross-disciplinary Developments
- Propose meetings between infrastructure providers (CLRC/UKERNA/HPC centres) and scientific disciplines
- PWJ asked to co-ordinate PPA with infrastructure communities
- Combined with BioScience, 13 March 2000
- Computer Scientists and Industry invited
- Key gathering
- Membership of Wellcome IT Management Team
- PPARC organising trip to CERN for companies interested in Grid
- 9-10 March
- Jim Sadlier
- Links being made through CLRC
- For example, the Computing Science and Engineering group have links with Bio-Informatics people who are keen to try out some applications on a grid
- Benefit of a multi-disciplinary institute
10. Particle Physics Grid Co-ordination
- PPA met on 24 January to prepare initial work schedule
- Desire to integrate Astronomy side (as appropriate)
- Key aspect of future plans is to work closely with CERN, the Data-Intensive Computational Grid team and others (Globus?)
- Participate in testbed activities (EU grid project)
- Identify areas in which UK can work which complement other activities
- Evaluate middleware and look to contribute
- Investigate QoS and other network issues
- Next months will be used to define action plan (as shown for PPDG)
- Half-day UK Town Meeting, 15 March 2000
- Meeting for UK LHC collaborations and community
- Key-note talk from Richard Mount
- Objective: inform PP (and A) community of all developments, and in particular seek their support
- Recruit people to work on Grid from universities
- Important as developments are rapid
11. Particle Physics Grid Co-ordination
- CLRC Grid Team formed
- Commissioned 27 January
- Within Particle Physics, effort squeezed out from existing activities
- IT Department: re-direct some effort paid for by PP community
- Computational Science and Engineering support (area of common interest)
- Looking for (3-5) + (2-3) + (1-2) SY from these three sources (intended to be at least 7 SY)
- Hoping to recruit effort; need some people working most of their time on this
- Team will build up and naturally move into the JIF era (to the level of 10 + 10 staff?)
- UK grid management team being formed
- Consists of representatives from possible Tier 2 centres and sites representing successful computing bids, plus CLRC, plus collaboration representatives
- Oversee activities on national level
- Given that substantial resources are likely to become available, it is crucial that they are managed correctly, transparently and coherently
12. Networking
- UK has had an active interest in networking for many years
- Recently worked closely with Les Cottrell on ICFA monitoring
- Sophisticated network monitoring since c.1996 (used by UKERNA!); a minimal monitoring sketch follows this list
- Number of interesting networking developments and tests
- Need to commission 4 Mbps RAL to Chicago (non-trivial, as transatlantic connection terminates in London)
- Hopefully will have funds for 50 Mbps connection to CERN
- Tests
- QoS tests from US to UK
- Presently set up from Abilene to JANET (CAR, WRED), incoming only
- But we need to get into Abilene (Les looking into this)
- Will also have ESnet to JANET priority (promised since last summer)
- Managed Bandwidth to CERN
- Commissioned UKERNA to set up PVC over SuperJanet and TEN-155
- Should be set up by the summer
- Discovered that the Japanese link to CERN goes via London with MB!!
- Managed bandwidth to BaBar university sites from RAL (JCG)
- PVC over ATM, but may be limited to SuperJANET3
13. Summary
- Exciting time in the UK
- New pots of money, and potential funds for grid from 2001
- Support in high places for grid development
- Particle Physics encouraged to blaze the trail
- Very keen to do this, but in close collaboration with rest of PP community
- Responsibility to encourage other disciplines in the process
- Intend to establish close partnership with industry from the beginning
- Seen at many levels as
- A great opportunity... and a threat!
14. PPARC: Overview of Near-future Activities
[Overview diagram; boxes include:]
- PPARC-Collaboration Link: HI-phi meeting, FORUM; more required?
- IT Focus for UK Research Community
- Links to US: PPDG (talk), Globus
- IT kit from bids: BaBar, CDF, ...
- CERN Co-ordination: Framework V
- Computer Scientists: Tony Hey
- LHC Computing Challenge: JIF???
- PPARC Planning
- JT/OST Link: SR2000 Work Groups
- Industry
- Other Disciplines
- Tier 2 centres: Edinburgh, Liverpool, ...
- LHC Computing Review
- UKERNA: JIF bid, QoS tests to US, SJ-4, GEANT
- MONARC: contribute
- Egrid, other initiatives
- Bio-informatics