Title: GriPhyN and iVDGL Progress
1. GriPhyN and iVDGL Progress
Paul Avery, University of Florida
http://www.phys.ufl.edu/avery/
avery@phys.ufl.edu
HICB Meeting, GGF7, Tokyo, March 3, 2003
2. NSF Review of GriPhyN (1/2003)
- Major mid-term review Jan. 29-30
- 5 NSF people, 6 reviewers (3 CS, 3 physics)
- High visibility of GriPhyN: largest NSF/ITR program funded
- GriPhyN mentioned in many NSF announcements, reports
- Oral report uniformly excellent
- Research, E/O, impact, collaboration
- Door opened for supplemental funding for VDT
- Written report not yet available
- Expect mid-term review of iVDGL late 2002 or early 2003
3. Several VDT Releases in 2002
- Continuing feedback from users
- Testbeds, individuals, EDG, LCG
- Result: much simpler installation, better post-install configuration
- VDT 1.1.5 released just before SC2002
- Massive improvements in usability
- Globus 2.0 patches, new job manager
- Condor and Condor-G 6.4.3
- GDMP 3.0.7
- Fault Tolerant Shell
- EDG's CRL-Update, mkgridmap
- DOE EDG CA Certificates
- UW-Madison team increased by one person
- Alain Roy (GriPhyN)
- Carey Kireyev (iVDGL), VDT support person
4. VDT and Pacman News
- Recent VDT deployments
- US-Atlas Testbed
- US-CMS Testbed
- Non-HEP sites (LIGO, SDSS)
- WorldGrid and all GriPhyN demos at SC2002
- VDT 1.1.7 being readied
- Additional post-install configuration
- Chimera Virtual Data System (1.0)
- Pegasus planner?
- Support for EDG components
- Support for LCG components
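The deployments above were all driven from Pacman caches. A minimal sketch of how a site of this era might pull in the VDT client; the cache URL and package name are illustrative assumptions, not the exact VDT 1.1.x values, so check the VDT release notes for your version:

```shell
# Hypothetical cache URL and package name; the exact values depend on
# the VDT release you are installing.
pacman -get http://vdt.cs.wisc.edu/vdt_cache:VDT-Client

# Pacman writes a setup script into the install area; source it to put
# the Globus and Condor-G client tools on your PATH.
source ./setup.sh
```

Server installs followed the same pattern with a server package; Pacman's appeal, reflected throughout these slides, was that a single command handled download, dependencies, and post-install configuration.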
5. VDT Plans and Challenges
- Integration with NMI for core components
- Globus, Condor
- Timeline for new internal products
- NeST, DRM, Stork and DAGMan-2
- New schedulers, etc.
- Products from external software contributors
- Work out software practices, packaging, validation, testing
- Demands for hardening and support services
- Easy auto-config for non-expert or lite users
- Longevity of VDT
- Short term: GriPhyN supplement?
- Long term: NSF Cyberinfrastructure program?
6. ATLAS Simulations on iVDGL Resources
Joint project with iVDGL
7. US-CMS Grid Testbed
8. Grid Middleware (VDT 1.1.3)
- VDT Client
- Globus Toolkit 2.0
- Condor-G 6.4.3
- VDT Server
- Globus Toolkit 2.0
- mkgridmap
- Condor 6.4.3
- ftsh
- GDMP 3.0.7
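The client/server split above can be made concrete with a Condor-G submit description file of the period. This is a sketch only: the gatekeeper host and executable names are hypothetical, but the `universe = globus` / `globusscheduler` form is the Condor-G 6.4-era way of routing a job through a Globus 2.0 gatekeeper into the server's local Condor pool:

```
# Condor-G submit description file (sketch; hostnames and executable
# are hypothetical)
universe        = globus
# Globus 2.0 gatekeeper on the VDT server, handing off to local Condor
globusscheduler = gatekeeper.example.edu/jobmanager-condor
executable      = cmsim_wrapper.sh
output          = job.out
error           = job.err
log             = job.log
queue
```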
- Virtual Organization Management
- LDAP Server deployed at Fermilab
- Contains the DNs for all US-CMS Grid Users
- GroupMAN (from PPDG and adapted from EDG) used to manage the VO
- Investigating/evaluating the use of VOMS from the EDG
- Use D.O.E. Science Grid certificates
- Accept EDG and Globus certificates
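The VO pieces above fit together roughly as follows: the LDAP server at Fermilab holds the member DNs, and a tool such as mkgridmap (or GroupMAN) periodically turns them into the gatekeeper's grid-mapfile. A sketch with illustrative values; the LDAP URI, DN components, and account name are assumptions:

```
# mkgridmap.conf (sketch): map members of a US-CMS VO LDAP group to a
# shared local account. URI and DN components are illustrative only.
group ldap://grid-vo.example.fnal.gov/ou=uscms,o=cms,dc=example,dc=org  uscms01

# Resulting grid-mapfile entries, one per member DN, e.g.:
# "/O=doesciencegrid.org/OU=People/CN=Jane Physicist" uscms01
```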
9. US-CMS Grid Testbed Results
- Early CMS testbed (Mid-2002)
- Identified many Globus, Condor bugs and features
- 150K events
- Development Grid Testbed (DGT)
- 40 CPUs
- Integrated Grid Testbed (IGT)
- 400 CPUs
- Production run for IGT (MOP), Oct.-Dec. 2002
- Assigned 1.5M events for eGamma, Bigjets
- Finished after two months: 30 CPU-years
10. WorldGrid
- Collaboration between US and EU grid projects
- Starting from GLUE efforts
- http://www.ivdgl.org/worldgrid/
- Use shared global resources across experiment Grid domains
11. WorldGrid Toolkit
- Complete environment including software configuration
- Grid middleware
- Resource information service: publishing and registration
- VO monitoring of job execution and site status
- VO management
- Provide uniform environment for applications
- ScienceGrid: location for application software
- Workspace on local or dedicated fileserver
- First implementation for 2002 demonstration projects
- Participation by 16 institutions from 7 countries
- Applications: CMS, ATLAS, Sloan from US and EU
- Automation of configuration incomplete
- Not production quality
12. WorldGrid Details
- Infrastructure development project
- Initially 18 sites, 7 countries, 130 CPUs (varied)
- Common information index server: Globus, EDG, GLUE schema
- Interoperability components
- EDG schema and IP providers for VDT servers
- UI and JDL for EDT sites
- First steps towards policy instrumentation, monitoring
- Submission portals: ATLAS-Grappa, EDG-Genius, CMS MOP master
- VO-centric grid monitoring: Ganglia and Nagios-based
- Packaging
- With Pacman (VDT sites) and RPMs/LCFG (DataTAG sites)
- ScienceGrid
- ATLAS, CMS, SDSS and LIGO application suites
13. The WorldGrid Caches
14. WorldGrid at SC2002, IST2002
15. Recent WorldGrid Deployments
[Map of WorldGrid resource sites: SKC, Boston U, Michigan, Argonne, Chicago, BNL, LBL, Indiana, HU, OU, UNM, KEK (ATLAS), UF, UTA, SMU, UTB]
16. Future WG Project Interface?
[Mock-up of a WorldGrid control interface: help / pause / stop; projects: CMS trigger studies, Sloan Sky Survey, LIGO gravity wave search, BU space weather, D0 top search, CDF all-jets analysis; 890 CPUs]
17. WorldGrid Development
- WorldGrid cache restructured
- Latest VDT client and server
- Ganglia based monitoring sensors
- Simplified VO mechanism and policy
- ScienceGrid: remove SC applications
- Now used as a mechanism to download projects
- New Pacman tools to manage projects at site
18. WorldGrid Development II
- Development projects
- Improved Pacman support for native installations
- Improvement and refinement of architecture
- Complete automation of installation
- VO management, with US CMS
- Site management, user forum
- Chimera/VDT testbeds
- Operations
- RLS Café service for Chimera evaluation, testing
- Production monitoring service
- Project validation service
- Multi-Service Sites
- Several US LHC sites wish to participate in LCG-1
- Create and support LCG-1 Pacman cache and environment
19. HEP-Related U.S. Grid Developments
- Dynamic Workspaces proposal (ITR program, $15M)
- Extend virtual data technologies to global analysis communities
- Collaboratory proposal (ITR program, $4M)
- Develop collaborative tools for global research
- Combination of Grids, communications, sociology, evaluation
- BTeV proposal (ITR program, $4M)
- Would join iVDGL
- Creation of CHEPREO in Miami area ($12M)
- New CMS group, minority E/O, WorldGrid/iVDGL
- International network to Brazil / South America
- Other proposals
- ITR, MRI, NMI, SciDAC, others