Title: OSG Integration
1 OSG Consortium Meeting, 21st August, University of Washington, Seattle
Ruth Pordes, Executive Director
University of Washington, Seattle
2 Thank you to our hosts.
3 The Context
- OSG has few face-to-face meetings. The main ones are the semi-annual Consortium meetings.
- So we use this meeting to
  - Review what has happened over the past six months.
  - Hear from our contributors and partners on some aspects of mutual interest.
  - Share and communicate our goals and plans.
  - Move the technical program of the OSG forward.
- The goal is that we all listen, discuss, complain and construct.
The goal is to make the OSG the "A" Grid.
4 This week we will...
- Hear from our Contributors about the science from current use of the OSG.
- Engage with new communities and partners.
- Hear from our partners about Campus and peer Grid Infrastructures.
- Get updates on the status of OSG, the Facility and the Education programs.
- Cover much about Security: risk, responsibility and tracking for OSG itself, Virtual Organizations, Sites and interfaces to peer grids.
- Have sessions where people share their experiences and learn from each other: Site, Facility and VO Administrators, Users, etc.
- Have a few Demonstrations tomorrow.
- Discuss and decide on aspects of the short-term Technical Program:
  - Make decisions for the OSG 0.6.0 software release and associated VDT releases.
  - Plan the next steps in Information Services, Information Management, Data Management, Workload Management, Education, Inclusion, Communication.
5 First, the recap...
6 What we have not gotten done since the last Consortium meeting
- Site Space management, so that VOs and Sites have robust, managed shared storage and space services.
- Simple metrics of use and accounting not deployed uniformly across sites.
- Fixed the severe lack of robustness (and lack of simplicity of use) of the Authz components, especially GUMS.
- VOs knowing and negotiating with Sites to really support their applications.
- Changed the perception that OSG is mainly physics. Why is this? GADU has had only one major run; one suite of math jobs from the Football Pool problem; Nanohub jobs still undergoing troubleshooting.
7 What we have Accomplished
- Sustained and operated the OSG to the benefit of >15 user organizations (during the ramp-down of funded Grid projects and uncertainty about the future).
- Increased the robustness and scalability of the Compute Element head node by implementing a managed-fork queue and NFS-less shared disk areas.
- Made effective contributions to WLCG (grid and application) Service Challenges as the US common infrastructure for ATLAS and CMS.
- Released 3 versions of the Virtual Data Toolkit and released OSG 0.4.1.
- Increased the average number of jobs on OSG by 1/3 (2000 -> 3000) and the number of Sites (Compute and Storage) by a few. Increased LIGO and STAR jobs by a few.
- Written a comprehensive Security Risk Assessment of the OSG and an initial security plan.
- Achieved federation of local-area Campus Grids bridging to wide-area cyber-infrastructures.
- Run the 4th successful Summer Grid Workshop with >45 students who actually adapted their own applications to run across multiple remote sites.
- Released the OSG RA.
8 32 Virtual Organizations - participating Groups
- 3 with >1000 jobs max (all particle physics)
- 3 with 500-1000 max (all outside physics)
- 5 with 100-500 max (particle, nuclear, and astro physics)
9 GridCat continues to list the Resources
10 More Documentation
11 And... oh yes...
- The OSG proposal was finished and submitted to DOE and NSF in February/March. In June the funding agencies asked for a revised budget with more than a 25% reduction, to $6M/year (33 FTE/year of effort). We are going ahead with our planning on the assumption that funding is coming in the next few months.
- The project will be accountable in its promises and deliverables to the funding sources; we expect to be reviewed on our project plan and milestones 3 months after funding. We will have Expectations of the project, which will be documented, tracked and planned across the funded effort.
For the next 5 years we must be serious about
delivering the fundamental Production Quality
Facility for our Users.
12 The OSG Project and its part in the OSG Consortium
13 The OSG Project will deliver
- Maintenance of an expanding Production Distributed Facility which provides core capabilities in Operations, Security, Software Releases, the Virtual Data Toolkit and Troubleshooting, and supports the Engagement of new communities to use the OSG.
- Education, Training and Outreach, with a core focus on expanding the number, location and scope of the Grid Schools and web-based training.
- User Support, Service and System Extensions for increased scale, performance and capability of the common services for the stakeholder VOs.
- OSG Staff: executive director, project consultant and administrative help; contributions to International Science Grid This Week; and technical documentation for training system administrators, users and VO administrators.
14 OSG Project Execution Plan (PEP) - FTEs
15 OSG PEP - Organization
16 Part of the OSG Consortium
(Diagram: the Project and the Contributors within the Consortium)
17 OSG PEP - High Level Milestones
2006Q3 Release OSG software stack version 0.6.0
2006Q3 Project baseline review
2006Q4 Sign off on OSG Security Plan.
2006Q4 Meet operational metrics for 2006.
2007Q1 Accounting reports available for users and resource owners.
2007Q2 Production use of OSG by one additional science community.
2007Q2 OSG-TeraGrid software releases based on same NMI software base.
2007Q2 Release OSG software version 0.8.0; complete extensions for LHC data taking.
2007Q2 Support for ATLAS and CMS data taking.
2007Q3 1 year Project Review.
2007Q4 Meet 2007 deliverables as defined by science stakeholders.
2007Q4 Meet operational metrics for 2007.
2007Q4 Release OSG software version 1.0
2008Q2 Production use of OSG by 2 additional science communities.
2008Q3 OSG-TeraGrid production service interoperation.
2008Q3 2nd year Project Review.
2008Q4 Meet 2008 deliverables as defined by science stakeholders.
2008Q4 Meet operational metrics for 2008.
2009Q2 Support for all STAR analysis (10,000 jobs/day).
2010Q1 Support for data taking with order of magnitude increase in LIGO sensitivity.
18 OSG PEP - Security, Safety, Risk Management
The OSG Facility assesses, monitors and responds to security issues; the Security Officer coordinates these activities. Each site, user and administrator has responsibility for local security and for reporting any incidents that occur. The OSG will have a comprehensive security plan modeled on the NIST process. While Environment, Safety and Health (ESH) remains the responsibility of the owners of the resources made accessible to the Open Science Grid, the project organization will pay attention to ESH issues during travel, meetings and OSG activities.
19 OSG Project Effort Distribution Year 1
Each Institution will have a signed Statement of
Work (MOU). Each individual will submit open
monthly written reports. Finance Board will
review the accounts and deliverables. Executive
Board will review the plans and
achievements. Activities will be covered by the
Project Plan and WBS. Effort distribution will
be reviewed and potentially modified each year.
20 Must Support LHC and LIGO Scaling circa 2008-2009
- Data distribution must routinely exceed 1 GB/sec at 10-20 sites.
- Workflows must support >10,000 batch jobs per client.
- Jobs/day must exceed 20,000 per VO with a >99% success rate.
- Accessible storage >10 PB.
- Facility availability/uptime must be >99.x% with no single points of failure.
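To make these targets concrete, here is a minimal back-of-envelope sketch in Python (not from the slide; the assumption that the 1 GB/sec figure is an aggregate rate across the 10-20 sites is mine):

# Back-of-envelope check of the 2008-2009 scaling targets listed above.
# The per-day and per-site figures are illustrative arithmetic only.
data_rate_gb_per_s = 1.0      # sustained data distribution, GB/sec (assumed aggregate)
sites = 10                    # lower end of the 10-20 site range
jobs_per_day_per_vo = 20000   # required job throughput per VO
success_rate = 0.99           # required job success rate

seconds_per_day = 24 * 60 * 60
daily_volume_tb = data_rate_gb_per_s * seconds_per_day / 1000.0   # GB/day -> TB/day
failures_per_day = jobs_per_day_per_vo * (1.0 - success_rate)

print(f"~{daily_volume_tb:.0f} TB moved per day in aggregate")
print(f"~{daily_volume_tb / sites:.1f} TB per day per site at {sites} sites")
print(f"~{failures_per_day:.0f} failed jobs per VO per day to track at 99% success")

At 1 GB/sec this works out to roughly 86 TB/day in aggregate, so sustained multi-TB daily volumes per site, and even a 99% success rate at 20,000 jobs/day leaves about 200 failures per VO per day to diagnose.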
21 Year 1 OSG WBS and the Plans
Bakul Banerjee, Project Consultant, is working with all coordinators to complete plans and schedules. Plan for an initial review at Thursday's Council meeting; ready for review by the Science Advisory Council and/or external reviewers in 3 months.
22 Operations, Security, Troubleshooting, Software
Expect to re-plan. All plans allow for unanticipated opportunities. Make plans help, not hinder.
23 Continued focus on OSG Core Competencies
- Integration: Software, Systems, Virtual Organizations.
- Operations: Common Support, Grid Services.
- Inter-Operation: Bridging Administrative and Technical Boundaries.
With Validation, Verification and Diagnosis at each step, and with Integrated Security Operations and Management.
24 Reminder of the S/W Stack and Deployment Life-Cycle
OSG project funding for VDT will enable more
storage and data management services to be
included in the future.
VDT increases the effectiveness of Condor and
Globus by integrating them with the other
services needed for fully functional
Cyber-environments.
25 OSG Project Does Not Do Software Development
(Diagram: software flows from external development, through OSG Facility provisioning, to operation)
- Software development happens in the Applications and external projects: Condor, Globus, EGEE-JRA1, dCache, SRM, US LHC SC, LIGO PIF, Security for Open Science, Center for Distributed Science, etc.
- OSG Facility Provisioning: VDT, Integration, Validation, System Integration Testbed -> Ready -> Release.
- OSG Facility Operation: Operations, Maintenance, Engagement, Support.
26 1, 2, 3 - Join OSG
Core Operations and Common Support:
1. A VO registers with the Operations Center. A user registers with a VO.
2. Sites register with the Operations Center.
3. VOs and Sites provide a Support Center contact and join Operations groups.
The OSG VO:
1. A VO for individual researchers and small groups.
2. Managed by the OSG itself.
3. Where one can learn how to use the Grid!
27 Grid of Grids - from local to global
- Science Community Grids, e.g. LIGO, STAR, D0
- National and International Infrastructures for Science, e.g. TeraGrid, EGEE, NAREGI
- Campus and Regional Infrastructures, e.g. CrimsonGrid, GLOW, NWICG
- CS/IT Campus Grids
28 Grid of Grids - OSG is one grid among many
Users must be able to operate transparently across Grid boundaries. The OSG program of work focuses on Interoperability and Bridging of data and jobs across these boundaries.
(Diagram: OSG as one of many interconnected grids)
- Science Community Grids, e.g. LIGO, STAR, D0
- National and International Infrastructures for Science, e.g. OSG, TeraGrid, EGEE, NAREGI
- Campus and Regional Infrastructures, e.g. CrimsonGrid, GLOW, NWICG
- CS/IT Campus Grids
29 International Science Grid This Week (iSGTW)
Katie Yurkewicz has made a recognised success of SGTW over the past 18 months. Subscriptions are above 1200. The EU has approached the US to have a joint international weekly newsletter; the EU contacts are Hannelore Hammerle (EGEE) and Francois Grey from CERN IT. OSG will contribute half of the editorial effort to this through its staff. Paul Avery will be our direct contact. iSGTW will be launched in a couple of months. Keep the articles coming!
30 Welcome to those from near and far
- Most far:
  - Bob Jones - Director of Enabling Grids for E-sciencE-II, CERN.
  - Simon Lin - Director, Computing Center, Academia Sinica, Taiwan.
  - Kazushige Saga - NAREGI, Tokyo Institute of Technology.
  - Dave Kelsey - Chair of the Joint Security Working Group, Rutherford Appleton Laboratory, England.
  - Sergio Andreozzi - gLite-JRA1 and INFN.
- Most near - University of Washington, Seattle, and nearby:
  - David Baker, Richard Coffey, James DeRoest, Tony Hey, Margaret Romine, Oren Sreenby, Gordon Watts
31 I look forward to a thoughtful and productive meeting and discussions with you all.
OSG is For the Community, By the Community,
Throughout the Community.