1
Applications Area
  • SPI, ROOT, POOL, Simulation

2
Overview
  • SPI: Software process infrastructure
  • Software and development services: external
    libraries, Savannah, software distribution,
    support for build, test, QA, etc.
  • ROOT: Core libraries and services
  • Foundation class libraries, math libraries,
    framework services, dictionaries, scripting, GUI,
    graphics, SEAL, etc.
  • POOL: Persistency framework
  • Storage manager, file catalogs, event
    collections, relational access layer, conditions
    database, etc.
  • SIMU: Simulation project
  • Simulation framework, physics validation studies,
    MC event generators, Garfield, participation in
    Geant4, Fluka, etc.

3
Presentations
  • Pere Mato: overview, including SPI and most of
    Simulation
  • Rene Brun: ROOT
  • Giacomo Govi: POOL
  • Paolo Bartalini: simulation generator subproject
  • No detailed talks on SPI or on the rest of the
    Simulation project

4
AA Structure
[Organisation diagram: the four experiments (ALICE,
ATLAS, CMS, LHCb), the MB and the LHCC (work plans,
quarterly reports, reviews, resources) feed into the
Architects Forum and the AA Manager; decisions from
the Application Area Meeting steer the LCG AA
projects (SIMULATION, ROOT, POOL, SPI, each with its
work packages) and the external collaborations
(Geant4, EGEE, ROOT)]
  • Many decisions are now made in the Architects
    Forum
  • The experiment-driven approach seems to be
    working

5
Effort resources in 2005
[Table of effort resources per project not
reproduced in this transcript]
  • Large numbers of FTEs
6
Phase II planning exercise
  • Process initiated in February
  • Had a round of discussions with PH management,
    project leaders, experiment architects,
    experiment representatives, etc.
  • Presentations to the experiments
  • AA internal review in March/April; its outcomes:
  • The proposed evolution plan is technically
    reasonable and supported by all experiments
  • The technical details of the plan should
    continue to be discussed and approved by the
    Architects Forum
  • Many discussions in the AF (during May-July)
  • Ironing out some of the problems in the SEAL/ROOT
    merger
  • The final plan approved by PEB (September)
  • Very positive outcome

7
Effort resources beyond 2005
  • Seems sufficient, but falling profile will make
    management harder
  • Effort, identified as THE major issue at the
    last review, now seems more stable
  • Must consolidate existing software, not start
    speculative new projects
  • Must put existing effort to optimal use; this
    point comes up several times in this review

8
SPI Software Process Infrastructure
  • Recent redefinition of its role
  • Reduce effort dedicated exclusively to SPI
  • Encourage direct participation from/to
    development projects
  • Service the LHC experiments more than the AA
    projects
  • Must provide services directly to experiments
  • Needs to change the focus
  • Cope with reduction of manpower
  • Participation from people associated with the AA
    projects
  • Optimization and automation of procedures
  • Prioritization of tasks
  • Not a lot of room for new developments
  • Work must be guided and prioritised by Architects
    Forum
  • Generic problem with software reviews
  • How to quantitatively measure progress in these
    items in future?

9
Future areas of work
  • Savannah service
  • Bug tracking, task management, download area, etc
  • >160 hosted projects, >1350 registered users
  • Users doubled in one year, an indication that
    Savannah is starting to be heavily used
  • Software services
  • Installation and distribution of software
    (external and LCG AA projects)
  • >90 external packages installed in the external
    service
  • Update and maintain configuration information for
    the build systems
  • Software development service
  • Provide tools for development, testing,
    profiling, QA
  • Provide scripts and documentation adapted to LCG
    context
  • Documentation
  • Maintain and improve existing web pages and
    automate content wherever possible

10
ROOT
  • Covers nine areas
  • BASE: foundation and system classes,
    documentation and releases
  • DICT: reflection system, meta classes, CINT and
    Python interpreters
  • I/O: basic I/O, Trees, queries
  • PROOF: parallel ROOT facility, xrootd
  • MATH: mathematical libraries, histogramming,
    fitting
  • GUI: graphical user interfaces and object
    editors
  • GRAPHICS: 2-D and 3-D graphics
  • GEOM: geometry system (ALICE only!)
  • SEAL: maintenance of the existing SEAL packages
  • Concentrate on two here
  • SEAL and PROOF
  • N.B. ROOT is both an internal and an external
    package
  • Some ambiguity: a potential (but not yet actual)
    issue of how to set priorities

11
SEAL and ROOT merger
  • The main motivations for the merger were
  • To optimize use of resources and ease the
    long-term maintenance by avoiding duplicate
    development
  • To provide a coherent set of products developed
    and maintained by AA for the experiments
  • To have the ROOT development fully integrated in
    the LCG organization (planning, milestones,
    reviews, resources, etc.)
  • N.B. No significant extra effort allocated for
    merger
  • Longer term, removal of duplication should reduce
    effort needed
  • Merge of the development teams into a single team
  • Happened immediately; the overall package is
    called ROOT
  • Merge of two sets of software into a single set
    of core libraries
  • Originally foreseen to be completed quickly
  • But some issues need to be resolved first

12
Merging functionality (sorted by priority)
13
Slow transition for experiments
[Schematic: over time/versions, the experiments'
software gradually shifts its dependence on
functionality from SEAL to ROOT]
  • This will not happen immediately
  • Experiments asked to continue maintenance of the
    SEAL libraries
  • Intend to implement changes gradually; this
    could take two (or more) years
  • Currently plan to maintain SEAL functionality as
    long as the experiments require it

14
Comments
  • Semi-indefinite parallel support for SEAL is a
    large commitment
  • Directly conflicts with the first motivation for
    merger, i.e. to remove duplication and hence
    reduce effort
  • Little incentive for experiments to convert
    (given how busy they are)
  • Disagreements will delay conversion of some parts
  • Other parts may never be converted
  • Both mean there is no clear replacement for
    experiments for some time
  • Recommendations
  • Converge on agreement for the contentious parts
    as soon as possible
  • Set strict milestones for conversion
  • Enforce them by deleting the existing SEAL
    libraries from releases after those dates
  • Experiments need to put in the effort at some
    point; now is better than 2007

15
PROOF
  • Basically just ROOT on a distributed system
  • But with interactive and batch commands, status
    information, etc.
  • Potential to interact directly with event data

16
PROOF status
  • Allows analysis of ntuples using multiple CPUs in
    parallel
  • Can always be done by hand instead
  • but more convenient/less bookkeeping/less error
    prone
  • Likely the experiments will want something along
    similar lines
  • XROOTD plays an important role in PROOF
  • This will continue; the development teams
    cooperate closely
  • Still a lot to do to have a good integration of
    XROOTD with other services like CASTOR
  • Dedicated PROOF farm exists
  • 32 dual processor nodes, 800 MHz.
  • Sufficient for code development
  • Need more powerful farm in future to allow
    meaningful user tests

17
Comments
  • Relatively large FTE effort
  • Biggest workpackage within ROOT (which is biggest
    project within AA)
  • Intended use by experiments not yet clear
  • Only ALICE is firmly committed and putting in
    effort directly; only their demonstration used
    PROOF
  • Some people involved have CMS connections
  • Other experiments said to be interested
  • This large effort without definite commitment is
    not ideal
  • Something like PROOF will be useful (or even
    necessary?)
  • Given the track record of the people involved,
    PROOF is likely to be very useful, but this
    should be for the experiments to decide
  • Experiments should make a decision and not waste
    other effort on alternatives if they choose PROOF
  • To use PROOF (or other product) most effectively,
    should design data structures with this in mind

18
POOL
  • Data persistency, data management, deployment on
    the Grid and (relational) databases in general
  • The POOL project contains a number of packages
  • POOL, CORAL, COOL
  • Previous effort was mainly in POOL itself
    (general data I/O)
  • No major changes are planned for the future
  • Moved to maintenance and support, a very
    positive development
  • Focus now moving to CORAL/ORA and COOL database
    issues
  • CORAL provides database-independence
  • COOL provides the timestamped conditions data
    handling
  • Completed first phase of developments
  • Moving to deployment and experiment support
  • No major issues apparent during review
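CORAL's database independence comes from coding
against a single relational interface and selecting
the backend (Oracle, MySQL, SQLite) via a
connection string. A minimal sketch of that pattern
in Python follows; only the stdlib sqlite3 backend
is wired up here, and the `Session` class and
connection-string scheme are invented for
illustration, not CORAL's actual API:

```python
import sqlite3

class Session:
    """Backend-neutral wrapper: user code sees only this interface."""
    def __init__(self, connect_string):
        scheme, _, rest = connect_string.partition("://")
        # Only the sqlite backend exists in this sketch; CORAL
        # dispatches to Oracle, MySQL or SQLite plugins here.
        if scheme != "sqlite":
            raise ValueError(f"backend not available here: {scheme}")
        self._db = sqlite3.connect(rest)

    def execute(self, sql, params=()):
        return self._db.execute(sql, params).fetchall()

# User code stays identical whichever backend the string names.
s = Session("sqlite://:memory:")
s.execute("CREATE TABLE conditions (tag TEXT, value REAL)")
s.execute("INSERT INTO conditions VALUES (?, ?)", ("hv", 1500.0))
rows = s.execute("SELECT value FROM conditions WHERE tag = ?", ("hv",))
```

Swapping Oracle for SQLite then means changing only
the connection string, which is what lets the
experiments test against lightweight databases and
deploy against Oracle.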

19
POOL overview
[Architecture diagram: user code calls the POOL API
(ROOT I/O, storage manager, collections, file
catalog) and the COOL API; POOL and COOL reach the
relational backends (Oracle, MySQL, SQLite) through
CORAL]
20
POOL status
  • Current priorities concern deployment issues for
    Atlas and LHCb
  • Performance validation and optimization;
    deployment has started for Atlas
  • Definition of realistic experiment workloads
  • Software optimization (identify and solve
    performance bottlenecks)
  • Data extraction and replication tools and tests
  • Basic tools for data extraction and replication
    exist
  • Start the design of a distributed data model and
    API for a COOL database
  • Conditions Database
  • Significant performance improvements
  • Python interface (PyCOOL)
  • Being validated by ATLAS and LHCb
  • Have run the conditions database at the maximum
    expected full load
  • But the conditions data loads of the experiments
    are not yet well defined
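The core abstraction behind the conditions database
is the interval of validity (IOV): each payload is
valid from one timestamp until the next stored one,
and a lookup returns the payload whose interval
contains the query time. A minimal sketch of that
lookup with the stdlib `bisect` module; the folder
class and the calibration payloads are invented for
illustration, while the real PyCOOL interface works
on COOL folders and channels:

```python
import bisect

class ConditionsFolder:
    """Toy conditions folder: payloads keyed by interval of validity."""
    def __init__(self):
        self._since = []     # sorted start-of-validity timestamps
        self._payloads = []

    def store(self, since, payload):
        """Append an IOV; inserts must arrive in increasing order."""
        assert not self._since or since > self._since[-1]
        self._since.append(since)
        self._payloads.append(payload)

    def retrieve(self, t):
        """Return the payload valid at time t (latest since <= t)."""
        i = bisect.bisect_right(self._since, t) - 1
        if i < 0:
            raise LookupError(f"no conditions valid at t={t}")
        return self._payloads[i]

# Hypothetical HV calibration with three validity intervals.
hv = ConditionsFolder()
hv.store(0,   {"voltage": 1400.0})
hv.store(100, {"voltage": 1500.0})
hv.store(250, {"voltage": 1450.0})
```

The binary search makes each lookup logarithmic in
the number of stored intervals, which is why the
load tests above stress insertion and retrieval
rates rather than the lookup algorithm itself.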

21
Simulation overview
[Organisation chart: the Simulation Project
(G. Cosmo) comprises subprojects, each with its own
work packages: Framework (V. Pokorski), Geant4
(J. Apostolakis), Fluka (A. Ferrari), Physics
Validation (A. Ribon), Garfield (R. Veenhof) and
Generator Services (P. Bartalini); it connects to
experiment validation, the Geant4 and Fluka
projects, and MC4LHC]
22
Simulation progress
  • New Garfield package: simulation of gaseous
    detectors
  • Implemented interfaces with Maxwell 2D and Tosca
    packages.
  • Implemented interface with a new version of
    Magboltz (program computing the transport
    coefficients for electrons in gas mixtures) and
    with a new version of Heed (program simulating
    ionization processes in gas mixtures)
  • Simulation framework
  • New converters (in Python/C) to import/export
    detector descriptions in Geant4, Flugg and ROOT
  • Prototype implemented for direct object
    persistency of Geant4 detector description
    classes (geometry, materials)
  • Physics validation: good progress (a concern in
    the previous review)
  • ATLAS TileCal 2002 test-beam study of the
    response of Geant4 and Fluka; more is expected
    later from the combined test beam
  • First results on LHCb background radiation
    studies with Geant4
  • Results on Geant4 longitudinal hadronic showers
    from Atlas and CMS

23
Geant4 and Fluka
  • Geant4 progress
  • New major release this year
  • Validation testing suite created and adapted for
    use in batch and GRID environments
  • LCG note (CERN-LCGAPP-2005-02) of Geant4
    validation in LHC experiments simulation
    production
  • New Collaboration Agreement finalised and sent to
    the Geant4 Collaboration Board (CB) members for
    final endorsement
  • New release of Fluka in July
  • Releases were flagged as an issue in the last
    review
  • This is the first release under the INFN-CERN
    agreement (including the source!)
  • Installed within the LCG area: a real commitment
    to integration
  • Good news after around a decade of issues
  • Flugg has also been updated to allow a common
    geometry source with the new Geant4
  • Both very positive developments

24
Generator services
  • Aim for unified approach to MC truth
  • Not clear if important given falling level of
    effort
  • How much is this driven by experiments?
  • Generators
  • Pythia8 and Herwig both seem low on effort;
    Herwig states it is impossible to meet LHC
    demands without more effort
  • Probably neither essential in terms of
    new/improved physics processes before LHC
    start-up
  • An LHC version of EvtGen has been taken, but
    significant modifications of B and D decays are
    expected from the B factories in future; must
    ensure these upgrades are included
  • Need to choose future Fortran compiler to replace
    g77
  • Fortran still used in underlying routines
    (particularly generators)

25
Main milestones (some achieved)
[Milestones table not reproduced; reviewers' notes
on it: "Needs experiments", "Less than a year from
now?"]
26
Main milestones (cont)
  • Many milestones are effectively qualitative
  • adapt, apply, develop, release, etc.
  • Hard to evaluate if met successfully

27
Conclusions
  • No show stoppers, a lot of progress
  • The major concerns about effort from the last
    review have now stabilised
  • AA people should be congratulated on progress and
    management
  • Concerns are reasonably minor
  • Mainly to optimise the use of effort which is
    limited and falling in future
  • ROOT/SEAL merger is well underway
  • But may be a long time before full benefits of
    effort reduction are seen
  • Recommend tightening schedule and removing
    open-ended commitment
  • PROOF developments
  • Not sufficient definition of future directions
    from experiments
  • Recommend experiments decide and PROOF effort is
    adjusted accordingly
  • Success is defined by adoption and validation of
    the products by the experiments
  • Excellent, but for reviewers, this is only
    apparent after completion