LHCb Computing Status Report: DAQ, ECS, Software, Facilities

1
LHCb Computing Status Report: DAQ, ECS, Software, Facilities
  • John Harvey
  • CERN / EP
  • Meeting with LHCC Referees
  • 27 Nov 2000

2
LHCb Trigger/DAQ/ECS Architecture
3
LHCb TFC system
  • Readout and Throttle Switches
    • Programmable for partitioning
    • Design reviewed in October 00
    • Prototypes in February 01
  • Readout Supervisor
    • Functional specification done
    • Design in progress
    • Review scheduled for April 01
    • Prototype scheduled for Oct 01
  • TFC system test
    • Acquire components
    • Test L1 broadcast via channel B
    • Start June 01
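The "programmable for partitioning" idea can be illustrated with a toy mask-based sketch: the switch forwards TFC broadcasts only to the destinations enabled in the current partition, so a sub-detector can be run stand-alone. Destination names and the mask encoding are illustrative assumptions, not the actual TFC design.

```python
# Toy sketch of TFC partitioning (illustrative only): the programmable
# switch forwards timing/trigger broadcasts to the subset of front-end
# destinations enabled in a partition mask.
DESTINATIONS = ["VELO", "RICH1", "TRACKER", "CALO", "MUON"]  # assumed names

def broadcast(partition_mask: int) -> list[str]:
    """Destinations that receive a broadcast under this partition."""
    return [d for i, d in enumerate(DESTINATIONS) if (partition_mask >> i) & 1]

# A partition containing only VELO (bit 0) and TRACKER (bit 2):
print(broadcast(0b00101))
```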

4
LHCb Readout Unit
  • New version of the RU design (v2)
    • Fewer chips, fewer layers, lower cost
  • First prototype in the first week of December 00
  • Programming of the FPGA code implementing the readout protocols is underway
  • Working and tested modules expected by end of Jan 01
  • Integration tests will start in Mar 01

5
LHCb Event Builder
  • Studied Myrinet (buffers needed)
  • Now focusing on Gbit Ethernet
    • Test setup between 2 PCs
    • Achieve >95% of nominal bandwidth for frames >512 bytes (512 bytes -> 230 kHz)
    • Can send frames at frequencies up to 1.4 MHz for 64-byte frames
  • Implement event building in the NIC
    • Frequency of 100 kHz demonstrated
    • Event building at Gbit speeds for >1 kB frames demonstrated
  • Tested 1-on-1 event building over a switch in the CMS test bed
    • Fixed protocol with the RU
    • Presented results at DAQ2000
  • Now studying flow control in the switch and preparing a full-scale test on the CMS test stand (16 on 16)
  • More detailed simulation of a full-scale GbE readout network still to be done
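The quoted frame rates (about 230 kHz for 512-byte frames, up to 1.4 MHz for 64-byte frames) are easy to sanity-check: on the wire, each Ethernet frame also costs an 8-byte preamble and a 12-byte inter-frame gap. A small sketch, assuming a 1 Gbit/s line rate and standard IEEE 802.3 overheads:

```python
# Hedged sketch: theoretical Gigabit Ethernet frame rates, to sanity-check
# the figures quoted above. Assumes a 1 Gbit/s line rate with an 8-byte
# preamble and 12-byte inter-frame gap per frame (IEEE 802.3).
LINE_RATE = 1_000_000_000          # bits/s
PREAMBLE, IFG = 8, 12              # bytes of per-frame overhead on the wire

def frame_rate_hz(frame_bytes: int) -> float:
    """Maximum frames per second for a given on-wire frame size."""
    wire_bits = (frame_bytes + PREAMBLE + IFG) * 8
    return LINE_RATE / wire_bits

for size in (64, 512, 1024):
    print(f"{size:5d} B frames: {frame_rate_hz(size)/1e3:8.1f} kHz")
```

The result for 512-byte frames (~235 kHz theoretical) is consistent with the ~230 kHz measured above, and the 64-byte limit (~1.49 MHz) matches the observed 1.4 MHz.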

(Diagram: CERN Network)
6
ECS interface to Electronic Modules
  • Select a reduced number of solutions
  • Support (HW and SW) the integration of the selected solutions
  • Ethernet to credit-card PC (CC-PC) for use in the counting room
    • Test board being developed
    • Study the interface of the CC-PC to the parallel bus, I2C and JTAG
    • Test functionality (RESET) and measure noise
    • Prototype expected in January 01
    • Results by end of March 01
  • N.B. two other solutions considered for use in high-radiation areas
    • SPAC (long-distance I2C/JTAG)
    • CMS tracker CCU
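The register-level tests above (e.g. exercising RESET over I2C from the CC-PC) can be mimicked with a toy in-memory bus model. The class, device address and register numbers are purely illustrative assumptions, not the actual CC-PC driver interface.

```python
# Toy in-memory stand-in for an I2C master on the CC-PC (illustrative only;
# addresses and register map are assumptions, not the real hardware).
class I2CBus:
    def __init__(self):
        self.devices = {}                      # addr -> {reg: value}

    def write(self, addr: int, reg: int, value: int) -> None:
        # I2C payloads are bytes, so mask to 8 bits.
        self.devices.setdefault(addr, {})[reg] = value & 0xFF

    def read(self, addr: int, reg: int) -> int:
        return self.devices.get(addr, {}).get(reg, 0)

bus = I2CBus()
FE_ADDR, RESET_REG = 0x40, 0x00                # hypothetical front-end device
bus.write(FE_ADDR, RESET_REG, 0x01)            # assert RESET
print(hex(bus.read(FE_ADDR, RESET_REG)))
```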

7
ECS Control System Prototype
  • The aim is to distribute, with the SCADA license, a framework in which users can easily implement sub-systems
  • The first prototype comprises
    • Configuration rules and conventions (naming, colors, etc.)
    • Tools for device integration
    • Hierarchical control / partitioning
      • Based on finite state machines and SCADA
    • Automatic UI generation
      • Based on SCADA
  • Plans are to use the prototype in the LHCb test beam
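The hierarchical FSM control described above can be sketched as a toy tree of control nodes: commands propagate down the tree, states are summarised upwards, and partitioning means commanding a sub-tree independently. Class, state and sub-detector names are illustrative assumptions, not the actual SCADA/SMI++ interface.

```python
# Toy sketch of hierarchical FSM control (illustrative names only).
class ControlNode:
    def __init__(self, name, children=()):
        self.name, self.children = name, list(children)
        self.state = "NOT_READY"

    def command(self, cmd):
        # Commands propagate down the hierarchy; partitioning means a
        # sub-tree can be commanded independently of the rest.
        for child in self.children:
            child.command(cmd)
        if cmd == "configure":
            self.state = "READY"
        elif cmd == "start":
            self.state = "RUNNING"

    def summary(self):
        # A parent's state is derived from its children: the least
        # advanced child state wins.
        if not self.children:
            return self.state
        order = ["NOT_READY", "READY", "RUNNING"]
        return min((c.summary() for c in self.children), key=order.index)

velo = ControlNode("VELO")
rich = ControlNode("RICH")
lhcb = ControlNode("LHCb", [velo, rich])
lhcb.command("configure")
velo.command("start")              # partitioned: only VELO starts running
print(lhcb.summary())
```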

8
Online team members
Leaving in 2000
  • DAQ
    • Beat Jost, project leader (CERN staff)
    • Jean-Pierre Dufey, Readout Network (CERN staff)
    • Marianna Zuin, Readout Network (technical student)
    • Richard Jacobsson, TFC (CERN staff)
    • Niko Neufeld, Readout Network (CERN fellow)
    • EP/ED group, Readout Unit (CERN staff)
    • Zbigniew Guzik, engineer (Warsaw)
  • ECS
    • Clara Gaspar, project leader (CERN staff)
    • Wolfgang Tejessy, JCOP (CERN staff)
    • Richard Beneyton, SCADA in test beam (cooperant)
    • Sascha Schmeling, SCADA in test beam (CERN fellow)

Arriving in 2000
9
Software Framework - GAUDI
  • GAUDI v6 released on Nov 10th
    • Enhanced features, e.g. event tag collections, XML browser, detector geometry and event display
    • 110,000 lines of code, 460 classes
  • Good collaboration with ATLAS
    • New services by ATLAS: auditors, histograms in ROOT, HepMC
    • Moving to an experiment-independent repository
  • The most urgent tasks for the next release are
    • Event model, with sub-detector groups
    • Detector description, with sub-detector groups
    • Conditions database, with CERN/IT
    • Consolidation and enhancements (code documentation with Doxygen)
    • Further contributions expected from ATLAS
    • Scripting language for interactive work
  • HARP, GLAST and OPERA are also users of GAUDI
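GAUDI structures physics code as framework-driven algorithms with an initialize/execute/finalize lifecycle, communicating only through a transient event store. A rough Python analogy of that pattern (the real framework is C++; all class and store-path names here are illustrative, not GAUDI's actual API):

```python
# Rough Python analogy of a GAUDI-style algorithm lifecycle and transient
# event store (illustrative only; the real framework is C++).
class Algorithm:
    def initialize(self): pass
    def execute(self, store): pass
    def finalize(self): pass

class CountTracks(Algorithm):
    def initialize(self):
        self.total = 0

    def execute(self, store):
        # Algorithms exchange data only via the event store, never directly.
        tracks = store.get("Rec/Tracks", [])
        self.total += len(tracks)
        store["Rec/NTracks"] = len(tracks)

    def finalize(self):
        print(f"processed {self.total} tracks")

def run(algorithms, events):
    """Framework loop: initialize once, execute per event, finalize once."""
    for alg in algorithms:
        alg.initialize()
    for store in events:               # one transient store per event
        for alg in algorithms:
            alg.execute(store)
    for alg in algorithms:
        alg.finalize()

run([CountTracks()], [{"Rec/Tracks": [1, 2]}, {"Rec/Tracks": [3]}])
```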

10
Software Applications
  • GAUDI-based event reconstruction (BRUNEL)
    • BRUNEL v1r5 released this month
    • Physics functionality entirely based on wrapped FORTRAN code
    • First public release of the C++ track fit, integrated and tested
    • Pile-up implemented; spill-over being implemented
    • Available for production tests
  • Migration of detector software to C++
    • Progress in all areas: digitisation, geometry description, etc.
    • Tracking: digitisation and pattern recognition almost ready for public release
    • e.g. VELO and CAL: complete event model and geometry description
  • Current activities reflect the TDR schedule
    • VELO, MUON (and Tracking) now on hold until after the TDRs are produced
    • New RICH and CAL software getting higher priority

11
Software Applications GEANT4
  • Developing an interface to G4 for GAUDI applications (GiGa)
    • Isolates G4 code from GAUDI
    • Provides a way to input detector geometry and kinematics to G4
    • Handles passing of commands to G4 and retrieval of events from G4
  • More generally, Geant4 physics is being tested
    • Now by BaBar, ATLAS, ALICE and space applications
    • Some disagreements with data and with G3 seen and being studied
  • Plans in LHCb
    • Calorimeter: simulate shower production in the prototype and compare with existing simulations (G3) and test-beam data
    • RICH: study production and detection of Cherenkov photons in RICH1 using the TDR geometry and compare results
    • Integration of these developments in GAUDI using GiGa
    • Measure performance

12
LHCb CORE Software Team
Left in 2000
  • Pere Mato, project leader (CERN staff)
  • Florence Ranjard, code librarian (CERN staff)
  • Marco Cattaneo, BRUNEL (CERN staff)
  • Agnieszka Jacholowska, SICb (Orsay)
  • Markus Frank, GAUDI (CERN staff)
  • Pavel Binko, GAUDI (CERN staff)
  • Rado Chytracek, GAUDI (doctoral student)
  • Gonzalo Gracia, GEANT4 (CERN fellow)
  • Stefan Probst, GAUDI (technical student)
  • Gloria Corti, GAUDI (CERN fellow)
  • Sebastien Ponce, GAUDI (doctoral student)
  • Ivan Belyaev (0.5), GAUDI/GEANT4 (ITEP)
  • Guy Barrand (0.5), event display (Orsay)

Arriving in 2000
13
Computing Facilities
  • Estimates of computing requirements updated and submitted to the Hoffmann LHC Computing Review
  • NT farms being decommissioned at CERN and at RAL
    • Migrating production tools to Linux now
  • Starting production of 2M B-inclusive events at Liverpool
  • Farm of 15 PCs for LHCb use at Bologna early 2001
    • Long-term planning in INFN ongoing (location, sharing, etc.)
  • Farm of 10 PCs to be set up early 2001 at NIKHEF for LHCb use
    • Developing an overall NIKHEF strategy for LHC computing / Grids
  • Grid computing in LHCb
    • Participation in the EU DataGrid project starts Jan 2001 (3 years)
    • Deploy grid middleware (Globus) and develop a production application
    • Mini-project started between Liverpool, RAL and CERN to test remote production of simulation data and transfer between sites

14
LHCb Computing Infrastructure Team
  • Frank Harris, coordination (Oxford)
  • Eric van Herwijnen, MC production / Grid (CERN staff)
  • Joel Closier, system support / bookkeeping (CERN staff)