1
Triggering for B Physics at Hadron Colliders
  • Erik Gottschalk

2
BTeV Trigger Work Breakdown Structure (WBS 1.8)
3
BTeV Trigger Work Breakdown Structure (WBS 1.8)
Sheldon's Symposium (60th BDay)
12-2-05
4
Overview
  • Introduction and overview of the BTeV trigger
  • History of BTeV trigger R&D
  • R&D highlights
  • Trigger Movie
  • Outlook

5
Introduction to BTeV Trigger (circa 2005)
  • The challenge for the BTeV trigger and data
    acquisition system is to reconstruct particle
    tracks and interaction vertices for EVERY
    interaction that occurs in the BTeV detector, and
    to select interactions with B decays.
  • The trigger performs this task using three levels,
    referred to as Levels 1, 2, and 3: L1 looks at
    every interaction and rejects at least 98% of
    minimum-bias background; L2 uses L1-computed
    results and performs more refined analyses for
    data selection; L3 rejects additional background
    and performs data-quality monitoring. Overall:
    reject >99.9% of background, keep >50% of B events.
  • The data acquisition system (DAQ) saves all of
    the data in memory for as long as necessary to
    analyze each interaction, and moves data to L2/3
    processing units and archival data storage for
    selected interactions.
  • The key ingredients that make it possible to meet
    this challenge:
  • BTeV pixel detector with its exceptional pattern
    recognition capabilities
  • Rapid development in technology: FPGAs,
    processors, networking

Note: see glossary at the end of this talk
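The rejection budget described above can be checked with simple arithmetic. This sketch assumes an illustrative 7.5 MHz interaction rate into L1 (an assumption, not a number from the talk) and uses only the 98% L1 rejection and the 20x L2/3 reduction quoted in this deck:

```python
# Sketch of the BTeV three-level rate budget.
# ASSUMPTION: the 7.5 MHz input interaction rate is illustrative only.
input_rate_hz = 7.5e6

l1_rejection = 0.98      # L1 rejects at least 98% of minimum-bias background (50x)
l23_reduction = 20.0     # combined L2/3 rate reduction from the block diagram (20x)

after_l1_hz = input_rate_hz * (1.0 - l1_rejection)
after_l23_hz = after_l1_hz / l23_reduction

total_rejection = 1.0 - after_l23_hz / input_rate_hz
print(f"after L1:   {after_l1_hz / 1e3:.0f} kHz")       # 150 kHz
print(f"after L2/3: {after_l23_hz / 1e3:.1f} kHz")      # 7.5 kHz
print(f"background rejection: {total_rejection:.1%}")   # 99.9%, matching "reject >99.9%"
```

The 50x and 20x factors multiply to a 1000x overall reduction, which is exactly the ">99.9% background rejection" quoted on the slide.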
6
Block Diagram of BTeV Trigger & DAQ
L1 rate reduction: 50x
L2/3 rate reduction: 20x
7
BTeV Trigger R&D
  • The design of the BTeV trigger and data
    acquisition system was the result of a decade of
    research and development.
  • Three phases of R&D occurred during this time:
  • Simulations to establish trigger requirements and
    develop a baseline design.
  • Prototyping of algorithms and hardware components
    to establish performance metrics and determine
    cost estimates.
  • Optimizing the design to reduce cost of
    construction and maintenance, and to address
    commissioning and operations.

8
History of Trigger R&D (beginning in 1998)
  • 1998: Algorithms, Simulations
  • Baseline UPenn trigger for 3-plane pixel
    detector (Selove)
  • Alternative trigger algorithms (Gottschalk,
    Husby, Procario)
  • 1999: Algorithms, Simulations
  • Digital signal processor (DSP) timing
    optimization studies (Gao)
  • Systolic associative-array track finder (Husby)
  • UPenn trigger for mixed 2-plane/3-plane pixel
    detector (Selove)
  • Carnegie Mellon trigger for 2-plane pixel
    detector (Procario)
  • Megapixel trigger for 2-plane pixel detector
    (Gottschalk)
  • New baseline BB33 trigger for 2-plane pixel
    detector (Gottschalk)
  • 2000: Trigger timing optimization
  • BTeV Trigger Movie (Gottschalk)
  • DSP timing optimization studies for BB33 (Gao)
  • Reduction in size of non-bend view pixel planes
    (Pixel Group)

9
History of Trigger R&D (cont.)
  • 2001: Trigger timing optimization, RTES
  • DSP timing on Texas Instruments TMS320C6711
    (Wang)
  • BB33 algorithm modified for staggered pixel
    half-planes
  • Introduction of 8-fold trigger/data-acquisition
    system highways
  • Real Time Embedded Systems (RTES) Project for
    fault tolerance and fault-adaptive computing
    ($5M NSF grant)
  • 2002: FPGA (Field Programmable Gate Array) and
    DSP hardware
  • BTeV descoped from a double-arm to a single-arm
    spectrometer
  • DSP optimization on Texas Instruments TMS320C6711
    (Wang)
  • L1 timing for commercial-off-the-shelf (COTS)
    processors (Wang)
  • L1 FPGA functions implemented and simulated
    (Zmuda)
  • L1 prototype system with 4 DSPs (Trigger Group)

10
History of Trigger R&D (cont.)
  • 2003: FPGA and DSP hardware, L1 & L2 studies
  • Implementation of BB33 segment tracker in an FPGA
    (Zmuda)
  • Tests of the L1 4-DSP prototype (Trigger Group)
  • FPGA track triplet hash sorter to reduce L1
    processing time (Wu)
  • L1 trigger studies for 396 ns between bunch
    crossings (Penny K.)
  • L2 trigger timing improved by a factor of 120
    (Penny Kasper)
  • 2004: Baseline changes
  • Baseline change to replace L1 DSPs with COTS
    processors
  • Baseline change to replace custom switch with
    Infiniband switch
  • Evaluation of Blade Servers for L1 trigger (Wang)
  • Tiny Triplet Finder proposed to simplify FPGA
    algorithm (Wu)
  • 2005: Baseline change
  • Work on baseline change request to modify L1
    trigger architecture

11
Trigger R&D Highlights
  • 1999
  • New baseline BB33 trigger for 2-plane pixel
    detector
  • 2000
  • BTeV Trigger Movie
  • 2001
  • Introduction of 8-fold trigger/data-acquisition
    system highways
  • 2002
  • L1 prototype system with 4 DSPs
  • 2004
  • Baseline change to replace L1 DSPs with COTS
    processors
  • Baseline change to replace custom switch with
    Infiniband switch

12
Eight-Fold Trigger/DAQ Highway Architecture
13
Slide taken from trigger status talk (April 21,
2001)
Introducing the Eight-Fold Highway
  • The DAQ group liked our 8-fold split in the L1
    pixel trigger, so they decided to look into this
    for the entire data acquisition system and came
    up with some intriguing possibilities.
  • Questions:
  • What are the trade-offs?
  • What is the impact on front-end electronics?
  • What if the L1 trigger group wants to use a
    four-fold split (instead of 8)?
  • Note: Sheldon wins the naming contest by
    suggesting the word "highway".
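The highway concept — parallel slices of the trigger/DAQ that each handle a fixed fraction of the bunch crossings — can be sketched as a round-robin dispatch. The `highway_for` helper below is purely illustrative, not BTeV code:

```python
# Illustrative round-robin dispatch of bunch crossings to parallel highways.
# ASSUMPTION: highway_for() is a hypothetical helper, not from the BTeV design.
N_HIGHWAYS = 8  # baseline eight-fold split; the slide also asks about four-fold

def highway_for(crossing_number: int, n_highways: int = N_HIGHWAYS) -> int:
    """Return the highway index (0..n_highways-1) handling this crossing."""
    return crossing_number % n_highways

# Each highway sees exactly 1/N of the crossings:
counts = [0] * N_HIGHWAYS
for crossing in range(80_000):
    counts[highway_for(crossing)] += 1
assert all(c == 10_000 for c in counts)
```

With this scheme the front-end electronics must route each crossing's data to the right slice, which is one of the trade-off questions raised on the slide.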

14
L1 Trigger 4-DSP Prototype System (2002)
15
Change to COTS L1 Track/Vertex Hardware (2004)
(Block diagram of one highway: other detectors; 56
inputs at 45 MB/s each; L1 buffers; Level 1 switch;
33 outputs at 76 MB/s each; GL1; ITCH; PTSM network;
Track/Vertex Farm; L2/3 switch; L2/3 farm.)
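A quick consistency check on the figures above: the aggregate input and output bandwidths of the Level 1 switch are nearly balanced (back-of-the-envelope arithmetic, not project code):

```python
# Per-highway bandwidth totals for the 2004 COTS L1 track/vertex hardware.
in_mb_s = 56 * 45    # 56 inputs at 45 MB/s each
out_mb_s = 33 * 76   # 33 outputs at 76 MB/s each

print(f"aggregate in:  {in_mb_s} MB/s")   # 2520 MB/s
print(f"aggregate out: {out_mb_s} MB/s")  # 2508 MB/s
```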
16
LHCb Trigger System
  • Frederic Teubert
  • CERN, PH Department
  • on behalf of the LHCb collaboration

17
LHCb Trigger Overview
  • 40 MHz crossing rate
  • 30 MHz with bunches from both directions
  • Luminosity 2×10³² cm⁻² s⁻¹
  • 10 to 50 times lower than at ATLAS, CMS
  • LHC rates
  • (for visible events: at least 2 tracks in
    acceptance)
  • Total rate (minimum bias): 10 MHz
  • bb: 100 kHz
  • Whole decay of one B in acceptance: 15 kHz
  • cc: 600 kHz
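The quoted rates can be restated as fractions of the 10 MHz visible minimum-bias rate, using only the numbers on this slide:

```python
# LHCb rates from the slide, in Hz.
min_bias_hz = 10e6         # total visible minimum-bias rate
bb_hz = 100e3              # bb production
cc_hz = 600e3              # cc production
b_in_acceptance_hz = 15e3  # whole decay of one B inside the acceptance

print(f"bb / min bias: {bb_hz / min_bias_hz:.1%}")                # 1.0%
print(f"cc / min bias: {cc_hz / min_bias_hz:.1%}")                # 6.0%
print(f"B in acceptance / bb: {b_in_acceptance_hz / bb_hz:.0%}")  # 15%
```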

(Diagram: pileup system; VELO; trigger tracker;
calorimeters; muon system.)
18
LHCb Level-1 Performance
19
  • Happy Birthday Sheldon!

20
  • Additional Slides

21
Trigger/DAQ Glossary
DCB: Data Combiner Board
DDR: Double Data Rate
FCC: Feynman Computing Center
FPGA: Field Programmable Gate Array
GBE: Gigabit Ethernet
GL1: Global Level 1
Infiniband: Third-generation high-speed networking standard
ITCH: Information Transfer Control Hardware
L1B: Level 1 Buffer
L2/3 Package 1: Components needed to complete software for the L2 trigger
L2/3 Package 2: Components needed to complete software for the L3 trigger
PCI: Peripheral Component Interconnect
PCI-Express: High-speed serial version of PCI
PCR: Project Change Request
PPST: Pixel Preprocessor and Segment Tracker
PTSM: Pixel Trigger Supervisor and Monitor
RCS: Run Control System
RTES: Real-Time Embedded Systems
Xserve G5: Apple's PowerPC-based 1U server with dual 64-bit processors
22
Change to upstream event builder architecture
described in BTeV-doc-3342
23
Technical Progress Since CD-1 Review
  • Modified baseline architecture for the L1 trigger
    by replacing two of three custom-designed trigger
    subsystems with commodity hardware. The revised
    WBS has 8 GHz PowerPC processors (consistent
    with the IBM roadmap) and Infiniband switches.
  • Performed L1 network simulations
  • Reviewed results in the trigger group
  • Presented results to BTeV Tech. Board
  • PCR approved August 2004
  • Purchased and installed 16 Apple G5 (dual 2 GHz)
    nodes and an Infiniband switch at Feynman
    Computing Center
  • Started evaluation of real-time operating systems
    for the L1 trigger
  • Acquired and installed a 100-node pre-pilot farm
    for the L2/3 trigger (located next to the L1
    hardware).

24
L1 Trigger 4-DSP Prototype System (2002)