Overall Detector Performance Working Group
1
Overall Detector Performance Working Group
  • Convenors: Pascal Gay, Markus Schumacher, Mark Thomson

Prague, 17.11.2002
  • Charge of the WG
  • Strategy
  • How to do it?
  • Who does what?
  • Open Questions
  • Towards Amsterdam

2
Charge of the Working Group
  • Evaluation of the detector performance
    considering the whole detector
  • Comparison of different sub-detector
    design/technology options in the context of the
    detector as a single entity
  • Provide a forum for discussion on all issues
    related to the overall detector performance, with
    participation from detector R&D, simulation and
    physics groups (including our colleagues from
    the North American and Asian LC workshops)

3
Strategy
  • Obtain key performance figures
  • Perform reconstruction of physics events using
    information from all subdetectors with full
    simulation
  • Study influence of machine backgrounds
    (occupancy), overlapping events, time structure
    of the accelerator, (mis)alignment and
    calibration issues, and the crossing angle of
    the colliding beams
  • Develop ID/reco. tools and compare them using
    well-defined benchmark processes
  • Extract parametrisations for / transfer
    algorithms to fast simulation packages (e.g. done
    for flavor tagging; see the sketch after this list)
  • Perform comparison of physics results between
    full and fast simulation, between different
    detector technologies, and between reco.
    algorithm options
  • Provide inputs to the discussion of
    cost/performance issues
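
As an illustration of the parametrisation step above: a minimal sketch, assuming a hypothetical bTagEfficiency() binned in |cos θ| of the jet, of how a flavor-tag efficiency extracted from full simulation could be applied as a per-jet probability inside a fast-simulation loop. The binning and all numbers are placeholders, not measured values.

  // Minimal sketch: applying a full-simulation flavor-tag
  // parametrisation inside a fast-simulation event loop.
  #include <array>
  #include <cstdio>
  #include <random>

  // Hypothetical parametrisation of the b-tag efficiency vs. |cos(theta)|
  // of the jet; the values are placeholders, not full-simulation results.
  double bTagEfficiency(double absCosTheta) {
      static const std::array<double, 5> eff = {0.85, 0.84, 0.82, 0.75, 0.50};
      std::size_t bin = static_cast<std::size_t>(absCosTheta * eff.size());
      if (bin >= eff.size()) bin = eff.size() - 1;
      return eff[bin];
  }

  int main() {
      std::mt19937 rng(42);
      std::uniform_real_distribution<double> flat(0.0, 1.0);
      int tagged = 0;
      const int nJets = 10000;
      for (int i = 0; i < nJets; ++i) {
          double absCosTheta = flat(rng);  // stand-in for a generated b jet
          // tag the jet with the parametrised probability
          if (flat(rng) < bTagEfficiency(absCosTheta)) ++tagged;
      }
      std::printf("fast-sim b-tag rate: %.3f\n",
                  static_cast<double>(tagged) / nJets);
      return 0;
  }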

4
How to do it?
  • common simulation framework for a fair comparison:
    agree on language and on one (maybe new)
    package (so far:
    Fortran/C++, BRAHMS/MOKKA, GEANT3/4)
  • provide a common data format for easy use
    in physics studies (one possible shape is
    sketched at the end of this list)
  • define and agree on benchmark processes for
    comparison (ee → WW, ZZ, ttH for jet reco., jet
    separation, E resolution; ee → WW → qqℓν for low
    angle tracking, flavor and charge ID)
  • Identify key analysis tools to be developed
    (reconstruction algorithms, ID tools,
    accelerator alignment issues)

  • Identify key performance numbers to be determined
    (E, p, d0 resolutions, reco and tagging
    eff., fake rates ...); a sketch of how such
    numbers are formed follows this list
  • More detailed and complete list of
    benchmark processes, tools and
    key performance numbers on working group web page
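
The common data format itself was still an open choice at this point; purely as an illustration of what such a shared event record could hold, here is a minimal C++ sketch. All type and field names (Event, TrackerHit, etc.) are hypothetical, not an agreed standard.

  #include <string>
  #include <vector>

  struct TrackerHit { double x, y, z, time; int cellID; };
  struct CaloHit    { double x, y, z, energy; int cellID; };

  struct Track {
      double d0, z0, phi, tanLambda, omega;  // perigee parameters
      std::vector<int> hitIndices;           // indices into Event::trackerHits
  };

  struct Cluster {
      double energy, x, y, z;
      std::vector<int> hitIndices;           // indices into Event::caloHits
  };

  // One event, written once by the simulation and read back by all
  // physics studies, so every group analyses identical information.
  struct Event {
      int runNumber = 0;
      int eventNumber = 0;
      std::string generatorProcess;          // e.g. "ee->WW->qqlnu"
      std::vector<TrackerHit> trackerHits;
      std::vector<CaloHit>    caloHits;
      std::vector<Track>      tracks;        // filled by reconstruction
      std::vector<Cluster>    clusters;      // filled by reconstruction
  };

Keeping the raw hits next to the reconstructed tracks and clusters in one record lets reconstruction developers and physics groups work from identical files.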
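
Likewise, a sketch of how the key performance numbers listed above could be formed from truth-matched counts; the matching criterion is detector-specific and left abstract here, and the counts are placeholders rather than results.

  #include <cstdio>

  struct Counts {
      int truthObjects = 0;  // e.g. true b jets in a benchmark sample
      int matchedReco  = 0;  // reconstructed objects matched to truth
      int recoObjects  = 0;  // all reconstructed objects
  };

  // tagging/reco efficiency: matched over true
  double efficiency(const Counts& c) {
      return c.truthObjects ? double(c.matchedReco) / c.truthObjects : 0.0;
  }

  // fake rate: unmatched reconstructed objects over all reconstructed
  double fakeRate(const Counts& c) {
      return c.recoObjects
                 ? double(c.recoObjects - c.matchedReco) / c.recoObjects
                 : 0.0;
  }

  int main() {
      Counts c{1000, 850, 900};  // placeholder numbers, not results
      std::printf("eff = %.3f, fake rate = %.3f\n",
                  efficiency(c), fakeRate(c));
      return 0;
  }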

5
Who does what?
  • detector groups provide inputs about
    technology options, detector design (granularity,
    material budget, readout time), very basic
    performance figures (point/energy/cluster
    resolutions, dependence on number of cells etc.),
    noise, occupancy ...
  • simulation/software groups implement or ensure
    implementation of the above in simulation,
    provide common simulation framework and data
    format
  • detector groups and simulation/software groups
    develop and implement basic reconstruction
    algorithms (providing tracks, clusters)
  • overall detector performance group and
    simulation/software groups provide/ensure
    existence of analysis tools (e.g. event
    reconstruction, ID packages)
  • overall detector performance group defines
    performance criteria, collects results from the
    various groups, and distributes this information
  • overall detector performance group and all
    (including physics groups) evaluate the
    performance figures / compare options

6
Analysis tools/topics and Performance Criteria
  • Tools and Topics
  • PID reconstruction for e, μ, π⁰, V⁰, h⁰, γ,
    conversions, ...
  • quark flavor and charge tag
  • dE/dx ID algorithms
  • implementation of overlapping
    events, alignment and calibration, several
    bunch crossings, beam crossing angle
  • Performance Criteria
  • define physics processes to be studied as
    benchmarks
  • specify figures to compare, e.g.
    reco./ID/selection efficiencies, rejection
    factors / fake rates, resolutions (ΔE, ΔM_ij,
    ΔM_miss, ...); a sketch of one such figure
    follows this list
  • compare those figures for 1) different
    event topologies, 2) technology and
    reco. options, 3) background levels, alignment
    and calibration accuracies
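
As a sketch of one such figure: the dijet-mass residual ΔM_ij reduced to a single resolution number per detector/reconstruction option. The input pairs are placeholders; a real study would read them from the common data format.

  #include <cmath>
  #include <cstdio>
  #include <utility>
  #include <vector>

  // RMS of (reconstructed - true) dijet mass: one directly comparable
  // number per detector or reconstruction option.
  double massResolution(const std::vector<std::pair<double, double>>& pairs) {
      double sum = 0.0, sum2 = 0.0;
      for (const auto& [reco, truth] : pairs) {
          const double d = reco - truth;
          sum  += d;
          sum2 += d * d;
      }
      const double n = static_cast<double>(pairs.size());
      const double mean = sum / n;
      return std::sqrt(sum2 / n - mean * mean);
  }

  int main() {
      // Placeholder residuals around the W mass, not simulation output.
      std::vector<std::pair<double, double>> sample = {
          {81.2, 80.4}, {79.8, 80.4}, {80.9, 80.4}, {79.5, 80.4}};
      std::printf("sigma(M_ij) = %.2f GeV\n", massResolution(sample));
      return 0;
  }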

7
Some Open Questions and Topics
  • influence of machine backgrounds and of the
    time structure of the accelerator (vs.
    readout time), e.g. assignment of tracks to
    bunch crossings (see the sketch after this list)
  • issue of (mis)alignment, calibration, dead
    channels
  • hermeticity, forward veto vs. machine
    backgrounds
  • influence of non vanishing crossing angle
  • new and/or more sophisticated analysis tools
    (e.g. quark charge tagging, b (c)
    vs. bbar (cbar); sophisticated
    implementation of dE/dx for particle ID)

(not a complete list; examples reflect personal
preference)
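
For the bunch-crossing question above, a minimal sketch of the simplest possible assignment: snap a measured track time to the nearest crossing. The 337 ns spacing is only an example (a TESLA-like value); the real difficulty is that detector time resolution and readout time smear this picture.

  #include <cmath>
  #include <cstdio>
  #include <initializer_list>

  // Index of the bunch crossing closest to the measured track time.
  int assignBunchCrossing(double trackTimeNs, double bunchSpacingNs) {
      return static_cast<int>(std::lround(trackTimeNs / bunchSpacingNs));
  }

  int main() {
      const double spacing = 337.0;  // ns, example value only
      for (double t : {10.0, 350.0, 680.0, 1100.0}) {
          std::printf("track at t = %6.1f ns -> bunch crossing %d\n",
                      t, assignBunchCrossing(t, spacing));
      }
      return 0;
  }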
8
Towards Amsterdam
  • Today:
    get overview of detector technologies, their
    implementation and status of simulation, and the
    availability and sophistication of ID/analysis
    tools
  • Today / Next Weeks:
    1) find volunteers (?) for all the open topics
    2) decide on and provide a common simulation
    framework and data format
  • Until Amsterdam:
    1) establish common framework and data format
    2) write documentation and provide it to all users
    3) exchange information between working groups
    4) do first studies for evaluation of performance
  • Amsterdam:
    plenty of reports on new and interesting results