Title: Manchester Christmas Meeting 2006
- OPAL
- ATLAS
- Trigger system commissioning
- Trigger configuration
- Trigger information in all data formats/DB
- Trigger analysis tools
- Top physics analysis
- Large scale testing of HLT farms on the Grid
- Our group effort in trigger and physics analysis
- Jet algorithm studies at the Tevatron/LHC (M. Wobisch)
- SUSY in eγ collisions at the LC (A. Oh, A. De Roeck)
Thorsten Wengler, Manchester, 21 Dec 06
Trigger Commissioning
The ATLAS Trigger/DAQ project
T.W.
We have built the first complete trigger chain in the pit
- and taken data, e.g. here with field on!
- Final trigger electronics
- Using central trigger logic and timing signals
- Also integration with HLT has started
Cosmic data taking 18./19. Nov 2006, toroid field on
Installation is far from finished
- First out of 6 big wheels installed
- Fully equipped with trigger electronics
- Access and testing in the pit are a nightmare challenge!
- First wheel took 6 months to install
- Time for remaining wheels is 6 months
- Calorimeter trigger installation going on
TGC detectors with on-detector trigger electronics in the cavern
Trigger configuration
With Simon
- Working on the main interface to operate the trigger online
- Java GUI
- DB structure
- Data handling offline
- Just had a major review
Assignment of trigger thresholds to input cable lines
Master view of Lvl1 configuration
How to analyse trigger data in ATLAS
- CERN-ATL-COM-DAQ-2006-044
- With J. Haller, A. Hoecker, J. Stelzer
- What trigger data is needed to account for the trigger in analysis (see the sketch after this list)
- Where is it stored
- How is it accessed
- How does it fit into the picture of how we will do analysis in ATLAS
- ATLAS reference paper
- Editor of TDAQ/Lvl1 section
- To be submitted by June 07
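As a purely illustrative sketch (the accessors and the "EM25i" item name below are assumptions, not the actual ATLAS analysis interface), "accounting for the trigger" means reading the stored per-event trigger decision together with the menu and prescales of the corresponding run:

  # Illustrative only: trigger_decision / trigger_configuration are hypothetical
  # stand-ins for whatever the trigger-aware analysis tools expose.
  def select_triggered(events, item="EM25i"):
      """Keep events accepted by a given Lvl1 item and weight them by its
      prescale, so the selection corresponds to the full recorded luminosity."""
      selected = []
      for event in events:
          decision = event.trigger_decision    # per-event trigger result (hypothetical accessor)
          menu = event.trigger_configuration   # menu + prescales for this run (hypothetical accessor)
          if decision.passed(item):
              selected.append((event, menu.prescale(item)))
      return selected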
Next round of ATLAS physics notes (CSC)
- Development of trigger analysis tools
- With Simon, A. Krasznahorkay, D. Berge
- Now used ATLAS wide
- Main editor for note on triggering top events (Spring 07)
- Analysis of top events
- Trigger studies (above; Un-ki, Jenna, Jiri)
- Top cross section
- Include top properties with help from Christian
Manchester GRID Tier 2 cluster as ATLAS HLT test site
With A. Forti, S. Wheeler, H. Garitaonandia
- Up to 900 nodes
- Large system on short notice
- Regular release testing
Full dummy slice, 64 nodes
77-node LVL2 system ran 48 hrs
Full dummy slice, 10 nodes
Our group effort in trigger and physics analysis
not (yet) counting FP420/diffractive here.
- Resources needed (minimum)
- Enough people to make an impact and establish the group as the experts in one or more areas, say a 50% involvement of
- 3 or more academic and research staff
- 2 or more postdocs
- 3-4 or more students
- Money for local computing equipment and travel
(status marks on the slide: o.k. / o.k. / -1(2))
From last year's Christmas meeting
- Trigger
- James, Paul B., Jiri - HLT running/monitoring
- Simon, Thorsten - configuration
- Analysis (including trigger studies)
- Paul B. - QGC
- Jenna, Fred, Simon, Thorsten, Un-ki, (Christian), (Paul M.) - top x-section, properties, top to charged Higgs
- Joe - MC verification
- Jiri - jet physics (QCD?/top?)
- Steve, Paul M. - magnetic field mapping, ID alignment
Backup Slides
We have built the first complete trigger chain in the pit
[Block diagram of the Lvl1 trigger chain: the Calorimeters, RPC Detectors and TGC Detectors feed the Calorimeter Trigger, RPC muon Trigger and TGC muon Trigger; the muon triggers connect through the Mu-CTP Interface to the Central Trigger Processor, which drives the Front-ends via LTP(I)/BUSY and TTC and sends RoIs to the Lvl2 Trigger.]
ATLAS pit, Jura side, 17 Dec 2006
Linear collider studies: SUSY
Assume ẽ and χ̃ are discovered at a √s = 1 TeV LC, but m(ẽ) > ½ √s → can use eγ collisions with polarized photons to extend the range
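A sketch of the kinematic argument (the x_max ≈ 0.83 Compton-backscatter limit and the numbers below are standard assumptions added for illustration, not taken from the study itself):

  $e^+e^- \to \tilde{e}^+\tilde{e}^-$ :  $m_{\tilde e} \le \tfrac{1}{2}\sqrt{s}$
  $e\gamma \to \tilde{e}\,\tilde{\chi}^0_1$ :  $m_{\tilde e} + m_{\tilde\chi^0_1} \le \sqrt{s_{e\gamma}} \simeq \sqrt{x_{\max}}\,\sqrt{s} \approx 0.91\,\sqrt{s}$

So for a light neutralino the eγ mode can probe selectron masses well above √s/2.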
[Plots: p_e vs. cos(θ_e); Sherpa, TESLA Simdet 4 detector simulation; panels labelled "Mo 500 GeV, P1" and "Mo 700 GeV, P1"]
A. De Roeck, A. Oh, T.W.
ECFA @ Durham 04
Will hopefully still lead to a paper.
Barger, Han, Kelly, hep-ph/9709366, etc.
Manchester Tier2 Grid Cluster for ATLAS HLT tests
- Aim: provision of the Manchester Tier2 GRID cluster as a medium to large scale TDAQ testbed facility of up to 900 nodes, available at short notice (one week)
- Use: HLT testing, regular large scale multi-node testing (e.g. nightlies)
- Immediate plans: consolidation and further testing of grid tools, user documentation
- Regular maintenance: track pre-series releases, regular running of a full partition, regular reports to this(?) meeting
- Longer term: debugging of TDAQ configurations and configuration database generation tools, HLT testing (algorithms), etc.
Status
- grid-tdaq-tools have been developed (see the sketch after this list)
- reserve nodes: provide list of reserved nodes (input to other tools)
- start pmg agents
- log collection
- log cleanup
- kill processes
- debug tools
- Image from point1 installed; can run independently (without further modification)
- tdaq-01-04-01
- tdaq-01-06-01
- Will be usable by anyone with a valid GRID certificate (and who is a member of the ATLAS VO)
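The commands below are hypothetical placeholders, not the actual grid-tdaq-tools interface; this is only a minimal sketch of how a reserve-nodes / start-agents / run / collect-logs / clean-up driver of this kind could be wired together.

  #!/usr/bin/env python
  # Minimal sketch of a grid-tdaq-style driver. All command names
  # (reserve-nodes, start-pmg-agents, ...) are hypothetical placeholders,
  # NOT the real grid-tdaq-tools.
  import subprocess

  def run(cmd):
      """Run a command, fail loudly, and return its stdout."""
      return subprocess.run(cmd, check=True, capture_output=True, text=True).stdout

  def reserve_nodes(n):
      """Hypothetical: reserve n nodes and return the host list
      (the 'input to other tools' mentioned above)."""
      return run(["reserve-nodes", str(n)]).split()

  def main():
      nodes = reserve_nodes(64)                      # e.g. a 64-node dummy slice
      try:
          run(["start-pmg-agents"] + nodes)          # start process-manager agents on each host
          run(["run-partition", "--nodes"] + nodes)  # hypothetical: run a full partition
      finally:
          run(["collect-logs"] + nodes)              # gather log files from all hosts
          run(["cleanup-logs"] + nodes)              # remove leftovers
          run(["kill-processes"] + nodes)            # make sure nothing is left running

  if __name__ == "__main__":
      main()

Such a wrapper follows the list above: the reservation output feeds the other tools, and log collection, cleanup and process kill always run, even if the test fails.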
Technical spec on Manchester Tier-2
- 1000 nodes (100 for engineers only)
- Currently split into 2 separate clusters (different s/w areas)
- Gb/s interfaces connected to 1 Gb/s rack switches; each rack switch groups 20 machines and is connected to a top-level 10 Gb/s Cisco switch
- 2.8 GHz dual-CPU (Intel)
- OS SL3.0.4
- 4 GB of memory per node
- 500 GB disk per node
- http://www.gridpp.ac.uk/northgrid/manchester/
- http://www.gridpp.ac.uk/wiki/Manchester_(10_Questions)