Title: Vertex 2001, Brunnen, CH
1. Talk Summary
- Will describe the CDF Silicon Vertex Tracker (SVT)
- SVT is part of the CDF Level 2 trigger
- Design
  - how it works
  - what is special about it
- Construction
  - how much hardware, which choices, installation
- Performance
  - results from real Tevatron data in 2000 and 2001
- Commissioning experience
  - what would we change if we could go back
- Conclusions
2. Impact Parameter
3. The SVT Algorithm (I): Pattern Recognition
- First stage at coarser resolution
- Hardware-implemented pattern recognition at SVX readout speed (similar to lookup-table approaches, almost no extra time)
- 32,000 roads for each 30° slice
- 250 μm superstrips
- >95% coverage for Pt > 2 GeV
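The coarse first stage can be sketched in a few lines: full-resolution hits are binned into 250 μm superstrips, and a "road" is a precomputed combination of one superstrip per layer. This is a behavioral illustration only, not the hardware logic; the superstrip width is from the slide, while the 4-layer road layout and the dictionary lookup are assumptions for illustration.

```python
# Behavioral sketch of the coarse-resolution pattern-recognition stage:
# hits are coarsened to 250-um superstrips and matched against a bank of
# precomputed roads (one superstrip per layer). Widths from the slides;
# the road layout here is an assumption for illustration.

SUPERSTRIP_UM = 250

def to_superstrip(hit_position_um):
    """Coarsen a full-resolution hit position to its superstrip number."""
    return int(hit_position_um // SUPERSTRIP_UM)

def find_roads(hits_by_layer, road_bank):
    """Return the roads whose superstrip was hit on every layer.

    hits_by_layer: {layer: set of hit positions in um}
    road_bank: {road_id: tuple of one superstrip number per layer}
    """
    ss_by_layer = {layer: {to_superstrip(h) for h in hits}
                   for layer, hits in hits_by_layer.items()}
    return [road_id for road_id, road in road_bank.items()
            if all(road[layer] in ss_by_layer.get(layer, set())
                   for layer in range(len(road)))]
```

In the real device this lookup is done in parallel by the Associative Memory, so it finishes as the hits arrive rather than by iterating over the road bank.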
4. The SVT Algorithm (II): Track Fitting
5. The Device (II)
[Block diagram: Detector Data → Hit Finder → Hits; Hits and Matching Patterns (superstrips) meet in the AM Board, driven by the AM Sequencer → Roads; the Hit Buffer collects Roads and their Corresponding Hits → Track Fitter → Tracks and Corresponding Hits → L2 CPU]
6. SVT for Real!
7. SVT Completed
8. CDF Trigger Room
[Photo: the SVT in the CDF trigger room]
9. Once it is built: things we could not simulate
- The hardware must work correctly
  - data integrity challenge
- Fitting tracks in 2-d only must be good enough
  - Z-alignment challenge
10. The "it works" Challenge: how to make sure the trigger takes the right decision
- Very complex system (>100 boards)
- Large data flow (50-cable data path at 600 Mb/s)
- Large data reduction (only 1/1000 events makes it to tape)
- Can't feed real data without real detector and beam
- SVT as an event builder: synchronization!
  - 12 wedges × 3 Hit Finder boards × 4 G-Links: 288 optical links from SVX
  - Another 24 from Layer00
  - 1 stream from XTRP (XFT) fanned into 12
  - L1 trigger information
  - combine at 50 kHz into one output stream
  - One board fails? SVT halts
11. Data Diagnostic Tools
- Simple, uniform, data-driven data paths
  - Point-to-point LVDS with asynchronous clocks and FIFOs
  - Built-in logic state analyzers (Spy Buffers) on each end
  - Allow isolating problems to one board
  - Allow data monitoring asynchronously with respect to the DAQ (SpyMonitor)
- Check data-path synchronization at each data-merging point
  - the 8-bit BunchId tells (on average) whether data are coming from the same event
- Check data-path parity and flag any illegal data
- Error flags in each board/event; freeze Spy Buffers on error (global or local freeze) to look at the error cause; test signal on the front panel to trigger external test equipment on an error condition
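The BunchId cross-check at a merging point can be modeled very simply: every input stream tags its data with an 8-bit bunch counter, and the merger flags an error whenever the streams disagree, i.e. they are no longer looking at the same event. This is a simplified software model (one word per stream); in the real system the comparison is done in board firmware, which then freezes the Spy Buffers and raises the front-panel test signal.

```python
# Simplified model of the synchronization check at a data-merging point:
# compare the 8-bit BunchId carried by each input stream and flag an
# error on any mismatch. The real check runs in firmware.

BUNCH_ID_BITS = 8

def check_merge(stream_bunch_ids):
    """Return (ok, bunch_id): ok is False if the inputs disagree."""
    ids = {bid % (1 << BUNCH_ID_BITS) for bid in stream_bunch_ids}
    if len(ids) != 1:
        # In hardware: set the error flag, freeze the Spy Buffers,
        # pulse the front-panel test point.
        return False, None
    return True, ids.pop()
```

Because the counter is only 8 bits, agreement proves synchronization only "on average": a mismatch is certain evidence of trouble, while agreement is evidence of (very likely) correct alignment.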
12. SVT error test point used by the L1 group (from the CDF Electronic Logbook)
[Photos 1-4: from the SVT crate, through the floor, to the XTRP rack, to find the origin of a (till then) mysterious data synchronization problem!]
13. Z-alignment Challenge
- SVT integrates the d distribution in z
- One average beam position in each 15 cm SVX barrel
  - The average is found and corrected for online
- Z-tilt gives a resolution spread; need < 100 μrad
- DETECTOR: SVX ladders must be parallel to each other
  - Defines an ideal line, the detector axis
- BEAM must be parallel to the same line
  - gross effects corrected (once and for all) by moving SVX
  - fine tuning by the accelerator in real time (even during a store) needs feedback from SVT
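The per-barrel scheme above can be sketched as follows: each 15 cm barrel keeps its own average beam position, and the measured impact parameter is corrected using the average for the barrel the track points to. The d(φ) sign convention and the helper names are assumptions for illustration, not the SVT firmware.

```python
# Sketch of the per-barrel beam correction: one average beam (x, y) per
# 15-cm SVX barrel, subtracted from the measured impact parameter.
# The d = -x0*sin(phi) + y0*cos(phi) convention is an assumption.
import math

BARREL_CM = 15.0

def barrel_index(z_cm, n_barrels=6):
    """Which 15-cm barrel a track at longitudinal position z falls in."""
    return min(int(z_cm // BARREL_CM), n_barrels - 1)

def corrected_d(d_raw, phi, z_cm, beam_xy_per_barrel):
    """Subtract this barrel's beam-displacement contribution to d."""
    x0, y0 = beam_xy_per_barrel[barrel_index(z_cm)]
    return d_raw - (-x0 * math.sin(phi) + y0 * math.cos(phi))
```

Using one average per barrel is what makes the residual z-tilt requirement (< 100 μrad within a barrel) meaningful: the correction removes the per-barrel offset but not a slope inside the barrel.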
14. Commissioning Run: It Works
- October 2000
- Started as an SVT hardware/data-flow test
- SVX barrel 4
  - Only 2 φ slices, only 2 z-barrels (instead of 6)
  - lower-grade detectors, poor alignment
- No COT tracks to SVT (missing XTRP)
- Proved the algorithm with simulation
  - Nominal geometry
  - Wide roads (500 μm vs 250 design)
15. Commissioning Run (III): COT + SVX II Simulated Offline (Correcting for Beam Z-tilt)
- Impact parameter distribution from SVT simulation on SVX raw data and XFT output
- What we promised: from the SVT TDR ('96), using offline silicon hits and offline CTC tracks
[Plot labels: σ = 47 μm, σ = 45 μm]
16. 2001 CDF Run: It Does the Job
- SVX inserted a piece at a time
- COT tracks still suffering from a problem in the track distribution hardware (XTRP)
- No L2 processors yet
- Work focused on SVT integration and configuration software
  - demonstration of performance
- Still using:
  - Wide roads
  - Nominal geometry
  - No SVX pedestal adjustments
  - No noisy/hot/dead channel corrections
  - No correction for the beam offset (Δy ≈ 3 mm, Δx ≈ 1 mm)
17. Our Inputs 1 - SVX
- good S/N
- resolution ≈ 15 μm
- layer (mis)alignment ≈ 20 μm (will be corrected for in SVT)
- 1/3 extra hit/mm/event (SVT superstrip 500 → 250 μm)
- < 2.5 strips/cluster (3 used by SVT)
- had to deal with one-piece-at-a-time DAQ integration and data integrity problems (just takes time)
18. Our Inputs 2 - XFT
- XFT working well since the commissioning run
- XTRP still flaky
  - data integrity problem, not quality
- XFT vs. offline COT reconstruction:
  - Δφ ≈ 4 mrad (spec was < 8 mrad)
  - ΔPt/Pt ≈ 1.6 (spec was < 2)
19. SVT Performance 2001 (I): Online Data, d vs φ, SVX Only
28 Aug 2001 data, χ² < 40, no Pt cut. SVX only!
- The sinusoidal shape is the effect of the beam displacement from the origin of the nominal coordinates
- Can find the beam consistently in all wedges, even using only SVX
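The sinusoidal d vs φ shape is exactly what lets the beam be found: for a displaced beam, d(φ) follows a two-parameter sinusoid, and a linear least-squares fit recovers the beam (x0, y0). The sketch below uses the common convention d(φ) = -x0·sin φ + y0·cos φ (an assumption; the actual SVT sign convention may differ) and solves the 2×2 normal equations directly.

```python
# Fit the beam transverse position from the sinusoidal modulation of the
# impact parameter d versus track azimuth phi, assuming the convention
# d(phi) = -x0*sin(phi) + y0*cos(phi). Closed-form linear least squares.
import math

def fit_beam_position(phis, ds):
    """Return (x0, y0) minimizing sum_i (d_i + x0*sin - y0*cos)^2."""
    sss = sum(math.sin(p) ** 2 for p in phis)
    scc = sum(math.cos(p) ** 2 for p in phis)
    ssc = sum(math.sin(p) * math.cos(p) for p in phis)
    sds = sum(d * math.sin(p) for p, d in zip(phis, ds))
    sdc = sum(d * math.cos(p) for p, d in zip(phis, ds))
    # Normal equations: [[sss, -ssc], [-ssc, scc]] @ (x0, y0) = (-sds, sdc)
    det = sss * scc - ssc * ssc
    x0 = (-sds * scc + sdc * ssc) / det
    y0 = (sdc * sss - sds * ssc) / det
    return x0, y0
```

Because the model is linear in (x0, y0), the fit is cheap enough to rerun continuously, which is what makes the online, per-wedge beam finding described on the next slides practical.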
20. SVT Performance 2001 (II): Online Data, d vs φ, SVX + XFT
SVX + COT!
- SVT real-time monitor from SVT data logged in the event record (rather than sent to L2)
- Beam position computed online, in real time, on the crate controller, and subtracted online in the SVT Track Fitter
  - The original plan was to do it in the L2 processors
[Plot annotation: COT HV problem]
21. SVT Performance (III): Fitting the beam spot in each z barrel
SVX + COT
- From SVT output logged in the event record (limited statistics)
- Still cannot get data from all of SVX
- Can clearly see the beam in each z-barrel, to be used for the z-slope correction
22. SVT Performance (V): Dealing With Beam Z-tilt Online
- From the beam-fitting program running on the VME crate controller, using Spy Buffer data (all L1 triggers at 50 kHz) in real time
- No correction for barrel-to-barrel misalignments
  - Geometry still to be understood
- Can find the beam position online in each barrel
- Will feed it to the accelerator for real-time beam alignment
[Plots: beam center (x) and (y) in the 6 barrels (15 cm each); spread ≈ 50 μm; slope ≈ 200 μm / 30 cm ≈ 600 μrad]
23. SVT Performance (VI): Real Data, Online Impact Parameter Distribution
July 8 data, each wedge fitted separately, χ² < 50, Pt > 2 GeV. SVX + COT.
- No detector alignment correction yet
- No pedestal, noise, or cluster tuning
- Not corrected for the spread due to the beam z-tilt
- The χ² cut can and will be made tighter
- Already almost as good as the simulation on real Run 1 data (the goal is 40 μm)
[Plot labels: σ = 52 μm; trigger cut]
24. Experience Summary
- Very challenging project
  - At the boundary of hardware capabilities when started ('96)
  - Today's chips would allow doing it right
- Debugging/commissioning very long and not finished; slowed by:
  - Manpower on site
  - Other parts of the system still being debugged/commissioned
  - Need to look at data
- Software always takes much more effort than originally thought
- Have not started to look at:
  - low-rate problems
  - Noise effects/reduction
  - Pattern/fit tuning, clustering optimization
25. What We Learnt, the Bad (I): things we should have done differently
- Designed mainly for steady-state operation; more functions and flexibility for board/system testing were added along the road
  - Plan for a long, demanding integration/commissioning
  - No DIP switches, ISP everywhere
- Some obsolete/inadequate hardware, overcrowded PLDs/FPGAs
  - redesign/rebuild
- Did not try hard enough to make all boards more similar
  - generic boards with different firmware?
- Plan for lack of input/output (done, but not enough)
  - Must be able to test SVT in place, with proper timing and data flow, even without SVX/COT/L2/beam
  - Not only hardware tests, but also operating/monitoring software
26. What We Learnt, the Bad (II): things we should have done
- Bring the DAQ control signals (Clock, Accept, Reject, Buffer number) to each board
  - Data-driven is good for data flow; synchronized snapshots are needed for debugging
- Be able to read some internal SVT data into the event
- Allow surviving errors from other systems (bad BunchId)
- Design for expansion
  - Wider data path (connector limited!)
  - More SVX layers, stereo data (?)
- Some electrical details
  - Do not use width expansion in FIFOs
  - Clock-enable instead of asynchronous strobe
  - R/W access by VME to each RAM/FIFO/pipeline register
27. What We Learnt, the Bad (III): things we did not manage to push hard enough
- Should have invested much more in software, much sooner
- Things we knew we had to do, planned to do, tried to do, but failed to do in time:
  - Common low-level board access methods
  - Common framework for board tests
  - Non-expert board test suites
  - Bookkeeping and DB storage of configurations
  - Online monitor
  - Offline software
- Things we overlooked (underestimated):
  - Too many things are known to too few people
28. What We Learnt, the Good (I): things we are glad we did
- Independence from the CDF DAQ (data-driven device)
  - design/build while the DAQ specs are in flux
  - Test boards away from the Trigger Room
- Common data communication protocol (hw/sw)
  - Boards as building bricks that can be combined at will
  - Can adapt the SVT configuration to various test needs, both for us and for others (L2, e.g.)
- Ability to inject/read data from every board
  - Can test most board functions with no additional hardware
- Low-level software captures board objects (RAMs, registers) in a common framework, so the same high-level code runs on several boards
  - svtvme_sendData(board, data)
  - svtvme_readFifo(fifo, data)
29. What We Learnt, the Good (II): things that went well
- Hardware fears that did not materialize
  - BGA (easier than thought, at least in the U.S.)
  - Chip replacement (did very little of it)
  - Better to build more spare boards (cheaper)
- Paranoia about data integrity and data-stream synchronization pays off
- Beam position finding in real time works
- A small group managed to build, operate and properly configure SVT
  - a lot of work is still to be done
  - pattern/fit constants maybe easier than expected
30. Conclusion
- SVT is living up to expectations
  - Track fitting with offline quality in real time is a reality
  - Trigger rates still to be measured (designed on Run 1 simulation)
    - Alignment
    - Narrower roads
    - More material than expected
  - It is an experiment
- SVT arrived in time, but:
  - It would have been better to build it with today's technology
  - There are too many internal differences
  - Would have liked better software yesterday
  - Plan for commissioning and debugging, not only for data taking
31. The Best Advice
Hike the mountains !!
32. SVT Requirements
- Good IP resolution
- ASAP (≈10 μs)
- No dead time
[Diagram: Detector raw data, 7.6 MHz crossing rate (132 ns clock) → Level 1 storage pipeline (42 clock cycles) → Level 1 Trigger → L1 Accept → Level 2 buffer (4 events) → Level 2 Trigger → L2 Accept → DAQ buffers → L3 Farm → Mass Storage (50-100 Hz)]
- Level 1
  - 7.6 MHz synchronous pipeline
  - 5544 ns latency
  - 50 kHz accept rate
- Level 2
  - Asynchronous 2-stage pipeline
  - 20 μs latency
  - 300 Hz accept rate
- Drop stereo info: 2D tracking
  - NEED BEAM PARALLEL TO SVX!!!!
- Parallel processing (12 30° slices)
- Data-driven pipeline
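The quoted numbers are internally consistent and can be cross-checked with two lines of arithmetic: the Level 1 latency is the storage-pipeline depth times the crossing clock, and the crossing rate is the inverse of that clock.

```python
# Cross-check of the trigger numbers quoted on this slide.
crossing_clock_ns = 132            # bunch-crossing clock period
l1_pipeline_depth = 42             # Level 1 storage pipeline, clock cycles

l1_latency_ns = l1_pipeline_depth * crossing_clock_ns
print(l1_latency_ns)               # 5544 ns: the quoted Level 1 latency

crossing_rate_mhz = 1e3 / crossing_clock_ns
print(round(crossing_rate_mhz, 1))  # 7.6 MHz: the quoted crossing rate
```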
33. The Device (I)
The AM chip is the physical realization of the template-matching pattern recognition algorithm: each AM chip can compare each hit with all the patterns in its memory in parallel, providing the high speed necessary for trigger applications. With the AM, pattern recognition is complete as soon as the last hit of an event is read! It is the ONLY CUSTOM PART; everything else is RAM, FIFO, PLD, FPGA.
- Full-custom VLSI chip
- 0.7 μm technology
- 35 mm²
- 180,000 transistors
- 128 patterns, 6 × 12-bit words each
- majority logic
- Working up to 40 MHz
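The matching behavior described above can be sketched as follows: each stored pattern holds one word per layer, every incoming hit updates the match bits of all patterns "at once", and the majority logic promotes a pattern to a road once enough layers have matched. This is a behavioral model, not the chip's circuitry, and the 5-of-6 majority threshold is an assumption for illustration.

```python
# Behavioral sketch of the AM chip's template matching. Each pattern
# stores one superstrip word per layer (6 words per pattern, as on the
# slide); a hit sets the match bit of every pattern whose word on that
# layer equals the hit. Majority logic (5 of 6 assumed here) turns a
# pattern into a road. In silicon the inner loop is fully parallel.

N_LAYERS = 6
MAJORITY = 5  # assumed threshold for illustration

class AMBank:
    def __init__(self, patterns):
        # patterns: list of 6-tuples of superstrip numbers
        self.patterns = patterns
        self.matched = [0] * len(patterns)  # per-pattern layer bitmask

    def feed_hit(self, layer, superstrip):
        """Present one hit to all patterns (parallel in the real chip)."""
        for i, pat in enumerate(self.patterns):
            if pat[layer] == superstrip:
                self.matched[i] |= 1 << layer

    def roads(self):
        """Matched patterns; complete as soon as the last hit is read."""
        return [i for i, m in enumerate(self.matched)
                if bin(m).count("1") >= MAJORITY]
```

The key property the sketch illustrates is that the work per hit is constant regardless of occupancy, which is what makes the recognition finish with the last hit of the event.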
34. An SVT Slice
35. Commissioning Run (I): Results (SVX II Standalone Tracking)
[Plot: impact parameter distribution, with geometrical constraint; σ ≈ 87 μm]
36. Commissioning Run (II): SVX Standalone vs Adding COT Offline in SVT Simulation
SVT on silicon only is good enough to find good tracks: loop over all SVT-COT track pairs and compare the parameters.
[Plots: φ(SVT) - φ(COT); curvature SVT - COT]
37. SVT Performance (IV): the beam tilt is there!
Beam slope vs. run number from offline reconstruction. Unofficial average (my eye fit): ≈600 μrad for x vs z, ≈150 μrad for y vs z.
38. SVT Performance (VII): Understanding the Impact Parameter Resolution
- Best detector performance: ≈45 μm
  - from Commissioning Run data, offline analysis
  - single wedge/barrel
  - corrected for beam tilt and detector alignment
- Detector misalignment: ≈50 μm
  - from Commissioning Run analysis
- Beam tilt: ≈54 μm
  - from current data
- Wedge-to-wedge misalignment: ≈67 μm
  - from current data
- Best width without XFT: ≈67 μm
- Best width in present data (no corrections): ≈52 μm
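If the independent contributions above combine in quadrature (an assumption; the slide does not state the combination rule), the numbers are roughly self-consistent: detector performance combined with detector misalignment gives about the quoted 67 μm best width without XFT.

```python
# Quadrature combination of independent resolution contributions,
# an assumed model for how the widths above add up.
import math

def quadrature(*contributions_um):
    """sqrt of the sum of squares of the individual contributions."""
    return math.sqrt(sum(c * c for c in contributions_um))

# 45 um (detector) with 50 um (misalignment): ~67 um, matching the
# quoted "best width without XFT".
print(round(quadrature(45, 50)))  # 67
```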