Title: What is happening in ALICE
1. What is happening in ALICE?
- PPR done, but PMD part not satisfactory
- First physics week at Erice (Italy) in the first week of this month
- Physics Board gearing up for Day-1 pp physics
- LHC collisions still roughly on schedule, may slip by a couple of months at worst
- Magnetic field measurements completed, data analysis in progress
- TPC in good shape, trigger detectors have to work hard
- New naming convention for areas (see next slide)
- New naming convention for patch panels
- New convention for magnetic field polarity
2. Areas in UX
- Comply with the LHC convention
- Change in naming of rack areas:
- A: anticlockwise, area under the PX, rack area C
- C: clockwise, muon arm area, rack area D
- I: inside the LHC ring, rack area A
- O: outside the LHC ring, rack area B
3. Areas in UX
4. Magnet polarity
- Follow the LHC convention
- Positive muon dipole field deflects beam 1 upwards
- That is, the B-field points from the LHC centre outwards
- This is (of course) opposite to the convention we used during the field mapping
- L3 field points from C to A (anti-clockwise)
5. 2) Final Positions for Forward Detectors

  Detector   PPR     Simu    TDR     Final
  T0         -3500   -3500   -3600   -3746
  PMD        -3615   -3615   xxxxx   -3615
  V0         -3550   -3400   -3400   -3275
  FMD1       -3400   -3200   -3200   -3200
  (z positions in mm)

- T0: the final position is defined by layout requirements. O.K. with the T0 group as long as no additional material is in the direct line of sight to the IP, which is the case.
- PMD: unchanged.
- V0: moved forward to reject beam-gas events. The position is O.K. for the V0 group.
- FMD1: had to move forward with V0. O.K. for the FMD group.
6. 2) Final Positions for Forward Detectors
(Drawing of distances to the IP for the forward detectors, with V0 and PMD labelled; quoted distances: 3250 mm, 3300 mm, 3545 mm, 81.5 mm, 3685 mm, 3766.5 mm.)
7. Preparing ALICE for Physics
- Calibration procedure
- Positioning / survey / alignment
- Data checking
- Display / visualization
- Corrections
- Analysis tools
- Statistics needed
- No-field data, B-field
- Luminosity
- Task force
8. Technical tools
- Many technical tools have to be prepared in advance:
- Detector calibration procedure
- Detector alignment (positioning / survey / alignment)
- Magnetic field (measurement / map / parameterization)
- Reconstruction (TPC, ITS, different combinations of detectors)
- Event display
- Event analysis (ESD information, special cases)
- Definition of vertex: primary, secondary
- Definition of track: primary, secondary
- Strategy for corrections
- Efficiency corrections
- Acceptance corrections
- Trigger and trigger corrections
- Luminosity control and measurement
- Normalization for pp (inelastic vs. non-diffractive)
9. Positioning
- Before the beams come:
- Positioning: how precisely do we have to put the pieces together?
- Simple answer: as precisely as possible
- Absolute precision: given by the IP precision and the beam axis
- The IP varies fill by fill by a few mm, so this sets the requirement
- Beam axis angle < 0.1 mrad: practically impossible
- So use the physics requirement in this case: pt and pseudo-rapidity precision (we aim for 0.5 mrad)
- Relative precision: given by the detector precision
- The scale is set by the space-point reconstruction precision
- TPC: 0.3-1 mm
- ITS: 10-30 µm (more relaxed in z)
- Aim of the positioning precision: not to create problems during reconstruction
- Example: the axes of ITS and TPC aligned to better than 0.5 mrad (see the sketch below)
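A back-of-the-envelope sketch of why 0.5 mrad is a sensible scale; the 2.5 m lever arm (roughly the TPC size) is my assumption, not a number from the slide:

```python
# Back-of-the-envelope check of the 0.5 mrad relative-angle target.
angle_rad   = 0.5e-3          # aimed ITS-TPC relative-axis precision (from the slide)
lever_arm_m = 2.5             # assumed lever arm, order of the TPC size (assumption)
offset_mm   = angle_rad * lever_arm_m * 1e3

print(f"0.5 mrad over {lever_arm_m} m -> {offset_mm:.2f} mm")
# ~1.25 mm, i.e. of the order of the TPC space-point precision (0.3-1 mm),
# so a significantly larger relative tilt would start to distort the reconstruction.
```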
10. Survey
- Survey precision is the ultimate absolute precision we will get
- Example: suppose we made a large error and the TPC is tilted by 100 mrad (one side 50 cm higher than the other)
- Using just TPC tracking there is no way to see it
- We will simply align everything in the rotated system
- The same happens with a shift of the coordinate system: for the absolute position of the system we have to assume something
- This will give us the ultimate precision of the momentum components, angles, etc.
- We can improve it later a bit using physics (symmetries, etc.)
- That needs a perfect understanding of detector efficiencies, etc.
- Survey relative measurement: not worse than an order of magnitude of the precision of the better detector
- Otherwise it will be difficult to start the tracking alignment
- Example: ITS with respect to the TPC: ~100 µm
11. Calibration
- Before starting the final alignment we need to calibrate the detectors; this is detector dependent and a detector responsibility
- There are two types of detectors:
- "solid" type: the precision is given by the technology (silicon pixels, silicon strips, ...)
- "jelly" type: basically drift detectors; it is hard to get them calibrated, and they change with temperature, pressure, etc. (TPC, silicon drift, ...)
- It is always better to put the second type in between the solid type
- Depending on the type there is different reliability for alignment
- Example: if you see some movement of the TPC you can always blame it on something else (at the ~100 µm level)
- For this type of detector it is difficult to align them better than the survey precision
- TPC: maybe still OK (but ???)
- Silicon drift detectors will be very difficult
12. Calibration
- Different calibrations:
- during running
- after reconstruction
- Needs for special calibration runs
- Estimate the statistics needed
- We may get surprises
- A simple estimate for the slewing calibration (time-amplitude dependence) of the TOF results in the first year of pp data (10^8 events); a toy sketch of such a correction follows below
- Without it we have a precision of 300-400 ps, not too useful
- We have to explore other ways
- We have to define the calibration procedure for each detector, so as not to find surprises later
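A minimal toy sketch of what a slewing (time vs. amplitude) correction can look like; the 1/sqrt(Q) parameterization and all numbers below are illustrative assumptions, not the actual ALICE TOF procedure:

```python
# Toy slewing calibration: assume t_meas = t_true + a/sqrt(Q) + Gaussian noise,
# fit the amplitude dependence and subtract it.
import numpy as np

rng = np.random.default_rng(0)
q = rng.uniform(5.0, 100.0, 10_000)                    # toy signal amplitudes (a.u.)
t_meas = 50.0 / np.sqrt(q) + rng.normal(0.0, 0.1, q.size)   # toy times in ns

# Linear least-squares fit of t_meas = p0 + p1 * (1/sqrt(Q))
x = 1.0 / np.sqrt(q)
p1, p0 = np.polyfit(x, t_meas, 1)

t_corr = t_meas - p1 / np.sqrt(q)                      # apply the slewing correction
print(f"spread before: {t_meas.std()*1e3:.0f} ps, after: {t_corr.std()*1e3:.0f} ps")
```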
13. Alignment
- With tracks: better start straight (no field)
- Problem: we don't know the momentum
- We know that the tracks are mostly low momentum
- Significant multiple scattering (in comparison with the detector precision, at least for the ITS)
- This we can use as an estimate of the momentum (as was done in emulsion experiments): residuals from a straight line are inversely proportional to momentum
- Above 1 GeV we are dominated by the detector precision: a few % of the tracks, about one track per 10 pp events
- To get alignment to ~10% of the detector precision we need ~100 tracks per solid piece of the detector
- ITS is most demanding
- We have of the order of 10^3 pieces, so we need ~10^5 high-momentum tracks, i.e. up to ~10^6 events (the arithmetic is spelled out below)
- To do it well one needs one good (10 h) fill at 100 Hz running!
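The statistics estimate above, spelled out with the slide's own numbers:

```python
# Alignment statistics estimate, using the numbers quoted on this slide.
n_pieces         = 1_000     # ~10^3 solid pieces to align (ITS is most demanding)
tracks_per_piece = 100       # ~100 tracks per piece for ~10% of the detector precision
events_per_track = 10        # ~1 usable high-momentum (>1 GeV) track per 10 pp events

tracks_needed = n_pieces * tracks_per_piece        # ~1e5 high-momentum tracks
events_needed = tracks_needed * events_per_track   # ~1e6 pp events

events_per_fill = 10 * 3600 * 100                  # one good 10 h fill at 100 Hz readout
print(tracks_needed, events_needed, events_per_fill)   # 100000 1000000 3600000
```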
14. Alignment
- Tools for no-magnetic-field tracking
- The Kalman filter (as it is) is not well suited for this task
- It will treat the misalignment as noise, i.e. it will blame the shifts on scattering of the track
- It would be better to use the knowledge that the curvature (1/R) is zero as a constraint; this means changing the dimensionality of the state vector (from 5 to 4)
- Two possibilities:
- modify the Kalman filter accordingly
- develop a simple straight-line (global) tracking (sketched below)
- (I personally prefer the second solution)
- Tools for shifting and rotating the pieces in the geometry
- Such tools exist in the new geometry
- We have to move the reconstruction to the geometry modeler!
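A minimal sketch of the second option, a global straight-line fit to space points for field-off data; this is a generic least-squares fit, not the actual AliRoot code, and the toy numbers are illustrative:

```python
# Global straight-line tracking sketch: fit x(z) and y(z) by linear least squares
# and return the residuals, which are the input to the alignment procedure.
import numpy as np

def fit_straight_line(points):
    """points: (N, 3) array of space points (x, y, z). Returns line parameters and residuals."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    A = np.column_stack([z, np.ones_like(z)])          # model: x = ax*z + bx, y = ay*z + by
    (ax, bx), *_ = np.linalg.lstsq(A, x, rcond=None)
    (ay, by), *_ = np.linalg.lstsq(A, y, rcond=None)
    res_x = x - (ax * z + bx)
    res_y = y - (ay * z + by)
    return (ax, bx, ay, by), np.column_stack([res_x, res_y])

# Toy usage: a straight track with 200 um Gaussian smearing of the points (cm units).
rng = np.random.default_rng(1)
z = np.linspace(0.0, 250.0, 10)
pts = np.column_stack([0.01 * z + rng.normal(0, 0.02, z.size),
                       -0.02 * z + rng.normal(0, 0.02, z.size),
                       z])
params, residuals = fit_straight_line(pts)
print(residuals.std(axis=0))                           # ~0.02 cm, as simulated
```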
15. Alignment
- The absolute precision after alignment will still be given by the survey precision; we will only improve the relative precision
- Then switch on the field
- Problem: things may (and therefore will) move
- In the barrel the detector will probably move as a whole, which changes basically the absolute precision
- Therefore we have to require a survey also with the magnetic field on
- The alignment has to be checked
- For that probably 1/10 of the tracks is enough (i.e. ~10^5 events); now we know the momentum, so we can account for multiple scattering exactly
- To get a correct alignment at any field setting, the minimum sample size is ~10^5 pp events
16. B-field polarity
- Switching the B-field polarity is needed because the initial system is CP asymmetric
- We cannot use the symmetry for alignment
- And the ALICE detector is by construction asymmetric (outer pixel layer)
- One of the observables we want to study is an asymmetry
- Any experimental cut on momentum (especially high momentum) will introduce an uncontrolled systematic difference between particles and anti-particles if it is not checked with the other B-field polarity
- We will always need a substantial part of the statistics with the other polarity
- To check the systematics one needs comparable statistical errors
- We need at least 10% of the data with the other polarity, better more; best to split it equally
17. Positioning, Survey, Alignment, Calibration
- Three steps: positioning, survey, alignment
- Absolute and relative precision: define the requirements accordingly
- We need to define the alignment and calibration procedure
- The software for no-field data:
- tracking
- momentum cut on no-field data (iterative procedure, correlation of residuals)
- shift and rotate all detector pieces to the optimal position
- Define the histograms to control the alignment after ramping the magnetic field
- And maybe additional alignment
- Online control of the alignment (online event reconstruction)
18. Magnetic field
- Measurement done; transform it into a suitable format
- Clean up the data
- Test for smoothness, symmetries, Maxwell equations, scaling with current
- Calculate the field inside the volume from the values measured on its surface
- Prepare the map; define the grid
- from the precision requirement dB/B ~ 10^-3
- from the precision of the field gradient itself
- Interpolation procedure
- Detailed calculations with TOSCA
- Different polarities
- Different current settings for the solenoid (0.2 T and 0.5 T)
- Different configurations (solenoid, dipole)
- Test 1: measurement vs. calculation from the Maxwell equations (a minimal consistency check is sketched below)
- Test 2: measurement vs. calculation from TOSCA
- Test 3: reconstruction of cosmic tracks
- A one-year job
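A minimal sketch of a Test-1-style consistency check, evaluating div B numerically on a gridded field map; the grid, spacing and field values below are placeholders, not the real map:

```python
# Maxwell-equation consistency check on a gridded field map:
# div(B) should vanish, so its numerical value flags measurement or interpolation problems.
import numpy as np

def div_b(bx, by, bz, dx, dy, dz):
    """bx, by, bz: 3-D arrays of field components on a regular grid with spacings dx, dy, dz."""
    dbx_dx = np.gradient(bx, dx, axis=0)
    dby_dy = np.gradient(by, dy, axis=1)
    dbz_dz = np.gradient(bz, dz, axis=2)
    return dbx_dx + dby_dy + dbz_dz

# Toy usage with a uniform 0.5 T solenoid-like field on a coarse placeholder grid.
shape = (20, 20, 40)
bx = np.zeros(shape); by = np.zeros(shape); bz = np.full(shape, 0.5)
d = div_b(bx, by, bz, dx=0.1, dy=0.1, dz=0.1)          # grid spacing in m
print(f"max |div B| = {np.abs(d).max():.2e} T/m   (should be ~0 up to noise)")
```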
19. Data checking
- We have to define the online data-checking procedure
- Starting from the detector, based on the data format
- Correlations between detectors
- Quality histograms
- Online calibration
- Online reconstruction
- Event display
20. Reconstruction
- We have a very nice and well-performing reconstruction
- BUT:
- Do we know what will be in the pit at time zero?
- Do we know what will give reasonable data at time zero?
- NO WAY
- The reconstruction has to be prepared and tuned for any combination of active and non-active detectors
- For the PMD, this means the reconstruction has to be prepared for any combination of active/inactive chains
21. Event analysis - ESD
- We are just starting
- There exists some definition of the Event Summary Data, the result of the reconstruction program
- However, there is a tendency to keep adding new data
- Problem: we cannot afford to overflow the size (e.g. to write raw data there)
- On the other hand we find missing information, e.g. recently for the multiplicity analysis
- We need massive involvement of physicists now to find what is missing
- Need to handle special cases:
- special runs
- subsets of events
- calibration data
22. Definition of vertices and tracks
- This may depend on the physics observables under study
- Nevertheless we need a default definition of:
- primary vertex
- secondary vertices
- primary tracks
- secondary tracks
- background tracks
- vertex-track association
- The work on this is under way
23. Strategy for corrections
- Many choices to be made
- Two ways to correct on the track level:
- event-by-event weight calculations (used for low-statistics observables)
- pre-calculated correction map (see the sketch below)
- in which variables?
- how to interpolate?
- handling of migration (entries coming in, coming out)
- Spectra unfolding (multiplicity distribution, pt spectra) so that results do not depend on the assumed distribution
- Different methods: see the talks of J. Conrad and C. Jorgensen
- Do we correct separately for efficiency and acceptance, or do we put everything in one pot?
- I shall argue (at least at the beginning) for separate corrections
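A minimal sketch of a pre-calculated correction map with acceptance and efficiency kept as separate factors, as argued above; the (pt, eta) binning and names are illustrative assumptions:

```python
# Pre-calculated correction map in (pt, eta), with acceptance and efficiency
# as separate per-bin factors derived from Monte Carlo.
import numpy as np

pt_edges  = np.linspace(0.0, 5.0, 26)      # assumed binning, for illustration only
eta_edges = np.linspace(-0.9, 0.9, 19)

def correction_map(gen, in_acc, rec):
    """gen, in_acc, rec: (pt, eta) pairs of generated, in-acceptance and reconstructed MC tracks."""
    h_gen, _, _ = np.histogram2d(gen[:, 0],    gen[:, 1],    bins=[pt_edges, eta_edges])
    h_acc, _, _ = np.histogram2d(in_acc[:, 0], in_acc[:, 1], bins=[pt_edges, eta_edges])
    h_rec, _, _ = np.histogram2d(rec[:, 0],    rec[:, 1],    bins=[pt_edges, eta_edges])
    with np.errstate(divide="ignore", invalid="ignore"):
        acceptance = np.where(h_gen > 0, h_acc / h_gen, 0.0)
        efficiency = np.where(h_acc > 0, h_rec / h_acc, 0.0)
        weight     = np.where(h_rec > 0, h_gen / h_rec, 0.0)   # combined per-bin correction
    return acceptance, efficiency, weight

# Measured per-bin yields are then multiplied by `weight` (or by 1/acceptance * 1/efficiency).
```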
24. Trigger and trigger corrections
- The (first) basic trigger for minimum-bias pp collisions is under study (J. Conrad, G. Contreras and C. Jorgensen)
- We need to study fall-back solutions and redundant triggers
- What shall we do IF...?
- For each trigger we need to know:
- the efficiency of the trigger detectors
- the efficiency of the trigger for a given class of events
- the purity of the trigger with respect to the event class (a sketch of the efficiency/purity definitions follows below)
- How does this depend on the events assumed in the Monte Carlo?
- What special runs do we need to estimate these efficiencies experimentally?
- Which statistics do we need for the special runs and for the final trigger run?
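A minimal sketch of the trigger efficiency and purity definitions with respect to an event class, as they could be computed from Monte Carlo truth; the names are illustrative:

```python
# Trigger efficiency and purity with respect to an event class, from MC truth flags.
import numpy as np

def trigger_performance(is_triggered, is_in_class):
    """is_triggered, is_in_class: boolean arrays, one entry per simulated event."""
    n_trig       = np.count_nonzero(is_triggered)
    n_class      = np.count_nonzero(is_in_class)
    n_trig_class = np.count_nonzero(is_triggered & is_in_class)
    efficiency = n_trig_class / n_class if n_class else 0.0   # P(triggered | in class)
    purity     = n_trig_class / n_trig  if n_trig  else 0.0   # P(in class | triggered)
    return efficiency, purity
```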
25. Luminosity control and measurement
- A few methods are under study
- Relative luminosity control:
- calibration with a selected trigger; may depend on the background conditions in the accelerator
- current control with dedicated equipment
- Absolute luminosity measurement:
- direct measurement of the transverse beam size
- van der Meer (VDM) scan (might be difficult in two dimensions; the standard relation is sketched below)
- using a precise vertex detector, possibly injecting gas?
- This needs a dedicated study
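For reference, a sketch of the standard van der Meer scan relation for the absolute luminosity; the beam parameters below are placeholder numbers, not settings from this talk:

```python
# Standard van der Meer scan relation: L = n_b * f_rev * N1 * N2 / (2*pi*Sigma_x*Sigma_y),
# where Sigma_x, Sigma_y are the effective overlap widths extracted from the scan.
import math

def luminosity_vdm(n1, n2, n_bunches, f_rev, sigma_x, sigma_y):
    """Bunch intensities n1, n2; widths sigma_x, sigma_y in cm; returns L in cm^-2 s^-1."""
    return n_bunches * f_rev * n1 * n2 / (2.0 * math.pi * sigma_x * sigma_y)

# Placeholder early-running-like numbers, for illustration only (f_rev = LHC revolution frequency).
L = luminosity_vdm(n1=1e10, n2=1e10, n_bunches=43, f_rev=11245.0,
                   sigma_x=50e-4, sigma_y=50e-4)       # 50 um overlap widths, in cm
print(f"L ~ {L:.2e} cm^-2 s^-1")
```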
26. Normalization
- The correct normalization to a definite event class (inelastic events, non-diffractive events, etc.) has to be defined
- Already the first measurement of dNch/dη is not easy
- One may think we just count the tracks, divide by the number of events, and it's done
- Even if we know what a track is, we somehow have to correct the number of events to some well-defined class (for those events we don't trigger on and those we trigger on incorrectly); a sketch of this correction follows below
- This will probably be the largest source of systematic error for the first measurements: the number of inelastic events and the number of non-diffractive events may differ by ~20%
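A minimal sketch of the event-count correction to a defined class, using the trigger efficiency and purity for that class (which would come from Monte Carlo, see the previous slides); the numbers are illustrative only:

```python
# Normalization sketch: correct the triggered event count for events of the class
# that did not fire the trigger (efficiency) and triggered events outside the class (purity).
def n_events_in_class(n_triggered, efficiency, purity):
    """N_class = N_triggered * purity / efficiency, with efficiency and purity for that class."""
    return n_triggered * purity / efficiency

# Illustrative values only; the point is that the class definition (inelastic vs.
# non-diffractive) can change the normalization at the ~20% level.
n_trig = 1_000_000
print(n_events_in_class(n_trig, efficiency=0.95, purity=0.90))   # ~9.5e5
```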
27. Commissioning Task Force
- In order to be ready on time we need a task force (limited duration in time)
- Can we select a dedicated task force which is DEVOTED to ALICE physics and has NO greed for anything else?
28. First paper
- "It only takes a handful of events to measure a few important global event properties (dN/dη, dσ/dpT, etc.). In fact, if the LHC works, it will be easy to collect a few tens of thousands of events, sufficient for a first paper."
  Claus Jorgensen
29. Note on first physics (J.-P. Revol)
- While global event properties are the first priority (ATLAS, CMS and LHCb will also do them first), all detectors ready at the start of the LHC will of course make the best use of the first few collisions ever observed at the LHC (~20k events)
- First questions to all of us:
- What early physics can we do with the first ~20k events? Clearly this has to be thought about ahead of time!
- In the first month of running (November 2007?), the average collision rate will be of the order of 70 kHz; we can expect at least 70 million events recorded in the central region, assuming 20 shifts of 10 hours each and a 100 Hz readout rate in the central region (depending on the maximum TPC grid pulsing rate that can be achieved); the arithmetic is spelled out below
- What physics can we do with the first large-statistics pp sample?
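The event-count arithmetic behind the 70-million figure, spelled out from the numbers on this slide:

```python
# Expected recorded events in the first month, with the assumptions quoted above.
shifts          = 20
hours_per_shift = 10
readout_hz      = 100        # assumed central-barrel readout rate
events = shifts * hours_per_shift * 3600 * readout_hz
print(f"{events:.1e} recorded events")   # 7.2e7, i.e. "at least 70 million"
```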