Title: ???, IHEP
1. Title
- ???, IHEP
- On behalf of ??????
- DAQ, IHEP
2. sin²2θ₁₃ measurement at reactors
- Well-understood, isotropic source of electron antineutrinos
- Detectors are located underground to shield against cosmic rays
- [Figure: ν̄ₑ survival probability vs. distance L; the unoscillated flux is observed close to the reactor, and the oscillation deficit develops by ~1500 meters]
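The survival-probability curve sketched on this slide follows the standard two-flavor formula P = 1 − sin²2θ₁₃ · sin²(1.267 Δm² L/E). A minimal sketch, using illustrative oscillation parameters that are assumptions here (the talk predates the sin²2θ₁₃ measurement):

```python
import math

def survival_prob(L_m, E_MeV, sin2_2theta13=0.09, dm2_eV2=2.5e-3):
    """Two-flavor nu_e-bar survival probability.
    P = 1 - sin^2(2*theta13) * sin^2(1.267 * dm2[eV^2] * L[m] / E[MeV]).
    Parameter values are illustrative assumptions, not from the talk."""
    return 1.0 - sin2_2theta13 * math.sin(1.267 * dm2_eV2 * L_m / E_MeV) ** 2

# The deficit grows with baseline for ~4 MeV reactor antineutrinos,
# which is why the far detectors sit O(1500 m) from the cores.
for L in (300, 900, 1800):
    print(L, round(survival_prob(L, 4.0), 4))
```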
3. Daya Bay Reactor Neutrino Experiment
- [Figure: site layout; baselines of 292 m, 465 m, 607 m, 810 m, and 900 m between detector halls and reactor cores]
- Total tunnel length: 3000 m
4. Detecting ν̄ₑ
- Inverse β-decay in 0.1% Gd-doped liquid scintillator
- Antineutrino signal algorithm implemented offline or on an online computer farm
  - Time coincidence
  - Energy correlation
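The time-coincidence idea above can be sketched as a simple delayed-coincidence scan over time-ordered hits. The window and energy cuts below are hypothetical illustration values, not the experiment's selection:

```python
# Hypothetical event records: (time_us, energy_MeV), time-ordered.
def find_ibd_pairs(hits, t_min=1.0, t_max=200.0,
                   prompt_cut=(0.7, 12.0), delayed_cut=(6.0, 12.0)):
    """Pair a prompt positron-like hit with a delayed n-capture-like hit.
    Cut values are illustrative assumptions."""
    pairs = []
    for i, (tp, ep) in enumerate(hits):
        if not (prompt_cut[0] <= ep <= prompt_cut[1]):
            continue
        for td, ed in hits[i + 1:]:
            dt = td - tp
            if dt > t_max:          # hits are time-ordered, so we can stop
                break
            if dt >= t_min and delayed_cut[0] <= ed <= delayed_cut[1]:
                pairs.append(((tp, ep), (td, ed)))
    return pairs

hits = [(0.0, 3.1), (30.0, 7.9), (500.0, 1.0)]  # one IBD-like pair + a lone hit
print(find_ibd_pairs(hits))
```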
5. Requirements of readout electronics
- Readout board designed for all detector systems except the RPCs
- Neutrino detector
  - Charge measurement
    - Dynamic range for each PMT: 0 PE -- 500 PE
      - 50 p.e. is the maximum for a neutrino event
      - 500 p.e. for through-going muons
    - Resolution: < 10% @ 1 p.e., 0.025 @ 400 p.e.
    - Noise: < 0.1 p.e.
    - Digitization time (mainly shaping time): < 1 ms
  - Timing measurement
    - To determine event time and event vertex
    - Dynamic range: 0 -- 500 ns
    - Resolution: < 500 ps
- Muon detector
  - Water pool: same requirements as the neutrino detector
  - Water tracker: hit and/or charge
  - RPC: BESIII electronics
6. Readout board diagram
- On-board calibration circuit
- TDC algorithm: Gray counters
- 16 channel inputs
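Gray counters are used in TDCs because exactly one bit flips per clock tick, so a value latched asynchronously is off by at most one count rather than arbitrarily wrong. A minimal sketch of the encoding and its decode (the FPGA does this in logic; Python here only illustrates the property):

```python
def bin_to_gray(n):
    """Binary -> reflected Gray code: adjacent counts differ in one bit."""
    return n ^ (n >> 1)

def gray_to_bin(g):
    """Gray -> binary, used when the latched counter value is decoded."""
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b

codes = [bin_to_gray(i) for i in range(16)]
# Exactly one bit flips per increment, so a mid-transition latch cannot
# produce a wildly wrong timestamp.
assert all(bin(codes[i] ^ codes[i + 1]).count("1") == 1 for i in range(15))
assert all(gray_to_bin(bin_to_gray(i)) == i for i in range(256))
print(codes)
```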
7. Trigger requirements
- Good background rejection power
  - Rate can go up to kHz
  - Rate limited by DAQ capabilities (hopefully < 10 MB/s per module)
- Low threshold (< 1 MeV)
- Record prompt positron signals and delayed signals from the neutrino interactions
- Record the background to enable background analysis
- High and well-known efficiency
- Flexibility (to fight backgrounds): same trigger board for different detectors
  - FPGA
  - Daughter card
- Reliability (to reduce systematic errors)
- Independence: a separate trigger for each neutrino module and for each muon detector (water pool, water Cherenkov module, and RPC)
- Redundancy (to measure the trigger efficiency)
- Provide a system clock
8. Algorithms
- Central trigger: OR of the following two
  - Energy: total charge > 15 PE
  - Multiplicity: > 15 PMTs fired
- Veto trigger: OR of the following two
  - RPC: > two hits in any plane (scintillator: > 1 hit)
  - Water: > a few PMTs fired
- Prompt and delayed sub-events triggered and recorded independently; time correlation done offline
- Central and veto events triggered and recorded independently; time correlation done offline
- No dead time induced; the trigger rate is limited by electronics recovery time and by DAQ bandwidth
- Trigger types
  - Primary physics trigger
  - LED
  - Radioactive source
  - Periodic trigger
  - Muon trigger
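The central-trigger OR above can be sketched directly. The 0.25 p.e. "fired" threshold per PMT is a hypothetical discriminator level, not from the talk:

```python
def central_trigger(pmt_charges_pe, energy_thresh=15.0, mult_thresh=15):
    """OR of the slide's two conditions: total charge > 15 PE,
    or > 15 PMTs fired (a PMT counts as fired above an assumed
    0.25 p.e. discriminator threshold)."""
    total = sum(pmt_charges_pe)
    n_fired = sum(1 for q in pmt_charges_pe if q > 0.25)
    return total > energy_thresh or n_fired > mult_thresh

print(central_trigger([0.5] * 20))   # multiplicity condition fires
print(central_trigger([8.0, 9.0]))   # energy condition fires
print(central_trigger([0.5, 0.3]))   # neither fires
```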
9. PMT dark current rate
- Max. number of PMTs: 200
- PMT dark rate: 50 kHz
- Integration window: 100 ns
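These numbers explain why the multiplicity threshold of > 15 PMTs suppresses dark-noise triggers: treating dark hits as Poisson, the mean number in one window is 200 × 50 kHz × 100 ns = 1.0, and the tail at ≥ 15 hits is negligible. A quick check (the Poisson model is an assumption, but a standard one for uncorrelated dark counts):

```python
import math

n_pmt, dark_hz, window_s = 200, 50e3, 100e-9
mu = n_pmt * dark_hz * window_s   # mean dark hits per coincidence window
# P(N >= 15) for Poisson(mu): 1 minus the CDF up to 14
p_ge_15 = 1.0 - sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(15))
print(mu, p_ge_15)
```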
10. Trigger rate

| Detector   | Event      | Rate DB (Hz) | Rate LA (Hz) | Rate Far (Hz) | Occ. (%) | Channels    | Size (bits) |
|------------|------------|--------------|--------------|---------------|----------|-------------|-------------|
| ν̄ₑ module | Cosmic-µ   | 36 × 2       | 22 × 2       | 1.2 × 4       | 100      | 224         | 64          |
| ν̄ₑ module | Rad.       | 50 × 2       | 50 × 2       | 50 × 4        | 100      | 224         | 64          |
| Pool       | Cosmic-µ   | 250          | 160          | 13.6          | 50       | 340 (252)   | 64          |
| Tracker    | Cosmic-µ   | 1390         | 819          | 57.8          | 100      | 8           | 64          |
| RPC        | Cosmic-µ   | 260          | 260          | 415           | 10       | 7650 (5040) | 1           |
| RPC        | Rad. noise | 186          | 117          | 10.5          | 10       | 7650 (5040) | 1           |

Site totals: DB 653 kB/s, LA 483 kB/s, Far 419 kB/s (1555 kB/s overall).
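One plausible way per-stream bandwidths like these are built up is rate × channels × occupancy × word size; this is a payload-only sketch, and the table's exact accounting (headers, per-event overhead) may differ:

```python
def stream_kBps(rate_hz, n_ch, occupancy, bits_per_ch):
    """Payload bandwidth of one readout stream in kB/s.
    Ignores any event-header or protocol overhead (an assumption)."""
    return rate_hz * n_ch * occupancy * bits_per_ch / 8.0 / 1000.0

# Example with the water-pool cosmic-muon numbers at the DYB site:
# 250 Hz, 340 channels, 50% occupancy, 64 bits per channel.
print(stream_kBps(250, 340, 0.5, 64))
```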
11. Trigger board
- One board per module
- Same hardware design for the central and veto boards
- Each trigger board can handle up to 256 PMTs
- Decision time → readout event buffer depth
  - Multiplicity trigger based on FPGA: 200 ns
  - Energy trigger based on total charge: 300 ns
- System clock / local clock
12. Trigger scheme
13. Timing
- Each site has a master clock to synchronize the veto and central modules
- A GPS time / 1 PPS / 10 kHz reference will be delivered to each site for an absolute time stamp
- If GPS cannot be used, we can use a local clock; this is a problem for supernova studies
- GPS time precision: 100 ns; time-stamp precision at the 25 ns level
- Each trigger board has a local clock for self-triggering and testing
- [Figure: clock distribution from the mid-site to the DYB, LA, and Far halls]
14. DAQ block diagram
15. Data acquisition and online control
- VME-based front-end hardware with a Motorola PowerPC controller
- RT-Linux RTOS (TimeSys Co. LinuxLink)
- Back-end: Linux PCs; software based on the BESIII/ATLAS framework
- Each detector system and each neutrino module at each site is read out (triggered) independently
- 8 antineutrino streams and 9 muon streams; events are reassembled offline using timestamps
- One neutrino module → one VME crate
- Water pool → one VME crate (near), two VME crates (far)
- Water Cherenkov module system → two VME crates
- Communication
  - Copper between trigger and FEE
  - Twisted-pair cable between PowerPC and readout computer
  - Optical fiber between sites and from site to surface
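Since the 8 antineutrino and 9 muon streams are recorded independently and reassembled offline by timestamp, the offline step is essentially a k-way merge of pre-sorted streams. A minimal sketch with hypothetical records:

```python
import heapq

# Each stream is already time-ordered: (timestamp_ns, stream_name, payload).
# heapq.merge interleaves them into one globally time-ordered sequence,
# which is the precondition for offline time-correlation of events.
ad1 = [(100, "AD1", "prompt"), (30100, "AD1", "delayed")]
pool = [(95, "pool", "muon")]
merged = list(heapq.merge(ad1, pool))
print(merged)
```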
16. Data acquisition and online control
- Online control
  - Local online control in each detector hall: each detector system has its own online control, so detector debugging and commissioning proceed in parallel
  - Global online control in the surface room: operate and monitor all detector systems
- Data storage
  - Data throughput 1.5 MB/s → 0.4 TB/day (safety factor of 3)
  - Local tapes
  - Local disks
- Data transfer
  - Tapes from DYB to IHEP or to Shenzhen Univ.
  - A data link from Shenzhen Univ. to IHEP via network can be discussed
  - A data center at IHEP to be established; raw or processed data tapes will be copied and shipped to other data centers in the world, or distributed via GRID
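The 0.4 TB/day storage figure above follows directly from the 1.5 MB/s throughput and the safety factor of 3:

```python
rate_MBps = 1.5    # site total from the trigger-rate table (~1555 kB/s)
safety = 3
tb_per_day = rate_MBps * 86400 * safety / 1e6   # MB/day -> TB/day
print(round(tb_per_day, 2))  # ~0.39 TB/day, i.e. the quoted 0.4 TB/day
```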
17. One readout VME crate
18. Status
- Simple versions of the readout boards and trigger board are running successfully on the prototype
- We are working on the 2nd version of the readout board and trigger board
- We have finished the conceptual design of the DAQ
- The DAQ group has been formed and has begun work on the project