1
CDMS Computing Project
Don Holmgren
  • Other FNAL project members (all PPD)
  • Project Manager: Dan Bauer
  • Electronics: Mike Crisler
  • Analysis: Erik Ramberg
  • Engineering: Stan Orr, Lou Kula, Rich Schmitt

2
Introduction
  • CDMS (Cryogenic Dark Matter Search, E891): An
    experiment using germanium and silicon detectors
    operated at low temperatures (90 mK) at the
    Soudan Underground Facility to search for the
    low-energy nuclear recoils characteristic of
    interactions with candidate SUSY dark matter
    particles (WIMPs).
  • FNAL CD Personnel: Don Holmgren, 25%
  • Deliverables:
  • Linux device drivers and user-space code to
    control VME and PCI-based instrumentation
    (a sketch follows this list)
  • An event builder integrated with the slow control
    from UCSB
  • Assistance with computing infrastructure (DAQ
    cluster management, code management,
    configuration management)
  • Raw and processed data repositories at FNAL
  • Capability to reprocess raw data at FNAL
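
  A minimal sketch of what such user-space control code might look like,
  assuming a hypothetical character device exported by the Linux driver;
  the device node /dev/vme_digitizer0, the ioctl command, and the trace
  format are illustrative assumptions, not the actual CDMS driver interface.

    // Hypothetical user-space control of one VME digitizer channel through
    // a character device exported by the Linux driver (all names assumed).
    #include <fcntl.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <sys/ioctl.h>
    #include <unistd.h>

    // Assumed ioctl command: arm one digitizer channel for triggers.
    #define VME_IOC_ARM _IOW('v', 1, uint32_t)

    int main() {
        int fd = open("/dev/vme_digitizer0", O_RDWR);   // assumed device node
        if (fd < 0) { perror("open"); return 1; }

        uint32_t channel = 0;
        if (ioctl(fd, VME_IOC_ARM, &channel) < 0) {      // arm channel 0
            perror("ioctl");
            close(fd);
            return 1;
        }

        // Read one digitized trace; the driver is assumed to block until a
        // trigger has been digitized.
        uint16_t trace[4096];
        ssize_t n = read(fd, trace, sizeof(trace));
        if (n > 0)
            printf("read %zd bytes of waveform data\n", n);

        close(fd);
        return 0;
    }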

3
Experiment Status
  • Official data taking began at Soudan October 11
  • Running with 1 tower (3 silicon, 3 germanium
    detectors)
  • 2 weeks of downtime just began (1/12/04)
  • Will start running 2 towers on 1/26/04
  • Various DAQ software upgrades are being tested now
  • Background running until July 2004
  • Then, two months downtime to install 3 more
    towers
  • Significant DAQ expansion required
  • Funded through late 2005
  • Collaboration will apply for a follow-up project

4
DAQ Architecture
  • Hardware Architecture
  • 1 VME crate per two towers of digitizers
  • 1 VME crate for veto
  • Crate PCs connect with the Event Builder PC via
    point-to-point GigE
  • Shared FC (Fibre Channel) disk, mounted read-only
    by the Data Server, which writes the tapes
  • 1 PC dedicated to monitoring, GPIB, High Voltage
    control
  • Software Architecture
  • Event Builder (a simplified sketch follows this
    slide)
  • Written at FNAL
  • Includes various device drivers
  • C++
  • Based on R2DM, which implies dependencies on:
  • ACE
  • ITC
  • thread_util
  • ZMutility
  • Slow Control, etc...
  • Written at UCSB
  • JAVA
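
  A simplified sketch of the event-building step: fragments arriving from
  the digitizer and veto crates are merged by event number and written out
  once every crate has contributed. The class names and fragment layout are
  assumptions for illustration; the actual event builder is the R2DM/ACE
  based code described above.

    // Sketch of fragment merging by event number (names and layout assumed).
    #include <cstddef>
    #include <cstdint>
    #include <map>
    #include <vector>

    // One readout fragment from a single VME crate.
    struct Fragment {
        std::uint32_t eventNumber;        // trigger number stamped by the crate
        std::uint32_t crateId;            // which crate produced this fragment
        std::vector<std::uint8_t> data;   // digitized waveforms, veto hits, ...
    };

    class EventBuilder {
    public:
        explicit EventBuilder(std::size_t nCrates) : nCrates_(nCrates) {}

        // Called for each fragment received from a crate PC (over the
        // point-to-point GigE links in the real DAQ).
        void addFragment(const Fragment& f) {
            auto& frags = pending_[f.eventNumber];
            frags.push_back(f);
            if (frags.size() == nCrates_) {          // every crate has reported
                writeEvent(f.eventNumber, frags);    // event is complete
                pending_.erase(f.eventNumber);
            }
        }

    private:
        void writeEvent(std::uint32_t, const std::vector<Fragment>&) {
            // Placeholder: the real system writes assembled events to the
            // shared FC disk for the Data Server to copy to tape.
        }

        std::size_t nCrates_;
        std::map<std::uint32_t, std::vector<Fragment>> pending_;
    };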

5
JAVA GUI samples from UCSB Run Control
6
Data Repository
  • Collaboration has requested data storage at FNAL
  • Data rates (a rough cross-check follows this
    slide):
  • Background, 1 tower, 0.1 Hz: 30 GB/month
  • Background, 2 towers, 0.1 Hz: 60 GB/month
  • Calibration, 1 tower, 15 Hz: 6.5 GB/hour
  • Calibration, 2 towers, 7 Hz: 6.5 GB/hour
  • Sept-early Dec data has been copied over the
    Internet from Stanford to Enstore
  • Only 30 GB total so far
  • Likely a few TB total by end of 2005
  • Use SDLT-320 tapes for raw and interchange data
  • Have started importing remaining data directly
    from tape
  • Will use test stand on WH11NW as tape import
    facility
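
  A rough cross-check of the rates above, assuming about 2.6 x 10^6 seconds
  of running per month: the quoted volumes correspond to an average written
  size of roughly 100 KB per event, the same order as the ~80 KB/event raw
  size quoted on the Data Processing slide.

    \[
      \frac{30\ \mathrm{GB/month}}{0.1\ \mathrm{Hz}\times 2.6\times10^{6}\ \mathrm{s/month}}
        \approx 115\ \mathrm{KB/event},
      \qquad
      \frac{6.5\ \mathrm{GB/hour}}{15\ \mathrm{Hz}\times 3600\ \mathrm{s/hour}}
        \approx 120\ \mathrm{KB/event}
    \]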

7
Data Processing
  • Existing analysis framework is based on MATLAB
  • Collaboration operates modest analysis clusters
    at Stanford and at Soudan (8 machines each)
  • We would benefit greatly from the ability to
    reprocess data on FNAL Farms (or other cluster)
  • Requirements for Run 18 (just ended):
  • 5E5 background events, 1E6 calibration events
  • Raw data: 80 KB/event
  • RQs: 9 KB/event
  • RRQs: 1.5 KB/event
  • MATLAB is not an option (too many licenses would
    be needed)
  • However, can use compiled pipeline code (SLAC
    license)
  • CPU requirement: about 4.5 seconds/event
  • 2000 CPU hours for all of Run 18 (see the
    estimate after this list)
  • Probably 2 full reprocessings between now and
    June
  • Testing compiled pipeline codes now using spare
    cycles on old (non-SciDAC) lattice QCD cluster
  • Paul Mackenzie is willing to host reprocessing
    activity
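
  The 2000 CPU-hour figure follows from the event counts and per-event CPU
  time quoted above:

    \[
      (5\times10^{5} + 1\times10^{6})\ \mathrm{events}\times 4.5\ \mathrm{s/event}
        \approx 6.8\times10^{6}\ \mathrm{s}
        \approx 1900\ \mathrm{CPU\ hours}
    \]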

8
Conclusions and Planned Work
  • Have just ended a successful run with 1 tower of
    detectors, and will soon start a 6-month run with
    2 towers
  • DAQ software development is in maintenance mode
    until July, when additional work will be required
    to support 3 more towers (2 man-months)
  • We do depend on some FNAL software products:
  • R2DM, ITC, thread_util, ZMutility
  • Also one commercial package: ORBacus
  • Have not encountered a bug in these to date
  • Will freeze DAQ systems at RH 7.3
  • Do not anticipate requesting support
  • Data import into Enstore has started
  • Over the next few months we will establish the
    ability to do data reprocessing at FNAL