GLAST LAT Collaboration Meeting
1
Recon at the Single Crystal Level
  • STUFF I'M NOT GONNA TALK ABOUT (Zach covered this)
  • Which calibGenCAL executables read which files from the Online suites to generate which calib.xml files
  • What the different calib.xml files are
  • STUFF I WILL MUMBLE ABOUT BRIEFLY
  • How we load them into the metadata database
    (rdb)
  • How to make your jobOptions file pick them up
  • Various and sundry.
  • STUFF I WILL TALK ABOUT
  • How the data looks after it's been through this whole mill.
  • More specifically, extrapolate the TKR track into the CAL to check the CAL positions and energies at the single-crystal level.

2
Mumblings inspired by Zach's talk (1 of 2)
  • What we decided about flavor and instrument usage in the rdb database -- see Heather's slides 8-9 from the Feb 24 Ground Software meeting,
  • http://www-glast.slac.stanford.edu/software/meetings/FlightIntSAS-20050224.pdf
  • In addition to what Zach explained about calibGenCAL, see Mark Strickman's eloquent document at
  • http://heseweb.nrl.navy.mil/glast/CAL_ATDP/FM107/SummaryData/LAT-TD-05595-01%20calibGenCAL_v3_description%20init%20release.pdf
  • I also found it by searching LAT docs:
  • https://oraweb.slac.stanford.edu:8080/pls/slacquery/bbrdownload/LAT-TD-05595-01-calibGenCAL_v3_description_init_release.pdf?P_FRAME=GLAST&P_DOC_ID=16716

3
Mumblings (2 of 2)
  • If you want to do the recon yourself and control which calibration constants get used, as Zach said, he provides
  • CalXtalResponse/src/defaultOptions.txt
  • Anders provides a broader range of examples in
  • LatIntegration/HEAD/src/jobOptions/pipeline/*.txt
  • Look at your job's output (with the info OutputLevel set low enough); Joanne prints out which file(s) she found and used.
  • As Zach said, there are amusing flavor names in calib_user, which is a user test database. But in calib we only use adult, professional flavor names (like vanilla).
  • Eduardo said I'd tell you about Fred's wonderful trending tools. If you insist, I could comment on the 14 and 21 Jan 2005 presentations that I made to Ed's Friday Instrument Analysis meeting. (One slide in the present talk.)
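For orientation, a jobOptions fragment selecting a calibration flavor might look like the sketch below. The service and property names here are my assumption, not confirmed; copy the real lines from CalXtalResponse/src/defaultOptions.txt or from the LatIntegration pipeline examples above.

```
// Hypothetical sketch only: the service/property names are assumed,
// not taken from the actual GLAST code. Use the real lines from
// CalXtalResponse/src/defaultOptions.txt.
CalibDataSvc.CalibFlavorList = { "vanilla" };  // choose a calibration flavor
MessageSvc.OutputLevel = 3;  // INFO, low enough that the chosen calib files get printed
```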

4
FRED event display
Require one and only one TKR track. Use the direction cosines from the end of the track.
Run 140001339
5
Code, files, cuts
  • Am using RootTreeAnalysis.cxx on the recon (and digi) files of run 140001339: 102,000 events. (Also looked at 140001338 and 140001340.)
  • For Monte Carlo, using 180,000 surface muons from Anders et al.
  • Require exactly one track. Extrapolate into the CAL and predict hit positions for each CAL layer. Compare with the CAL position from CalXtalRecCol. Presently require 8, 9, or 10 CAL hits, which is not a very good cut -- work in progress.
  • Extrapolation algorithm from Benoit Lott.
  • Recall that on even (odd) layers you see the end (length) of the CsI log.
  • (Top layer is read out at the X end; light asymmetry gives the Y coordinate.)

6
Extrapolated TKR vs. measured CAL positions: real data, X direction
7
Extrapolated TKR vs. measured CAL positions: real data, Y direction
8
Extrapolated TKR vs. measured CAL positions: Monte Carlo simulation, X direction
9
Extrapolated TKR vs. measured CAL positions: Monte Carlo simulation, Y direction
10
Single channel energy distributions
Require that either dx < 20 mm or dy < 20 mm (not a very good cut either).
Fit a Landau, keep the MPVs. Here, real data; the Monte Carlo simulation looks similar.
Multiply by |Zdir| (to correct for the zenith angle).
Fit between 7 and 20 MeV.
11
Landau Most Probable Values
Run 140001339: 10.9 (10.7 to 11.1) MeV
Monte Carlo simulation: 11.1 (11.05 to 11.15) MeV (calibrations based on 11.2 MeV)
Gains from NRL, not the flight-version TEM/TPS. The slightly different 3.3 volts makes a 1% gain change. An even bigger pedestal shift, but we re-did the pedestals. Also, the issue of a wrong delay value.
12
Trending output after Power Supply swap
Page 31 of Fred Piron's and Sylvain Guiriec's CPT Trending output, http://www.slac.stanford.edu/dsmith/FM104_Sbay_HistSumm.pdf
13
Single channel energy distributions -- Monte Carlo
Here, real data; the Monte Carlo simulation looks quite similar. Fit a Landau, keep the MPVs.
Sigmas for these channels: real data 0.96, 0.76, 0.72, 0.85 MeV; MC 0.63, 0.60, 0.67, 0.69 MeV. This was noticed in the GSI data. Work in progress.
14
Occupancy of single CDEs
  • Number of entries for each of the single-crystal energy distributions.

Qualitatively similar to the simulation. Will start quantifying missed and extra hits.
(2D occupancy plot, layer vs. column.)
15
About LAC and FLE thresholds
  • 25 DAC counts is what, after all? Warren said it will soon go into the config report.
  • Below, from Mark Strickman.

16
Conclusions
  • 2% accuracy straight out of the box -- hats off to Eric, Sasha, Zach!
  • Much of the 2% discrepancy is understood.
  • Tomorrow I&T will run calibDac, calibGen, muonCalib -> update the database and iterate.
  • Improve my CsI log selection (Benoit's is better; see his talk).
  • Look at tracks near the log ends (direct illumination, particle in photodiode, etc. -- Andrey).
  • Look at trigger bits using the TKR extrapolation (Luis).
  • Tally missed and unexpected hits.
  • Various fun studies conceivable -- ask me for weird ideas.
  • Et cetera and so forth (und so weiter und so fort).