1
GLAST Large Area Telescope: Science Analysis
Systems and Collaboration Computing Needs
Robert Cameron, Richard Dubois
Stanford Linear Accelerator Center
2
Outline
  • SAS Overview
  • Service Challenge update
  • Support for LAT Collaboration Science Groups
  • Support for ISOC Operations Testing
  • Computing Resource Projections

3
SAS: Fusion of HEP & Astro
Science Tools
  • Collection of tools for detection and
    characterization of gamma-ray sources (point
    sources and extended sources)
  • source finding
  • max likelihood fitting (binned/unbinned; a fit
    sketch follows below)
  • parameterized instrument response
  • exposure maps
  • comparisons to model (observation sim)
  • periodicity searches, light curves
  • Science Tools are FITS/FTOOLS based
  • for dissemination to astro community
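As an illustration of the unbinned likelihood idea above, here is a minimal
Python sketch that fits a power-law spectral index to a list of photon
energies. The energy range, the fake photon sample, and all names are
assumptions for illustration; this is not the actual Science Tools
implementation, which also folds in the instrument response and exposure.

  import numpy as np
  from scipy.optimize import minimize_scalar

  E_MIN, E_MAX = 0.1, 100.0   # GeV; assumed fit range

  def powerlaw_pdf(energy, index):
      # Normalized power law dN/dE ~ E**(-index) on [E_MIN, E_MAX] (index != 1)
      norm = (1.0 - index) / (E_MAX**(1.0 - index) - E_MIN**(1.0 - index))
      return norm * energy**(-index)

  def neg_log_likelihood(index, energies):
      # Unbinned negative log-likelihood summed over the photon list
      return -np.sum(np.log(powerlaw_pdf(energies, index)))

  # Fake photon sample drawn from an index-2 power law (inverse-CDF sampling)
  rng = np.random.default_rng(0)
  u = rng.uniform(size=2000)
  energies = 1.0 / (1.0 / E_MIN - u * (1.0 / E_MIN - 1.0 / E_MAX))

  fit = minimize_scalar(neg_log_likelihood, bounds=(1.1, 3.5),
                        args=(energies,), method="bounded")
  print("best-fit spectral index:", fit.x)   # should come out close to 2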

Event Interpretation
  • full code development environment on linux,
    windows (mac imminent); code and data
    distribution; automated code builds;
    documentation, etc.
[Event display: full simulation/reconstruction of a 1 GeV gamma]
4
Data Challenges
Data challenges provided excellent testbeds for
science analysis software. Full observation,
instrument, and data processing simulation. Team
uses data and tools to find the science.
Truth revealed at the end.
  • A progression of data challenges.
  • DC1 in 2004: 1 simulated week of all-sky survey.
  • find the sources, including GRBs
  • a few physics surprises
  • DC2 in 2006, completed in June.
  • 55 simulated days (1 orbit precession period) of
    all-sky survey.
  • First generation of LAT source catalogue
  • Added source variability (AGN flares, pulsars);
    lightcurves and spectral studies; correlations
    with other wavelengths; add GBM; study detection
    algorithms; benchmark data processing/volumes/reliability.
  • 200k batch jobs - worked out reliability issues
    (< 0.1% failure rate now)

5
Post DC Service Challenge
  • Coordinate simulation studies
  • will need a common set of simulations plus a
    near-constant stream of simulations to support
    special studies. Develop capabilities outside
    SLAC as needed using collaboration resources.
  • Operations readiness testing coordinated with the
    mission-level End-To-End tests.
  • leverage off the ETE tests for internal-to-LAT
    readiness
  • a sequence of service challenges suits readiness
    testing better than it suits the systematic
    studies organized by science topic.
  • Organize by area
  • Science groups, led by LAT Analysis Coordinator
  • ISOC, led by ISOC managers

6
SC Work to be Done & Responsibilities
  • Systematic Sensitivity Studies
  • point sources, extended sources, transients,
    upper limits
  • diffuse analyses
  • variability (incl. pulsars)
  • neighboring sources
  • flaring & diffuse effects
  • focus on 1st papers' analyses
  • Operations Readiness Tests
  • digital data problems
  • instrument problems (bad channels, wrong rates,
    recognizing a few wrong constants, ...)
  • Automated science processing
  • receiving data dumps, running the pipeline,
    benchmarking resources and times, reliability
  • idiosyncrasies vs. problems
  • day(s) in the life
  • performance monitoring
  • documentation

Responsibilities: ISOC; Analysis Coordinator and Science groups
  • Other Studies
  • PSR (handoff review) performance
  • analysis tuning (signal/bkgd, quality knobs by
    topic)
  • update simulation (s/c model, tune from beam test
    and IA data)
  • first light observations (simulate point, then
    scan); early ops analyses
  • effects of burst repoints
  • sky survey strategy checks
  • background fluxes evaluation in early ops

Responsibility for Other Studies: CA group and ISOC jointly;
collaboration participation needed
7
SC Connection to Science Groups
  • Several datasets have been identified for the
    Science Groups' use
  • 1 year survey simulation, using obssim science
    tool (completed)
  • Fast Monte-Carlo with parameterized responses and
    efficiencies (see the sketch at the end of this
    slide)
  • Early testing of sky model; an opportunity for
    Science Groups to exercise analyses on a realistic
    sky with long observations
  • 55 day simulation using GLEAM (LAT event
    simulator) (imminent!)
  • Full simulation
  • Earth occultation not currently in exposure
    calculations
  • Autonomous Repoint Requests
  • LAT/SC misalignment
  • Background interleave for pointed observations
  • 1 year GLEAM simulation
  • Final pre-launch science performance
  • Potentially huge backgrounds run needed
  • Targeting use of Lyon, Italian computer farms
  • Plus a few smaller scale specialty runs as needed
  • Plan on delivering obssim and 55 day runs for the
    end-July LAT collaboration meeting; 1 yr Gleam
    run in August.
  • Milestone for next versions of Data Catalogue,
    LAT Data Servers
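For concreteness, a rough sketch of the "fast Monte-Carlo with parameterized
responses" idea referenced above: fold an assumed source flux through an
assumed average effective area and livetime, then Poisson-fluctuate the
counts. The fluxes, effective area, livetime fraction, and efficiency below
are placeholders, not LAT performance numbers.

  import numpy as np

  rng = np.random.default_rng(42)

  def expected_counts(flux_cm2_s, eff_area_cm2, livetime_s, efficiency=0.85):
      # Mean counts = flux x effective area x livetime x overall efficiency
      return flux_cm2_s * eff_area_cm2 * livetime_s * efficiency

  # Placeholder source list: (name, integral flux above threshold in ph cm^-2 s^-1)
  sources = [("src_A", 1.0e-7), ("src_B", 3.0e-8)]
  EFF_AREA = 8000.0              # cm^2, assumed average effective area
  LIVETIME = 55 * 86400 * 0.5    # ~55 days of survey with an assumed 50% livetime fraction

  for name, flux in sources:
      mu = expected_counts(flux, EFF_AREA, LIVETIME)
      counts = rng.poisson(mu)
      print(f"{name}: expected {mu:.1f} photons, simulated {counts}")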

8
SC Operations Testing for the ISOC
  • Strategy defined at the ISOC Science Operations
    and Service Challenge workshop
  • Use ETE tests for control room type functions
  • Shift log, Level1 pipeline, Data Catalogue,
    Monitoring
  • SAS products
  • Use simulations to prep for ETE and provide
    realistic science data, extended running
  • Simulate Level0 science data
  • Prep for ETE Level1 pipeline usage
  • Realistic science data for response
    distributions, resource usage, latencies, etc.
  • Downlink simulations for instrument readiness
    tests, such as calibrations, failed sensors etc.
  • Demonstrated, but need background interleave
    scheme for big datasets
  • 55 day, 1 year orbit runs
  • Extended run to test Automated Science Processing
  • Time trending of instrument quantities (see the
    sketch below)
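A minimal sketch of the time-trending check mentioned in the last bullet:
compare each interval of an instrument quantity to a trailing baseline and
flag large excursions. The quantity (a fake trigger-rate history), window,
and threshold are assumptions for illustration, not ISOC monitoring code.

  import numpy as np

  def flag_excursions(values, window=24, n_sigma=5.0):
      # Return indices whose value deviates from the trailing-window mean by > n_sigma
      flagged = []
      for i in range(window, len(values)):
          baseline = values[i - window:i]
          mu, sigma = baseline.mean(), baseline.std()
          if sigma > 0 and abs(values[i] - mu) > n_sigma * sigma:
              flagged.append(i)
      return flagged

  # Fake hourly rate history with one injected step
  rng = np.random.default_rng(1)
  rates = rng.normal(2200.0, 30.0, size=500)   # Hz, placeholder values
  rates[400:] += 500.0                          # simulated rate jump
  print("flagged intervals:", flag_excursions(rates)[:5])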

9
Simplified Diagram for ISOC Data Flow (testable with simulations)
[Data flow diagram; boxes marked as testable with simulations in the
original: MOC and FastCopy; Data Receiving (ingest L0 data; merge events,
extract context, extract EBF); L0 Archive; Calibrations and L1 processing
(get calibration from DB, create digi files, create recon files and perform
event classification, create data analysis ntuples, analyze charge injection
data (LCI), analyze calibration data (LPA)); MOOD/MOOT (config DB);
Automated Science Processing; output data products to LAT Collaboration
and GSSC.]
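To make this flow checkable in software, the boxes can be written down as a
small dependency graph; the sketch below uses Python's standard graphlib to
print one valid processing order. The edges are a simplified reading of the
diagram above, not an authoritative description of the ISOC pipeline.

  from graphlib import TopologicalSorter   # Python 3.9+

  # stage -> stages it depends on (simplified, assumed edges)
  flow = {
      "ingest L0 data": [],
      "L0 archive": ["ingest L0 data"],
      "merge events / extract context / extract EBF": ["ingest L0 data"],
      "create digi files": ["merge events / extract context / extract EBF"],
      "create recon files + event classification": ["create digi files"],
      "create data analysis ntuples": ["create recon files + event classification"],
      "automated science processing": ["create data analysis ntuples"],
      "output products to LAT Collaboration and GSSC": ["automated science processing"],
  }

  print(" -> ".join(TopologicalSorter(flow).static_order()))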
10
Data Access: LAT Data Portal
  • Provide collaboration access to both summary
    photon data and full digi/recon/(MC) data
  • Provide data in both FITS and Root format
  • Main components
  • Astro Data Server
  • select events based on position in the sky,
    energy, time, or event class
  • Data Skimmer
  • Select events based on a TCut; able to access the
    full merit tuple (400 columns); see the sketch
    below
  • Access full data for list of runs/events
  • Event Display (WIRED)
  • View detailed detector response for list of
    runs/events
  • The Data Catalogue is the underpinning, with all
    the dataset bookkeeping
  • File locations
  • Flexible user definable meta data

http://glast-ground.slac.stanford.edu/DataServer/dc2/
Under revision for the 55 day run
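A hedged sketch of the Data Skimmer idea using PyROOT: apply a TCut-style
selection string to a merit tuple and write the surviving events to a new
file. The file name, tree name, and cut variables below are placeholders
for illustration, not the portal's actual interface.

  import ROOT

  infile = ROOT.TFile.Open("merit.root")    # hypothetical input merit file
  tree = infile.Get("MeritTuple")            # assumed merit tree name

  outfile = ROOT.TFile("merit_skim.root", "RECREATE")
  # TCut-style selection over the ~400-column merit tuple (placeholder variables)
  skim = tree.CopyTree("CTBBestEnergy > 100 && CTBClassLevel >= 1")
  skim.Write()

  print("selected", skim.GetEntries(), "of", tree.GetEntries(), "events")
  outfile.Close()
  infile.Close()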
11
Current Computing Resources at SLAC
  • Starting 3rd year of projected annual $300k
    Capital Equipment Projects
  • Supplying batch farm disk & CPU, as well as
    dedicated servers
  • Optimize purchases based on best deals SCCS can
    come up with
  • 150 TB disk (45 TB still available)
  • LAT Commissioning
  • DC2/SC
  • LAT Beamtest
  • Infrastructure needs (code builds, system tests,
    user disk)
  • Tremendous use of SLAC Batch farm
  • 160 cores (40 dual core, dual CPU boxes) owned by
    GLAST
  • Leveraged to > 300 cores during extended
    simulation runs
  • Will have 400 cores at SLAC
  • Not looking good for quad core CPUs to be
    available this year

12
Known Liens on Resources
  • GLAST/LAT Data taking at General Dynamics
  • EMI/EMC testing is underway for 30 days
  • Thermal-vacuum testing in late summer, 40 days
  • 55 Day run
  • One week processing time, 5 TB disk
  • Service Challenge 1-year run in August
  • 40 days running @ 300 cores
  • 30 TB disk (Note disk space needs are reduced
    compared to on-orbit data taking due to use of
    background interleave scheme)
  • ETEs
  • Small data volumes on this scale
  • Launch
  • Estimate 100 cores needed to process a 3-hour
    downlinked dataset in about an hour (see the
    sketch below)
  • 400 cores will provide a pool of cores for prompt
    processing, monitoring, reprocessing, and
    simulations
  • Will order 150 TB disk to be on hand at launch
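A back-of-envelope check of the "100 cores for a 3-hour downlink in about an
hour" estimate. Only the downlink length and the one-hour target come from
this slide; the event rate and per-event CPU time below are placeholders
chosen to roughly reproduce the quoted figure.

  EVENT_RATE_HZ = 400.0          # assumed average downlinked event rate
  SEC_PER_EVENT = 0.08           # assumed CPU seconds per event for Level 1 processing
  DOWNLINK_HOURS = 3.0           # from the slide
  TARGET_WALLCLOCK_HOURS = 1.0   # from the slide

  events = EVENT_RATE_HZ * DOWNLINK_HOURS * 3600.0
  cpu_hours = events * SEC_PER_EVENT / 3600.0
  cores_needed = cpu_hours / TARGET_WALLCLOCK_HOURS
  print(f"{events:.0f} events -> {cpu_hours:.0f} CPU-hours -> ~{cores_needed:.0f} cores")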

13
Computing Planned Acquisitions
  • SLAC
  • Order for 50 TB disk and 240 cores in process
  • ship date is end of June (from Sun), to be
    installed mid-July
  • Additional 150 TB to be acquired for launch
    readiness
  • Funds from SLAC LAT Operations Collaboration
    Fund
  • Univ of Washington (in use now)
  • 100 physics dept lab CPUs on cycle-available
    basis (when students are not using them)
  • Used for CPU intensive simulations
  • Lyon
  • IN2P3 is providing 100 CPUs, 50 TB disk
  • porting LAT processing pipeline infrastructure
    (Pipeline2) to Lyon now
  • CNAF
  • INFN has submitted proposal for 100 CPUs, 25 TB
    (in 07)
  • Approved
  • Will need to be accessed via GRID tools

14
What to Take Away
  • Service Challenge & End-To-End tests
  • Being used to hone the tools, complete
    development and test end-to-end operations
  • Computing resources: a prudent approach is being
    implemented
  • Acquire 400 cores at SLAC available for GLAST
  • Lesson learned from the 5-ring circus of DC2, LAT
    Beam Test, I&T
  • Keep full event details on disk in '08: 175 TB
  • GLAST will do better science the more compute
    power it has access to
  • Have not hit the plateau yet!
  • Extending LAT processing pipeline to France and
    Italy