Information Discovery from Automated Analyses of Stellar Spectra (PPT transcript)
1
Information Discovery from Automated Analyses
of Stellar Spectra
Simon Jeffery Armagh Observatory
2
Rationale
  • Data inundation e.g. Spectra for star clusters,
    extragalactic populations, star-forming regions,
    ...
  • Analysis is costly
  • Analysis technology is difficult to implement
  • Legacy data better interpreted by tomorrow's
    physics
  • Full exploitation of the information explosion
    will only be achieved through e-science
    initiatives that properly address the
    requirements of data analysis

3
Definitions
  • Data Reduction - the conversion of raw
    counts(pixel) to calibrated flux(wavelength)
    data
  • Data Analysis - the extraction of physical
    information from reduced data
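As a toy illustration of the reduction step, a minimal sketch; the bias level, flat field and dispersion values are invented for illustration, not taken from the presentation:

```python
import numpy as np

# Hypothetical raw CCD data: counts per pixel (invented values).
raw_counts = np.array([120.0, 450.0, 460.0, 130.0])
bias = 100.0                                  # assumed detector bias level
flatfield = np.array([1.0, 0.9, 1.1, 1.0])    # assumed pixel sensitivity map

# Data reduction: counts(pixel) -> calibrated flux(wavelength).
wavelength = 4000.0 + 2.5 * np.arange(raw_counts.size)  # assumed dispersion (Angstrom)
flux = (raw_counts - bias) / flatfield                   # bias-subtracted, flat-fielded

# Data analysis would begin here: extracting physical information
# (temperature, gravity, composition, ...) from (wavelength, flux).
```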

4
Data Acquisition
(flow chart) Science Goal? → Object Identifier(s) → Required observable? (wavelength, resolution, signal-to-noise) → Archive Data / New Observations → Data Reduction → Sufficient Data? → Data Analysis
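The acquisition flow on this slide can be sketched as a loop; every function below is a placeholder stand-in, not software named in the presentation:

```python
# Sketch of the data-acquisition decision flow. All functions are toy
# placeholders illustrating the shape of the loop.

def query_archive(target):
    # Pretend the archive holds spectra for even-numbered targets only.
    return {"target": target, "counts": [200, 210]} if target % 2 == 0 else None

def observe(target):
    # Otherwise request new observations from the telescope.
    return {"target": target, "counts": [150, 160]}

def reduce_data(raw):
    # counts(pixel) -> flux(wavelength); trivial placeholder calibration.
    return {"target": raw["target"], "flux": [c - 100 for c in raw["counts"]]}

def sufficient(spectra, n_required):
    # Real criteria: wavelength coverage, resolution, signal-to-noise.
    return len(spectra) >= n_required

def acquire(targets, n_required):
    reduced = []
    for target in targets:
        raw = query_archive(target) or observe(target)  # archive first, else telescope
        reduced.append(reduce_data(raw))
    assert sufficient(reduced, n_required), "need more data before analysis"
    return reduced  # ready for data analysis

spectra = acquire([1, 2, 3], n_required=3)
```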
5
Data Objectives
  • Spectral Type
  • Radial Velocity
  • SED
  • Extinction
  • Temperature
  • Composition
  • Gravity
  • Rotation
  • Binaries
  • Variable?
  • Spots
  • Flares
  • Evolution
  • Pulsation
6
Data Analysis
(flow chart) Physical Parameters + Atomic Data → Model Atmospheres → libraries of model structures → Radiative Transfer → libraries of model spectra → (with Observational Data) → Fitting Software → Physical Properties

ATOMIC DATA: The interpretation of astronomical spectra is only as good as the atomic data it uses. Most radiative-transfer codes require large volumes of atomic data.

RADIATIVE TRANSFER CODES: The choice of model atmosphere and/or radiative transfer code depends on the project. All are expensive to install and maintain, and most are costly to run, requiring large amounts of memory, CPU and atomic data.

COMPUTING COST: Depending on the type of model, the cost of computing one spectrum ranges from a few seconds on a 1 GHz PC to several days on HPCs.

MODEL LIBRARIES: To process large volumes of data, robust fitting techniques are essential. These require large libraries of models spanning a substantial parameter space (~100 GB). Real-time radiative-transfer calculations are prohibitively expensive.
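The library-based fitting described above can be sketched as a chi-square search over a precomputed grid; the grid, the "observed" spectrum and the noise level below are synthetic, and a real library spans ~100 GB rather than four models:

```python
import numpy as np

# Hypothetical precomputed library: model spectra on a (Teff,) grid.
teff_grid = np.array([10000.0, 15000.0, 20000.0, 25000.0])
wavelengths = np.linspace(4000.0, 5000.0, 50)

def model_spectrum(teff):
    # Stand-in for an expensive radiative-transfer calculation; in practice
    # these spectra are computed once and stored in the library.
    return 1.0 - 0.2 * np.exp(-((wavelengths - 4340.0) / (teff / 1000.0)) ** 2)

library = np.array([model_spectrum(t) for t in teff_grid])

# Synthetic "observed" spectrum: a 20000 K model plus noise.
rng = np.random.default_rng(0)
sigma = 0.001
observed = model_spectrum(20000.0) + rng.normal(0.0, sigma, wavelengths.size)

# Robust grid fitting: pick the library entry minimizing chi-square.
chi2 = np.sum(((library - observed) / sigma) ** 2, axis=1)
best_teff = teff_grid[np.argmin(chi2)]
```

The grid search costs only a vectorized residual sum per library entry, which is why precomputing the library makes large-volume processing tractable.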
7
Automated Analysis of Hot Star Spectra
(flow chart) Inputs: high-resolution spectrograms; UV and visual spectrophotometry; atomic data.
  • STERNE → model structure grid
  • SPECTRUM → model flux grid
  • TFIT (UV and visual spectrophotometry) → Teff, E(B-V), q
  • LTE_LINES → high-resolution model grid
  • SFIT (high-resolution spectrograms) → Teff, g, n_x, v sin i
  • SFIT_SYNTH → v_t, composition
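A TFIT-like step can avoid real-time radiative transfer by interpolating within a precomputed flux grid; a minimal one-parameter sketch, with invented grid values and a single invented observable:

```python
import numpy as np

# Hypothetical model flux grid: an integrated flux ratio tabulated against
# Teff (monotonic, invented numbers for illustration).
teff_grid = np.array([20000.0, 22000.0, 24000.0, 26000.0])
flux_ratio_grid = np.array([0.80, 0.95, 1.10, 1.25])

# Observed flux ratio (e.g. derived from UV/visual spectrophotometry).
observed_ratio = 1.02

# Interpolate the grid to estimate Teff without rerunning the r/t code.
teff_fit = np.interp(observed_ratio, flux_ratio_grid, teff_grid)
```

Real fits use many wavelength points and several parameters at once, but the principle is the same: interpolation in a stored grid replaces a fresh model calculation.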
8
Spectral Fine Analysis of V652 Her
  • Observing requirement: 55 spectra, S/N ~ 100,
    3 hours on the WHT
  • Computing requirement (using a 600 MHz Alpha):
  • Physics: LTE, plane-parallel hydrostatic
    equilibrium
  • Model grid: 1 composition, 90 models (3 days)
  • Spectrum grid: 1150 lines, 1 composition, 1
    microturbulence (0.5 day), x4
  • Automatic fits for Teff, log g: 30 minutes for
    55 spectra
  • Automatic fit for 3 chemical abundances: 1 hour
    for 1 spectrum
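Taking the figures quoted on this slide at face value, a rough compute budget can be tallied; interpreting "x4" as four spectrum grids (one per microturbulence value) is an assumption:

```python
# Rough compute budget in days on the quoted 600 MHz Alpha.
# Figures are from the slide; the factor of 4 is assumed to mean
# four spectrum grids, one per microturbulence value.
model_grid_days = 3.0            # 90 model structures, 1 composition
spectrum_grid_days = 0.5 * 4     # 1150 lines per grid, x4 microturbulences
fit_teff_logg_days = 0.5 / 24    # 30 minutes for all 55 spectra
fit_abundances_days = 1.0 / 24   # 1 hour for 1 spectrum

total_days = (model_grid_days + spectrum_grid_days
              + fit_teff_logg_days + fit_abundances_days)
```

The point the slide makes survives any reading of "x4": grid construction dominates the cost, while the automated fits themselves take minutes to hours.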

9
Network Implementation
(flow chart) Distributed nodes linked over the network:
  • Applications nodes: science goals in, interrogation software, fitting software, science outputs out
  • Data archive nodes: dbase server, data archive, new observations from the telescope
  • Modelling nodes: dbase server, modelling software, HPC, model archive, atomic data
10
Conclusions
  • Data imported from telescopes, Virtual
    Observatory, database servers
  • Modelling software must run on HPCs (CPU,
    memory, atomic data)
  • Modelling software (e.g. ATLAS, TLUSTY, PHOENIX)
    need not be at the same sites as the data (it is
    maintained by different specialists)
  • The scientist controls the selection of
    databases to interrogate, datasets to analyse
    and methods of analysis, but NOT the location of
    the computing
  • Requirements:
    • a harness for linking software to the GRID
    • new protocols for grids of theoretical models
    • additional automated fitting applications
  • Expertise in the use and development of existing
    radiative transfer programs, and in collaborative
    development of a distributed software library,
    from Armagh Observatory and a CCP7 derivative
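One listed requirement, new protocols for grids of theoretical models, could amount to a simple query interface; the request shape, field names and grid name below are invented for illustration and are not an existing protocol:

```python
import json

# Hypothetical request to a model-grid server: ask for the synthetic
# spectrum nearest a requested point in parameter space. None of these
# field names come from the presentation; they illustrate what such a
# protocol must carry.
request = {
    "grid": "sterne-lte",  # which library of models (assumed name)
    "parameters": {"teff": 22000.0, "logg": 3.5, "microturbulence": 5.0},
    "wavelength_range": [4000.0, 5000.0],
    "format": "flux_vs_wavelength",
}

def handle(request_json):
    """Toy server side: validate the request and report which grid point
    it would serve. A real service would locate and stream the model."""
    req = json.loads(request_json)
    assert {"grid", "parameters", "wavelength_range"} <= req.keys()
    return {"grid": req["grid"], "served": req["parameters"]}

response = handle(json.dumps(request))
```

A protocol of this shape would let the fitting software run anywhere while the model archives and HPCs stay with the specialists who maintain them, matching the conclusions above.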