1
Australian Virtual Observatory A distributed
volume rendering grid service
  • Gridbus 2003
  • June 7, Melbourne University
  • David Barnes
  • School of Physics, The University of Melbourne

2
Overview
  • what is a virtual observatory?
  • astronomy data cubes 101
  • volume rendering
  • distributed data volume rendering
  • turning it into a grid service
  • future projects

3
Virtual observatories
  • bring legacy astronomy archives on-line and
    ensure future project compliance
  • describe data fully, and support a finite,
    well-chosen set of interoperability protocols
  • develop tools and interfaces to find, acquire,
    process and visualise data
  • build national and international grids and embed
    the data, tools and interfaces in those grids

4
Astronomy data cubes 101
  • you may have only seen 2d astronomy images
  • an increasing number of telescopes and
    simulations produce multi-dimensional data
  • astronomy data cubes are 3d arrays of pixels
    (voxels) - see the sketch below
  • typically the axes might be latitude and
    longitude on the sky, and frequency of radiation
  • lots of information!

  (Figure: example data cube with axes Right ascension, Declination and Radio frequency)
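As an aside (not code from the project), the cube described above can be held as a 3-d voxel array indexed by right ascension, declination and frequency channel; a minimal Java sketch, with the axis order assumed purely for illustration:

  // Illustrative only: a data cube as a flattened 3-d voxel array.
  // The (ra, dec, chan) axis order is an assumption for this sketch.
  public class DataCube {
      final int nRa, nDec, nChan;
      final float[] voxels;                       // ra-major storage

      DataCube(int nRa, int nDec, int nChan) {
          this.nRa = nRa; this.nDec = nDec; this.nChan = nChan;
          this.voxels = new float[nRa * nDec * nChan];
      }

      // Intensity at one sky position and frequency channel.
      float value(int ra, int dec, int chan) {
          return voxels[(ra * nDec + dec) * nChan + chan];
      }

      // A spectrum: all channels along one line of sight.
      float[] spectrum(int ra, int dec) {
          float[] s = new float[nChan];
          for (int c = 0; c < nChan; c++) s[c] = value(ra, dec, c);
          return s;
      }
  }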
5
Volume rendering
  • 3d data can be viewed in slices, or we can render
    along lines of sight through the entire volume - this
    is volume rendering, and it may offer new insights into
    complex data collections (see the sketch below)
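One concrete, hedged example of rendering along lines of sight is a maximum-intensity projection along the frequency axis; the project's renderer handles arbitrary viewing directions, but the idea is the same:

  // Minimal sketch: maximum-intensity projection of a cube along the
  // frequency axis. The real renderer casts rays at arbitrary angles;
  // this fixed-axis version only illustrates the line-of-sight idea.
  public class Mip {
      // cube[ra][dec][chan] -> image[ra][dec]
      static float[][] project(float[][][] cube) {
          int nRa = cube.length, nDec = cube[0].length, nChan = cube[0][0].length;
          float[][] image = new float[nRa][nDec];
          for (int ra = 0; ra < nRa; ra++)
              for (int dec = 0; dec < nDec; dec++) {
                  float max = Float.NEGATIVE_INFINITY;
                  for (int c = 0; c < nChan; c++)
                      max = Math.max(max, cube[ra][dec][c]);
                  image[ra][dec] = max;   // brightest voxel on this line of sight
              }
          return image;
      }
  }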

6
Distributed data volume rendering
  • split large volume into smaller pieces
  • share the pieces out to nodes of a Beowulf
    cluster
  • on demand, each node renders its piece of the data
  • other nodes glue the pieces together to form the
    final image (see the compositing sketch below)
  • provides increased speed and the ability to handle
    larger-than-memory volumes
  • See Beeson, Barnes & Bourke, 2003, PASA, submitted
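For the maximum-intensity case sketched earlier, the glue step reduces to a per-pixel maximum over the sub-images returned by the nodes; a minimal sketch (general transfer functions need order-dependent compositing, which is not shown):

  // Sketch only: glue equally sized sub-images from the cluster nodes
  // into the final image. For maximum-intensity projection the merge
  // is a per-pixel max; alpha compositing would have to respect the
  // depth order of the sub-volumes.
  public class Composite {
      static float[][] merge(java.util.List<float[][]> subImages) {
          float[][] out = null;
          for (float[][] img : subImages) {
              if (out == null) {
                  out = new float[img.length][img[0].length];
                  for (float[] row : out)
                      java.util.Arrays.fill(row, Float.NEGATIVE_INFINITY);
              }
              for (int x = 0; x < img.length; x++)
                  for (int y = 0; y < img[0].length; y++)
                      out[x][y] = Math.max(out[x][y], img[x][y]);
          }
          return out;
      }
  }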

7
Distributed data volume rendering
  • Rendering is controlled by a remote client connected
    over a socket (sketched below)
  • Joint project with AstroGrid (UK) to recast the
    software as a grid service for demonstration in
    July at a major astronomy conference in Sydney.
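The slides do not give the wire protocol, so the fragment below only illustrates the pattern of a remote client driving the renderer over a socket; the host, port and RENDER/length framing are invented for this sketch:

  import java.io.*;
  import java.net.Socket;

  // Hypothetical client: command names and framing are invented for
  // illustration; only the "control the renderer over a socket"
  // pattern is taken from the slides.
  public class RenderClient {
      public static void main(String[] args) throws IOException {
          try (Socket s = new Socket("render-cluster.example.org", 9000)) {
              DataOutputStream out = new DataOutputStream(s.getOutputStream());
              DataInputStream in = new DataInputStream(s.getInputStream());

              out.writeUTF("RENDER angle=30 elevation=10");  // request one frame
              out.flush();

              int length = in.readInt();                     // image size in bytes
              byte[] image = new byte[length];
              in.readFully(image);
              System.out.println("received " + length + " byte image");
          }
      }
  }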

8
Making a grid service
  • Collaborating groups now include
  • Melbourne (Physics, Computer Science / SE)
  • AstroGrid (Cambridge, Leicester)
  • VPAC, APAC and CSIRO CMIS as data centres and
    rendering clusters
  • The lead is being set by Guy Rixon (Cambridge), who
    has designed the system and is managing the
    project plan day-to-day
  • Why?
  • Saves you from fetching large data files
  • Enables use of distributed computing resources
  • Demonstrator of grid technologies for VOs

9
Structure
  • Portal provides an interface for the user to find
    and select data and to select a rendering cluster
    (80% complete)
  • Data centre service provides a registry of its
    data holdings and some tools to e.g. extract
    sub-images (60%)
  • Data centre runs a gsiftp server to provide
    authenticated access to the data (100%)
  • Cluster centre service fetches the data, starts
    up a rendering tree, loads the data and opens up
    a port (90%)
  • Portal provides an applet to connect to that port
    and control and display the rendering (25%; the
    end-to-end sequence is sketched below)
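Pulled together, the five steps above amount to roughly the following sequence; the interfaces and method names are placeholders, not the real portal, registry or Globus service APIs:

  // Placeholder interfaces only: this sketches the order of the steps
  // listed above, not the actual portal, registry or Globus services.
  interface DataCentre    { String findDataset(String query); }       // registry + cutout tools
  interface ClusterCentre { int prepareRenderer(String gsiftpUrl); }  // fetch data, start tree, open port

  public class PortalWorkflow {
      static void run(DataCentre dc, ClusterCentre cc) {
          // 1-3. user finds and selects data through the portal; the data
          //      centre exposes it via its registry and gsiftp server
          String gsiftpUrl = dc.findDataset("HI survey cube");
          // 4. cluster service pulls the data, starts the rendering tree
          //    and returns the port the applet should use
          int port = cc.prepareRenderer(gsiftpUrl);
          // 5. portal serves the applet, which connects to that port
          System.out.println("applet should connect to port " + port);
      }
  }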

10
Development environment
  • Globus 2.4 for gsiftp servers
  • Tomcat 4.1.24 for portals and service wrappers
  • Globus 3.0 alpha 4 for grid services deployed
    within Tomcat
  • Sun J2SDK 1.4.1_03
  • Netscape 7.02 (Gecko/20030208)
  • All data and rendering centres are Linux
  • Tested clients include Linux, Windows and Mac OS X

11
Release 0 - June 6, 2003
  • One hard-coded compressed FITS image in place of
    final data selection result
  • One hard-coded rendering cluster in place of
    final cluster selection result
  • Rendering cluster retrieves the image from the data
    centre via HTTP, decompresses it, converts it to
    the volume rendering input format and stores it
    locally (sketched below)
  • Applet served from the portal server, running in the
    client's browser, successfully connects to the
    rendering cluster and requests an image.
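Assuming gzip compression for the hard-coded FITS image (the slide only says "compressed"), the Release 0 fetch-and-store step looks roughly like this; the URL and filenames are placeholders, and conversion to the rendering input format is omitted:

  import java.io.*;
  import java.net.URL;
  import java.util.zip.GZIPInputStream;

  // Sketch of the Release 0 fetch: pull a compressed FITS file over
  // HTTP, decompress it and store it locally. gzip and the URL/path
  // are assumptions made for this sketch.
  public class FetchImage {
      public static void main(String[] args) throws IOException {
          URL src = new URL("http://datacentre.example.org/cube.fits.gz"); // placeholder
          try (InputStream in = new GZIPInputStream(src.openStream());
               OutputStream out = new FileOutputStream("cube.fits")) {
              byte[] buf = new byte[64 * 1024];
              int n;
              while ((n = in.read(buf)) > 0) out.write(buf, 0, n);
          }
      }
  }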

12
The future
  • Jia's GridFTP client code to be incorporated next
    week - render cluster service complete!
  • Data registry and data centre grid service
    including selection to be ready in two weeks
  • Display and control applet to be largely
    completed over next four weeks.

13
Beyond the demo
  • Review demonstration in August
  • CSIRO ATNF group developing Java interface to
    legacy astronomy software
  • Suitable long-term location for this project?
  • Conversion of Beowulf-class rendering tree to
    genuine distributed grid service for the
    piecewise rendering?
  • Integration with massive on-line parameterised
    databases?