Title: Cactus/TIKSL/KDI/Portal Synch Day
1. Cactus/TIKSL/KDI/Portal Synch Day
2. Agenda
- Main Goals
- Overview of Cactus, TIKSL, KDI, and Portal efforts
  - Present plans for each project
  - Make sure everyone knows what is going on
  - Coordinate efforts to the extent possible
- Present/develop detailed plan for Portal
  - Where we are now
  - Who will do what in the future
  - Develop coherence with NCSA, other portal efforts
  - Get a clear plan for future development, feedback to Globus, NCSA
  - Get testbed set up
  - Get user input ASAP!
- Break up into working groups, summarize plans
- Overall top question to answer: How do we integrate and develop a limited production testbed ASAP? Want something in place by end of August!
3. Agenda, cont.
- Questions to be asking and answering in your presentations
  - What is the overview and status of your project (save gory details for working groups)?
  - How do we best integrate the efforts?
  - When will certain features be done?
  - What support/features do you want from other groups (NCSA, Portal, Globus, TIKSL, HDF, Cactus, etc.)?
- What to do about lunch?
- Look at rest of agenda...
4. Big Picture: Project Space
[Diagram: project space showing Users, Portal, Developers, AEI/NCSA/WashU, Cactus, TIKSL (AEI-ZIB-Garching), KDI, Globus, EU Network, GrADS, NCSA, Grid Forum, Zeus, Egrid, Numerical Relativity, and the General User Community.]
5. What is Cactus?
- It's not just for breakfast anymore...
- It is not a relativity application (but it can do that)
- It is not an astrophysics application (but it can do that)
- It is not a fluid dynamics application (but it can do that)
- It is a metacode framework for parallel applications (see the sketch below), with
  - Pluggable data distribution layers (generic MPI, others)
  - Pluggable parallel I/O
  - Pluggable performance monitoring tools (PAPI, Autopilot, etc.)
  - Pluggable engineering/scientific applications, linear solvers, etc.
  - Pluggable cool stuff (remote steering, monitoring tools)
  - Etc...
- Cactus + Globus: apps plugged into Cactus can become Grid-enabled
- A portal under development: the main topic here...
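As a rough illustration of the pluggable design above: a toy component registry in Python. This is not the actual Cactus API (real Cactus thorns are C/C++/Fortran modules declared through CCL configuration files); every class and function name below is invented for the sketch.

    # A toy sketch of the "pluggable" idea, purely for illustration.
    # NOT the Cactus API; all names here are invented.

    class Flesh:
        """Minimal core that knows no physics; components plug into it."""

        def __init__(self):
            self.drivers = {}     # data distribution layers (e.g. an MPI driver)
            self.io_layers = {}   # parallel I/O back ends
            self.monitors = []    # performance monitoring hooks
            self.apps = []        # science/engineering evolution steps

        def register_driver(self, name, fn):
            self.drivers[name] = fn

        def register_io(self, name, fn):
            self.io_layers[name] = fn

        def register_monitor(self, fn):
            self.monitors.append(fn)

        def register_application(self, fn):
            self.apps.append(fn)

        def evolve(self, state, steps, driver, io):
            state = self.drivers[driver](state)    # distribute the data
            for it in range(steps):
                for step in self.apps:             # run each plugged-in app
                    state = step(state)
                for watch in self.monitors:        # performance hooks
                    watch(it, state)
                self.io_layers[io](it, state)      # output through chosen I/O layer
            return state

    # Example: plug in trivial components and run two iterations.
    f = Flesh()
    f.register_driver("serial", lambda s: s)
    f.register_io("print", lambda it, s: print("iter", it, s))
    f.register_application(lambda s: s + 1)
    f.evolve(0, 2, driver="serial", io="print")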
6. A Portal to Computational Science: The Cactus Collaboratory
1. User has science idea...
2. Composes/builds code components w/ interface...
3. Selects appropriate resources...
4. Steers simulation, monitors performance...
5. Collaborators log in to monitor...
Want to integrate and migrate this technology to the generic user...
7. Portal Components
8. Cactus Portal
- Has generic and Cactus-specific parts built on generic interfaces, which should be enhanced for additional app info
- Cactus specific
  - Code composition (Cactus can be what you want it to be)
  - Configuration Analysis (What the hell is in this directory?)
  - Parameter Setting
  - Interfaces must be self-configuring
- Generic (+ Cactus-specific bonus features)
  - Manual Resource Selection
    - 1. Which machine? User selects based on available resources
      - How will the user know loads, wait times, resources? Need some standard interface to provide this info...
    - 2. Which machines? User wants 20 GF, 20 GB memory. Could get 64 procs at NCSA and 64 at SDSC...
    - Added Cactus bonus: what resources are compatible or recommended with my special configuration
  - Automatic Resource Selection (see the sketch after this slide)
    - Just direct the job to appropriate resources given the request.
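A minimal sketch of the resource-selection logic above. The site table, units (GFlop/s, GB), and numbers are placeholders; a real portal would obtain them from an information service (e.g. Globus MDS).

    SITES = [
        {"name": "NCSA Origin", "free_procs": 64, "gflops_per_proc": 0.4, "gb_per_proc": 0.25},
        {"name": "SDSC machine", "free_procs": 64, "gflops_per_proc": 0.4, "gb_per_proc": 0.25},
    ]

    def select_resources(need_gflops, need_gb):
        """Greedy automatic selection: take processors site by site until the
        aggregate compute and memory request is met, possibly spanning sites."""
        chosen, gflops, gb = [], 0.0, 0.0
        for site in SITES:
            if gflops >= need_gflops and gb >= need_gb:
                break
            chosen.append((site["name"], site["free_procs"]))
            gflops += site["free_procs"] * site["gflops_per_proc"]
            gb += site["free_procs"] * site["gb_per_proc"]
        if gflops < need_gflops or gb < need_gb:
            raise RuntimeError("request cannot be satisfied with current resources")
        return chosen

    # The 20 GF / 20 GB request from the slide: with these placeholder numbers it
    # spans both sites, i.e. 64 procs at NCSA plus 64 at SDSC.
    print(select_resources(20.0, 20.0))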
9. Cactus Portal, cont...
- Job Launching
  - Once resources are selected, start the job; handle batch, job submission, compilation if required
  - Take care of file storage, archiving
- Job Monitoring
  - Generic: monitoring of queues through a common interface, notification of completion of the job
    - What is that interface?
    - What about distributed simulations across sites???
  - Cactus specific
    - Web server interface (thorn http; see the sketch after this slide)
      - All active routines in the running simulation displayed
      - All parameters for those routines displayed; steerable parameters can be changed
      - Crude visualization of the running simulation through a browser interface
    - Sophisticated Remote Visualization
      - Retrieval of arbitrary data through streaming HDF5 for local visualization
      - 1D, 2D, 3D, downsampled, depending on bandwidth available
      - Inline visualization (e.g., isosurfaces, streamlines) sent over the network
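To make the thorn http idea concrete, here is a rough client-side sketch of polling the running simulation's web server and changing a steerable parameter. The host, port, URL paths, and form encoding are assumptions made for illustration; the pages the thorn actually serves are not specified here.

    import urllib.request
    import urllib.parse

    BASE = "http://simulation-host:5555"   # hypothetical address of the running job

    def list_parameters():
        """Fetch the page listing active routines and their parameters."""
        with urllib.request.urlopen(BASE + "/Parameters/") as resp:
            return resp.read().decode()

    def steer(parameter, value):
        """Change one steerable parameter via a form submission."""
        data = urllib.parse.urlencode({parameter: value}).encode()
        with urllib.request.urlopen(BASE + "/Parameters/", data=data) as resp:
            return resp.status == 200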
10. Cactus Portal, continued
- Performance monitoring: want a generic interface to warn the user when performance is poor (it usually is, and the user does not even know!!!)
  - PAPI (single proc, color coded per routine)
  - Autopilot
  - What else should be provided? What is envisioned for the generic portal?
- Steering
  - Science
    - User changes parameters based on what is observed
    - Parameters screwed up: abort or keep going?
    - Forgot to turn on output of favorite variable
    - Forgot to turn on some routine
    - Too much output, disc filling up
    - Scientific results lead to a change in algorithm or resource request
      - AMR
      - Feature indicates some change is beneficial
    - Logging of all changes (see the sketch after this slide)
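A small sketch of the "logging of all changes" point: every steering action is recorded with a timestamp, old/new value, and a reason, so collaborators can reconstruct what was done to the run. Names and fields are illustrative.

    import time

    steering_log = []

    def steer_parameter(params, name, value, reason=""):
        old = params.get(name)
        params[name] = value
        steering_log.append({
            "time": time.strftime("%Y-%m-%d %H:%M:%S"),
            "parameter": name,
            "old": old,
            "new": value,
            "reason": reason,
        })

    # Example: turning on output of a variable that was forgotten at launch time.
    run_params = {"out_every": 0}
    steer_parameter(run_params, "out_every", 10, reason="enable output of favorite variable")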
11. Cactus Portal, continued
- Performance steering
  - How is my job doing?
  - Network bandwidth OK?
  - Suggest another architecture
  - Suggest algorithm changes due to the current state of performance (see the sketch below)
    - E.g., extra ghost zones
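One way to picture rule-based performance steering: compare where the wall time is going and emit suggestions, e.g. widen the ghost zones when communication dominates so fewer, larger messages are exchanged per iteration. The thresholds and metric names are illustrative assumptions, not part of any existing tool.

    def suggest_changes(metrics):
        """metrics: dict of fractions of wall time spent in each phase."""
        suggestions = []
        if metrics.get("communication", 0.0) > 0.3:
            suggestions.append("increase ghost-zone width to amortize message latency")
        if metrics.get("io", 0.0) > 0.2:
            suggestions.append("reduce output frequency or downsample remote data")
        if metrics.get("compute", 1.0) < 0.5:
            suggestions.append("consider a different machine or processor count")
        return suggestions

    print(suggest_changes({"compute": 0.40, "communication": 0.35, "io": 0.25}))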
- Questions for discussion
  - What is really Cactus specific?
  - What is generic: what will the standard portal, VMR, etc., provide?
  - How do we get maximum overlap between these efforts?
  - How do we get an active testbed established ASAP, and get appropriate users, portal developers, Globus developers, and GrADS developers effectively working together?
  - How long before this can be brought into production?
  - Should at least have an SC2000 demo!
12. Grand Picture
[Diagram: Grid-enabled Cactus runs on distributed machines (T3E Garching, Origin NCSA) coupled via Globus; simulations launched from the Cactus Portal; remote viz and steering from Berlin, remote viz in St Louis, remote steering and monitoring from an airport, viz of data from previous simulations in an SF café; data delivered via http and streaming HDF5 with DataGrid/DPSS, downsampling, and isosurfaces.]
13. Further details...
- Cactus
  - http://www.cactuscode.org
  - http://www.computer.org/computer/articles/einstein_1299_1.htm
  - Movies, research overview (needs major updating)
    - http://jean-luc.ncsa.uiuc.edu
- Simulation Collaboratory/Portal Work
  - http://wugrav.wustl.edu/ASC/mainFrame.html
- Remote Steering, high speed networking
  - http://www.zib.de/Visual/projects/TIKSL/
  - http://jean-luc.ncsa.uiuc.edu/Projects/Gigabit/
- EU Astrophysics Network
  - http://www.aei-potsdam.mpg.de/research/astro/eu_network/index.html