Transcript and Presenter's Notes

Title: A Framework for Reuse and Parallelization of Large-Scale Scientific Simulation Code


1
(No Transcript)
2
A Framework for Reuse and Parallelization of
Large-Scale Scientific Simulation Code
  • Manolo E. Sherrill, Roberto C. Mancini, Frederick
    C. Harris, Jr., and Sergiu M. Dascalu
  • University of Nevada, Reno

3
Introduction
  • Software design in the scientific community often
    fails due to the nature and lifespan of the code
    being developed.
  • Today we want to present a framework for reuse
    and parallelization of scientific simulation
    code.
  • It applies to both legacy code and new code.

4
Introduction
  • What causes the difficulty?
  • New code
  • Algorithms are implemented just to test whether
    they work, NOT for use in production.
  • But if the code works...
  • it gets used and becomes legacy code.

5
Introduction
  • Once this stage is reached, it becomes difficult
    to integrate these code pieces into larger
    simulations due to
  • name conflicts,
  • poor organization, and
  • a lack of consistent style.

6
Motivation
  • Laser Ablation experiments at the Nevada Terawatt
    Facility

7
Motivation
  • Laser ablation fits the description just
    provided
  • Pieces of code that have been written once,
  • then used over and over again,
  • and modified several times over that lifetime.
  • These modifications come as new data arrive from
    the experimentalists.

8
Laser Ablation
  • What is Laser Ablation?
  • It is the process of ablating material from a
    solid or liquid target
  • Typically with a low-intensity laser
  • 1×10⁷ W/cm² to 1×10¹⁰ W/cm²

9
Laser Ablation
  • The laser deposits the bulk of its energy in the
    skin depth region of the target
  • The material is heated, then undergoes melting,
    evaporation, and possible plasma formation

10
Laser Ablation
  • The material in the gaseous state forms a plume
    that moves away from the target
  • The expansion proceeds at speeds of a few tens of
    mm/ns

11
Laser Ablation
12
Laser Ablation
  • Laser ablation is commonly used
  • in experimental physics as an ion source
  • in industry to generate thin films
  • Recently, more exotic systems have entered this
    arena
  • Pico- and femtosecond lasers
  • The shorter pulses ablate a thinner layer,
    thereby producing a thinner film

13
Framework Development
  • We started with sequential simulation code for
    Laser Ablation
  • We were moving to multi-element, multi-spatial
    zone simulations
  • Initial execution timings were disappointing.
  • No, they were unacceptable.
  • Therefore, a parallel implementation was pursued.
  • But major issues stopped us (as described
    previously).

14
Framework Paradigm
  • Operating Systems
  • Monolithic
  • Micro-Kernel
  • Monolithic kernels face the same issues we have
    discussed for scientific code: it becomes hard to
    make fixes and additions without introducing more
    errors.

15
Framework Paradigm
  • Micro-kernels split functionality into separate
    processes (as opposed to layers)
  • Inter-process communication happens through
    message passing via consistent interfaces.
  • The micro-kernel acts as a centralized point of
    connection and communication.

16
Framework Paradigm
  • Micro-kernels have both drawbacks and benefits
  • They can be slower due to message passing
  • They can use RAM more efficiently because
    functionality runs in separate processes

17
Implementation Overview
  • Though the low-level message passing used in
    micro-kernels is not appropriate for scientific
    programming,
  • it did prompt us to look at message passing,
    which leads back to parallel programming.

18
Implementation Overview
  • We used PVM as the parallel library underlying
    our implementation framework
  • Our framework
  • acts as a set of utilities,
  • transfers data,
  • manages data structures for accessing processes
    on local and remote nodes,
  • and coordinates I/O (a sketch of such utilities
    appears below)
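
The slides do not show the utility code itself; the fragment below is only a minimal sketch of what such PVM-based helpers might look like (PVM 3 C interface; the names send_doubles and recv_doubles are illustrative, not taken from the original framework).

#include <pvm3.h>

/* Illustrative framework-style utility: pack an array of doubles and
   send it to one task.  Returns the PVM status code of the send. */
int send_doubles(int dest_tid, int msgtag, double *buf, int n)
{
    int rc;

    pvm_initsend(PvmDataDefault);      /* XDR encoding: portable across nodes */
    rc = pvm_pkdouble(buf, n, 1);      /* stride 1 = contiguous array */
    if (rc < 0)
        return rc;
    return pvm_send(dest_tid, msgtag);
}

/* Blocking receive of n doubles carrying a given tag from any task.
   The sender's task id is written to *src_tid. */
int recv_doubles(int msgtag, double *buf, int n, int *src_tid)
{
    int bufid, bytes, tag;

    bufid = pvm_recv(-1, msgtag);      /* -1 = accept any sender */
    if (bufid < 0)
        return bufid;
    pvm_bufinfo(bufid, &bytes, &tag, src_tid);
    return pvm_upkdouble(buf, n, 1);
}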

19
Framework Instantiation
  • It is important to note that except for the
    amount of data transferred between processes and
    for the topology of the implementation, the
    modules are independent of the physics code
    embedded in them.
  • This allows us to separate the computation from
    the communication and thereby isolate the legacy
    code, making code development easier.

20
Framework Instantiation
  • The legacy code is represented by the circles.
  • The triangles are the framework around them that
    handles the message passing.
  • Left-pointing triangles are children
  • Right-pointing triangles are parents, which spawn
    the child processes (see the sketch below)
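
The figure is not reproduced in this transcript, but the parent/child arrangement it describes could be sketched roughly as below. This is an illustration rather than the authors' code: the task name zone_worker, the routine legacy_zone_solve, and the block size of 64 values per zone are all assumed for the example.

#include <pvm3.h>

#define TAG_INPUT  1
#define TAG_RESULT 2
#define NZONES     4
#define NVALS      64

/* Assumed to exist elsewhere: the untouched legacy physics routine. */
extern void legacy_zone_solve(const double *in, int n, double *out);

/* Parent ("right-pointing triangle"): spawns one child wrapper per zone,
   sends each zone its input block, and collects the results. */
int run_parent(double zone_in[NZONES][NVALS], double zone_out[NZONES][NVALS])
{
    int tids[NZONES], z;

    if (pvm_spawn("zone_worker", NULL, PvmTaskDefault, "", NZONES, tids)
            < NZONES)
        return -1;                      /* not all children could start */

    for (z = 0; z < NZONES; z++) {      /* scatter zone inputs */
        pvm_initsend(PvmDataDefault);
        pvm_pkdouble(zone_in[z], NVALS, 1);
        pvm_send(tids[z], TAG_INPUT);
    }
    for (z = 0; z < NZONES; z++) {      /* gather zone results */
        pvm_recv(tids[z], TAG_RESULT);
        pvm_upkdouble(zone_out[z], NVALS, 1);
    }
    return 0;
}

/* Child wrapper ("left-pointing triangle"): all message passing lives
   here, so the legacy routine itself never touches PVM. */
int run_child(void)
{
    double in[NVALS], out[NVALS];
    int parent = pvm_parent();

    pvm_recv(parent, TAG_INPUT);
    pvm_upkdouble(in, NVALS, 1);

    legacy_zone_solve(in, NVALS, out);  /* isolated legacy computation */

    pvm_initsend(PvmDataDefault);
    pvm_pkdouble(out, NVALS, 1);
    pvm_send(parent, TAG_RESULT);
    return pvm_exit();
}

In this arrangement the legacy routine can still be compiled and tested on its own, since nothing PVM-specific leaks into it.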

21
Framework Instantiation
  • Once a self-consistent solution is found, data
    from the SEKM layer is forwarded to the Radiation
    Transport Module.
  • Here, the data from each zone are combined into a
    complete synthetic spectrum, which is then
    compared with the experimental data.

22
Framework Instantiation
  • The comparison of synthetic data to the
    experimental data is done in an automated manner
    with a search engine
  • Thousands of samples (temperature and density
    pairs) are stored in a Parallel Q, which spawns
    the Layer II members (a dispatch sketch follows
    below)
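
The slides do not detail how the Parallel Q hands out work; the following is a hypothetical master-side sketch in which the Layer II workers are assumed to be already spawned and to echo back each sample index with a goodness-of-fit value (e.g. a chi-square between the synthetic and experimental spectra).

#include <pvm3.h>

#define TAG_SAMPLE 10
#define TAG_FIT    11

/* Hypothetical dispatcher: streams (temperature, density) samples to a
   pool of Layer II workers, handing a new sample to whichever worker
   reports back first, until every sample has a fit value in fit[]. */
void dispatch_samples(const double *temp, const double *dens, int nsamples,
                      const int *worker_tids, int nworkers, double *fit)
{
    int handed = 0, received = 0;
    int w, src, bytes, tag, bufid, idx;
    double pair[2], chi2;

    /* Prime each worker with one sample. */
    for (w = 0; w < nworkers && handed < nsamples; w++, handed++) {
        pair[0] = temp[handed];
        pair[1] = dens[handed];
        pvm_initsend(PvmDataDefault);
        pvm_pkint(&handed, 1, 1);       /* sample index travels with the data */
        pvm_pkdouble(pair, 2, 1);
        pvm_send(worker_tids[w], TAG_SAMPLE);
    }

    /* Collect results; keep idle workers busy while samples remain. */
    while (received < nsamples) {
        bufid = pvm_recv(-1, TAG_FIT);
        pvm_bufinfo(bufid, &bytes, &tag, &src);
        pvm_upkint(&idx, 1, 1);
        pvm_upkdouble(&chi2, 1, 1);
        fit[idx] = chi2;
        received++;

        if (handed < nsamples) {
            pair[0] = temp[handed];
            pair[1] = dens[handed];
            pvm_initsend(PvmDataDefault);
            pvm_pkint(&handed, 1, 1);
            pvm_pkdouble(pair, 2, 1);
            pvm_send(src, TAG_SAMPLE);
            handed++;
        }
    }
}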

23
Framework Instantiation
  • 4-zone synthetic spectra compared with
    experimental data

24
Framework Instantiation
  • 6-zone synthetic spectra compared with
    experimental data

25
Conclusions
  • We have presented the motivation for a change in
    software architecture in the physics community.
  • This change has resulted in easier maintenance
    and increased performance of simulation codes.
  • It has allowed protection and encapsulation of
    existing legacy code, as well as parallelization
    of that same code.

26
Conclusions
  • This framework is also beneficial for a variety
    of reasons
  • New experimental data require changes to only a
    single module
  • Physics components can be tested and modified
    individually before adding them to a larger
    simulation (even simulations that were not
    originally planned)
  • It also allows us to change the topology and
    bring in parallel processing quite easily
  • It keeps legacy codes separate, thereby providing
    some protection

27
Future Work
  • Applying this to other physics codes,
  • thereby effectively reusing legacy codes,
    particularly on larger machines.
  • Integration of other groups' code models without
    having the source.
  • This is a BIG issue in the physics community.

28
(No Transcript)