Toward Relativistic Hydrodynamics on Adaptive Meshes

Transcript and Presenter's Notes

1
Toward Relativistic Hydrodynamics on Adaptive Meshes
  • Joel E. Tohline
  • Louisiana State University
  • http://www.phys.lsu.edu/tohline

2
Principal Collaborators
  • Simulations to be shown today:
    • Shangli Ou (LSU)
    • Mario D'Souza (LSU)
    • Michele Vallisneri (Caltech/JPL)
  • Code development over the years:
    • John Woodward (Valtech; Dallas, Texas)
    • John Cazes (Stennis Space Center; Stennis, Mississippi)
    • Patrick Motl (Colorado)
  • Science:
    • Juhan Frank (LSU)
    • Lee Lindblom (Caltech)
    • Luis Lehner (LSU)
    • Jorge Pullin (LSU)

3
Show 3 Movies
  • Nonlinear development of the r-mode in young neutron stars (w/ Lindblom & Vallisneri): http://www.cacr.caltech.edu/projects/hydrligo/rmode.html
  • Nonlinear development of the secular bar-mode instability in rapidly rotating neutron stars (w/ Ou & Lindblom): http://paris.phys.lsu.edu/ou/movie/fmode/new/fmode.b181.om4.2e5.mov
  • Mass-transferring binary star systems (w/ D'Souza, Motl & Frank): http://paris.phys.lsu.edu/mario/models/q_0.409_no_drag_3.8orbs/movies/q_0.409_no_drag_3.8orbs_top.mov

4
Storyline
  • Present algorithm: has been producing publishable astrophysical results for over 20 years
    • Entirely home-grown code, outside of the Cactus environment
    • Manual domain decomposition
    • Explicit message passing using MPI
    • Visualizations on serial machines (generally, post-processing)
  • Plans for this calendar year:
    • Move the present algorithm into the Cactus environment
  • Over the next few years, modify the algorithm to:
    • Follow relativistic hydrodynamical flows on adaptive meshes
    • Accept an evolving space-time metric
    • Visualize results in parallel with the dynamical evolution

5-6
Storyline (repeated from slide 4)

7
Present Algorithm
  • Select grid structure and resolution
  • Construct initial configuration
  • Perform domain decomposition
  • While t < tstop
    • Determine Newtonian gravitational accelerations
    • Push fluid around on the grid using Newtonian dynamics
    • If mod(t, orbital period/80) == 0
      • Dump 3-D dataset for later visualization
    • EndIf
  • EndWhile
  • Visualize results
  (A minimal code sketch of this loop follows the list.)

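The driver loop above is straightforward to express in code. Below is a minimal Python sketch, under the assumption that the gravity solve, hydro update, and dump routines are supplied as callables; the names used here (solve_gravity, advance_hydro, dump_frame) are illustrative placeholders, not routines from the actual LSU code.

    # Minimal sketch of the evolution loop described above.  The callables
    # (solve_gravity, advance_hydro, dump_frame) are illustrative placeholders.
    def evolve(state, dt, t_stop, orbital_period,
               solve_gravity, advance_hydro, dump_frame):
        t = 0.0
        dump_interval = orbital_period / 80.0   # "orbital period / 80" on the slide
        next_dump = 0.0
        while t < t_stop:
            accel = solve_gravity(state)             # Newtonian gravitational accelerations
            state = advance_hydro(state, accel, dt)  # push fluid around on the grid
            t += dt
            if t >= next_dump:                       # stands in for mod(t, dump_interval) == 0
                dump_frame(state, t)                 # dump 3-D dataset for later visualization
                next_dump += dump_interval
        return state

Tracking the next dump time, rather than testing a floating-point modulus, is simply a convenience of the sketch.
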
8-9
Principal Governing Equations
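The equation images from these slides are not preserved in the transcript. For reference, the standard equations governing a self-gravitating Newtonian fluid, which are the ones the rest of the talk describes solving, are (this is an assumption about the slides' exact content):

    \frac{\partial \rho}{\partial t} + \nabla \cdot (\rho \mathbf{v}) = 0
        \quad \text{(mass continuity)}

    \frac{\partial \mathbf{v}}{\partial t} + (\mathbf{v} \cdot \nabla)\,\mathbf{v}
        = -\frac{1}{\rho}\,\nabla P - \nabla \Phi
        \quad \text{(Euler equation)}

    \nabla^2 \Phi = 4 \pi G \rho
        \quad \text{(Poisson equation)}

together with an equation of state (or an energy equation) to close the system.
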
10-13
Present Algorithm (repeated from slide 7, stepping through the list to indicate which steps run serially and which run in parallel)
14
Parallel Codes: Chronological Evolution
  • Early '90s: John Woodward, 8,192-processor MasPar @ LSU
  • Mid-'90s: John Cazes, CM5 @ NCSA and T3D/E @ SDSC
  • Late '90s: Patrick Motl, MPI on T3E @ SDSC and SP2/3 @ SDSC & LSU
  • 2000: Michele Vallisneri, HP Exemplar @ CACR
  • 2002-03: Mario D'Souza & Shangli Ou, SuperMike (1,024-proc Linux cluster) @ LSU
  • 2004: Shangli Ou, Tungsten (2,560-proc Linux cluster) @ NCSA

15
Select Grid Structure and Resolution
  • Unigrid, cylindrical mesh
  • Fixed in time
  • Typical resolution:
    • Single star: 66 x 128 x 130 (as shown on the left)
    • Binary system: 192 x 256 x 98

16
Select Grid Structure and Resolution
17
Need for Non-unigrid and Adaptive Meshes
18
Perform Domain Decomposition
  • Grid resolution: 192 x 256 x 96
  • 64 processors
  • Distribute the 192 x 96 (R, Z) grid across an 8 x 8 processor array
  • Leave angular zones stacked in memory
  • Result: each processor has data arrays of size 24 x 256 x 12
  • I/O: scramble and unscramble handled manually
  (Figure: decomposition of the (R, Z) plane; a code sketch follows.)
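A minimal sketch of the bookkeeping behind this decomposition. It is illustrative only; index conventions, rank ordering, and guard cells in the real code will differ.

    # Decompose the (R, Z) plane of a 192 x 256 x 96 cylindrical grid across an
    # 8 x 8 processor array, leaving the 256 angular zones stacked in memory.
    NR, NPHI, NZ = 192, 256, 96      # radial, azimuthal, vertical zones
    PR, PZ = 8, 8                    # processor array in the (R, Z) plane

    local_nr = NR // PR              # 24 radial zones per processor
    local_nz = NZ // PZ              # 12 vertical zones per processor

    def local_block(rank):
        """(R, Z) index ranges owned by a given rank (row-major numbering assumed)."""
        pr, pz = divmod(rank, PZ)
        r0, z0 = pr * local_nr, pz * local_nz
        return (r0, r0 + local_nr), (z0, z0 + local_nz)

    # Each of the 64 processors therefore holds a 24 x 256 x 12 local array
    # (plus guard cells in the real code), matching the slide.
    print(local_block(0))            # ((0, 24), (0, 12))
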
19
Determine Newtonian Gravitational Accelerations (three-dimensional elliptic PDE on a cylindrical mesh)
20
Principal Governing Equations
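The equation image on this slide is not in the transcript; presumably it highlights the Poisson equation, the elliptic PDE named on the surrounding slides. In the code's cylindrical coordinates (R, phi, Z) it reads

    \nabla^2 \Phi
        = \frac{1}{R}\,\frac{\partial}{\partial R}\!\left( R\,\frac{\partial \Phi}{\partial R} \right)
        + \frac{1}{R^2}\,\frac{\partial^2 \Phi}{\partial \phi^2}
        + \frac{\partial^2 \Phi}{\partial Z^2}
        = 4 \pi G \rho .

Expanding \Phi and \rho in azimuthal Fourier modes, \Phi(R,\phi,Z) = \sum_m \Phi_m(R,Z)\, e^{i m \phi}, gives one two-dimensional Helmholtz-type equation per mode,

    \frac{1}{R}\,\frac{\partial}{\partial R}\!\left( R\,\frac{\partial \Phi_m}{\partial R} \right)
        + \frac{\partial^2 \Phi_m}{\partial Z^2}
        - \frac{m^2}{R^2}\,\Phi_m
        = 4 \pi G\, \rho_m ,

which is the decoupled set of 2-D problems that the azimuthal FFT on the next slide produces.
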
21
Determine Newtonian Gravitational Accelerations (three-dimensional elliptic PDE on a cylindrical mesh)
  • Perform FFT (in memory) in the azimuthal coordinate direction → reduce to a decoupled set of (256) two-dimensional Helmholtz equations.
  • Use ADI (alternating-direction implicit) iteration to solve each 2-D equation:
    • Data transpose
    • 1-D, in-memory ADI sweep
    • Data transpose
    • 1-D, in-memory ADI sweep
    • Data transpose
    • Etc.
  • Inverse FFT
  (Figure: data resident in the (Z, R) plane.)
22-23
Determine Newtonian Gravitational Accelerations (continued): the same list as slide 21, with figures showing the data resident in the (Z, m) and then the (R, m) plane after successive transposes. (A minimal code sketch of this solve follows.)
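Below is a serial numpy sketch of this reduction, assuming the mode-by-mode Helmholtz equations written out after slide 20. Jacobi relaxation with zero-Dirichlet boundaries stands in here for the ADI sweeps and data transposes of the real distributed code, and the axis and outer-boundary treatment is deliberately simplified.

    import numpy as np

    # Sketch of the solve on slides 21-23: an FFT in the azimuthal direction
    # reduces the 3-D Poisson problem to decoupled 2-D Helmholtz equations,
    # one per azimuthal mode m.  A few Jacobi sweeps stand in for the ADI
    # sweeps (and data transposes) of the real code; boundaries are zero Dirichlet.
    def solve_poisson_cylindrical(rho, dR, dZ, G=1.0, n_sweeps=200):
        NR, NPHI, NZ = rho.shape
        R = (np.arange(NR) + 0.5) * dR          # cell-centred radii (axis excluded)
        Rp, Rm = R + 0.5 * dR, R - 0.5 * dR     # radii at outer / inner cell faces

        rho_m = np.fft.rfft(rho, axis=1)        # rfft keeps ~NPHI/2 complex modes
        phi_m = np.zeros_like(rho_m)            # (the slide's "256" counts a full FFT)

        for m in range(rho_m.shape[1]):         # one decoupled 2-D problem per mode
            S = 4.0 * np.pi * G * rho_m[:, m, :]
            phi = np.zeros_like(S)
            diag = ((Rp + Rm) / (R * dR**2) + m**2 / R**2)[:, None] + 2.0 / dZ**2
            for _ in range(n_sweeps):           # Jacobi stand-in for the ADI iteration
                lap_R = (Rp[1:-1, None] * phi[2:, 1:-1] +
                         Rm[1:-1, None] * phi[:-2, 1:-1]) / (R[1:-1, None] * dR**2)
                lap_Z = (phi[1:-1, 2:] + phi[1:-1, :-2]) / dZ**2
                phi[1:-1, 1:-1] = (lap_R + lap_Z - S[1:-1, 1:-1]) / diag[1:-1, :]
            phi_m[:, m, :] = phi
        return np.fft.irfft(phi_m, n=NPHI, axis=1)   # potential Phi(R, phi, Z)

In the actual algorithm each ADI half-step solves tridiagonal systems along one coordinate direction, and the data transposes redistribute the arrays so that every 1-D sweep operates on data held in processor memory.
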
24
Visualize Results
  • Specify isodensity surface(s)
  • Find vertices and polygons on each surface (using the marching-cubes algorithm)
  • Write out vertices & polygons in OBJ format
  • Delete 3-D dataset
  • Utilize Maya to render nested surfaces (from a pre-specified viewer orientation)
  • Write out TIFF image (typically 640 x 480)
  • Generate .mov
  (An illustrative code sketch of the surface-extraction step follows.)

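An illustrative stand-in for the first three steps, assuming scikit-image is available. The original tools were home-grown, so this only shows the shape of the operation: isosurface extraction with marching cubes followed by a Wavefront OBJ dump.

    import numpy as np
    from skimage import measure

    def write_isosurface_obj(density, level, filename):
        # verts: (N, 3) vertex coordinates; faces: (M, 3) triangle vertex indices
        verts, faces, _normals, _values = measure.marching_cubes(density, level=level)
        with open(filename, "w") as f:
            for x, y, z in verts:
                f.write(f"v {x} {y} {z}\n")
            for a, b, c in faces:             # OBJ face indices are 1-based
                f.write(f"f {a + 1} {b + 1} {c + 1}\n")

    # Example: an isodensity surface of a Gaussian blob at half its peak value.
    x, y, z = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
    density = np.exp(-(x**2 + y**2 + z**2) / 0.1)
    write_isosurface_obj(density, level=0.5, filename="isodensity.obj")
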
25
Future Algorithm
  • Select grid structure, resolution, and preferred AMR thorn
  • We: construct initial configuration
  • Let Cactus: perform domain decomposition
  • While t < tstop
    • Call GR group's thorn: determine structure of space-time metric
    • We (or the Whisky thorn): push fluid around on the grid using relativistic dynamics
    • If mod(t, orbital period/80) == 0
      • Generate vertices and polygons in parallel
      • Spawn Maya rendering task on additional processor(s)
    • EndIf
    • Call AMR thorn: modify mesh, as necessary
  • EndWhile
  • Use Cactus thorn: write article and publish results

26
Future Algorithm (repeated from slide 25)

27
THE END