Simulating Cosmological Radiative Transfer (or How to Light Up The Universe)
Milan Raicevic, Institute for Computational Cosmology, Durham University
Introduction
Behind the term 'radiative transfer' lies a theory that mathematically describes the propagation of light through matter. The development of radiative transfer is historically linked to the emergence of astrophysics in the first quarter of the 20th century, mainly the investigation of stellar evolution and structure. Indeed, it would be hard to imagine stars without starlight (that was before neutron stars and black holes).
Modern theoretical astrophysics regularly uses computer simulations as a 'laboratory'. These simulations tend to need as much computational power as they can get, so the use of parallel supercomputers is standard. Cosmology is especially demanding: the number of particles in the biggest structure formation simulations is already reaching billions. Because of these high costs and some other, more specific problems, the role of light in simulations of the formation of today's universe was largely ignored, but recent observational advances have created an immediate need for more complete simulations. That is why a number of new methods for numerical radiative transfer have been developed in recent years. Our work consists of developing one such method and applying it to the current burning issues in modern cosmology.
A bit of theory
The theory of radiative transfer is not a simple matter at all but, fortunately for cosmologists, the desired results can be obtained with a number of simplifying approximations. In our model we consider a point source of light which is strong enough for us to ignore the diffuse radiation. We follow the propagation of photons by casting a number of rays and calculating how the intensity of light changes along these rays:

I(r) = I(0) exp(−τ(r)),

where I(r) is the intensity at distance r, I(0) the intensity at the source and τ is the optical depth,

τ(r) = σ0 N(r),

where σ0 is the cross-section and N(r) is the column density (the density integrated 'along' the ray).
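As a minimal sketch of these two relations (the cross-section, density and cell size below are made-up example numbers, not values from our simulations), the attenuation of a ray crossing a row of cells can be computed like this:

import numpy as np

# Assumed example values: a ray crossing 100 cells of equal length.
sigma0 = 6.3e-18           # photoionization cross-section [cm^2] (assumed)
n_HI = np.full(100, 1e-5)  # neutral hydrogen density in each cell [cm^-3] (assumed)
dl = 3.0e21                # path length through each cell [cm] (assumed)

# Column density accumulated from the source out to each cell: N(r) = sum of n_HI * dl.
N = np.cumsum(n_HI * dl)

# Optical depth tau(r) = sigma0 * N(r), and attenuation I(r) = I(0) * exp(-tau).
tau = sigma0 * N
I0 = 1.0                   # intensity at the source (arbitrary units)
I = I0 * np.exp(-tau)

print(I[:5])               # intensity in the first few cells along the ray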
Hybrid Characteristics
Hybrid characteristics (Rijkhorst et al. 2006) is a rather new method for transporting radiation, made for use in parallelized adaptive mesh refinement (AMR) codes. In essence, it is a ray-tracing algorithm that combines two ray-tracing schemes widely used in astrophysics: long and short characteristics. The most natural approach to ray-tracing is to draw a ray from the source of light to the cell being considered (picture 1); that method is called long characteristics. Unfortunately, as can be seen from the picture, long characteristics are quite computationally intensive, especially in the cells near the source, where many overlapping rays repeat a lot of the same calculation. To remedy this problem, the method of short characteristics was introduced. As seen in picture 2, instead of calculating contributions for every ray crossing a cell, we only calculate the values at the corners of the cells and interpolate from them the values for every individual ray. This method is much more computationally efficient, but it introduces new problems: first, a certain amount of numerical diffusion, and second, since the ray is constructed through interpolation cell by cell going away from the source, it is very hard if not impossible to parallelize. Hybrid characteristics borrows from both of these approaches and combines them to make ray-tracing efficient on a parallel system. In short, within the part of the problem box belonging to a single processor, the local part of the ray is calculated with long characteristics, while the communication between different computational domains borrows the interpolation idea: only the values of the column density at the corners of the cells on the edge of every domain are communicated between processors. The final ray is constructed by summing up the contributions of every processor 'cut' by the ray.
Picture 1. The long characteristics method
Picture 2. The short characteristics method
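To make the long-characteristics idea concrete, here is a toy sketch that traces a single ray from the source to a target cell on a uniform 2D grid and accumulates the column density along it. The grid, the function name and the equidistant sampling (instead of exact cell crossings) are simplifications chosen for the example, not the scheme used in our code.

import numpy as np

def long_characteristic(density, src, target, n_samples=200):
    """Accumulate column density along a straight ray from src to target.

    density     : 2D array of cell densities (uniform grid, cell size = 1)
    src, target : (x, y) positions in grid units
    Uses simple equidistant sampling rather than exact cell intersections.
    """
    src, target = np.asarray(src, float), np.asarray(target, float)
    length = np.linalg.norm(target - src)
    ds = length / n_samples                       # path length per sample
    column = 0.0
    for s in np.linspace(0.0, 1.0, n_samples, endpoint=False):
        x, y = src + s * (target - src)
        i, j = int(x), int(y)                     # cell containing this sample point
        column += density[i, j] * ds
    return column

# Toy setup: a 32x32 grid with a dense clump, source near one corner.
rho = np.ones((32, 32))
rho[15:20, 15:20] = 100.0                         # over-dense clump
print(long_characteristic(rho, src=(0.5, 0.5), target=(30.5, 30.5)))

In the long-characteristics scheme this tracing is repeated for every cell, which is why cells near the source are crossed by many rays; short characteristics avoid that by interpolating corner values instead.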
Extending the method
The hybrid characteristics model is, of course, not perfect. One of its most serious problems is the lack of so-called photon conservation, a large issue in the whole radiative transfer field. In a nutshell, it is an error arising from numerical discretization, both of space and of time. The effect is that the results for the positions of ionization fronts, the boundaries of the regions of space that have been ionized up to a certain time, are not robust with respect to changes in grid resolution or time step. This is not a big issue in every problem, but in cosmology we need the speeds of the fronts to be as accurate as possible, and that is why we are working to correct the HC method. The issue is not trivial and requires more room to explain than we have here; to demonstrate the effects of photon conservation, we have included picture 5.
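To give a flavour of what photon conservation means in practice, the sketch below shows the generic photon-conserving bookkeeping used in several radiative transfer codes: each cell along a ray absorbs the photons that actually disappear across it, so the total absorbed never exceeds what the source emits, however finely the ray is divided. This is an illustration of the general idea, not the specific correction we are implementing in HC, and the source rate and optical depths are arbitrary example numbers.

import numpy as np

def photons_absorbed_per_cell(Ndot_gamma, dtau):
    """Photon-conserving split of a source's photons over cells along a ray.

    Ndot_gamma : photons emitted per second by the source
    dtau       : optical depth of each successive cell along the ray

    Each cell absorbs Ndot_gamma * exp(-tau_in) * (1 - exp(-dtau)), where
    tau_in is the optical depth accumulated before the cell, so the sum over
    all cells is Ndot_gamma * (1 - exp(-tau_total)) regardless of resolution.
    """
    tau_in = np.concatenate(([0.0], np.cumsum(dtau)[:-1]))
    return Ndot_gamma * np.exp(-tau_in) * (1.0 - np.exp(-dtau))

# The same total optical depth split over 10 coarse or 1000 fine cells:
coarse = photons_absorbed_per_cell(1e50, np.full(10, 0.5))
fine = photons_absorbed_per_cell(1e50, np.full(1000, 0.005))
print(coarse.sum(), fine.sum())   # both close to 1e50 * (1 - exp(-5))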
Picture 3. Interpolation scheme in Hybrid
characteristics
Hydrodynamical Adaptive Mesh Refinement method
In an Eulerian hydrodynamical scheme, the fluid is described by a grid of cells, where every cell represents a part of the simulated volume with a single value of variables such as density, pressure, temperature and so on. To represent the physical world exactly, the grid would have to be infinitely fine, which is obviously impossible in practice. We are limited by the available computing power, so the grid cells always have a finite size. Yet certain problems need very fine resolution in order to resolve the interesting structures and phenomena. That is where AMR steps in. The main idea of this method is to use a different cell size in different parts of the grid, increasing the resolution around interesting phenomena and decreasing it elsewhere. An example of a partially refined grid divided between two processors can be seen in picture 4, and AMR in action can be seen in picture 8 at the bottom right of the poster. In our work we use a publicly available code, FLASH, developed at the Alliances Center for Astrophysical Thermonuclear Flashes at the University of Chicago. FLASH is a fully parallel AMR code well suited to many different hydrodynamical problems. It is written in Fortran 90 and makes full use of a modular structure, which makes additions to the code quite simple and straightforward.
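As a rough, self-contained sketch of the AMR idea (not FLASH's actual block-structured refinement machinery; the criterion, function name and threshold below are invented for the example), one can flag for refinement exactly those cells where the density changes sharply between neighbours:

import numpy as np

def cells_to_refine(density, threshold=0.5):
    """Flag cells whose relative density jump to any neighbour exceeds a threshold.

    A toy stand-in for an AMR refinement criterion: flagged cells would be
    split into finer cells, while unflagged ones keep the coarse resolution.
    """
    flags = np.zeros_like(density, dtype=bool)
    for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        neighbour = np.roll(density, shift, axis=(0, 1))
        jump = np.abs(neighbour - density) / np.maximum(density, neighbour)
        flags |= jump > threshold
    return flags

# Toy density field with a sharp clump: only cells around the clump get refined.
rho = np.ones((16, 16))
rho[6:10, 6:10] = 50.0
print(cells_to_refine(rho).sum(), "of", rho.size, "cells flagged for refinement")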
Picture 5. Effects of photon conservation on the position of the ionization front
Where to go from here
As mentioned at the beginning, radiative transfer is essential in a large number of astrophysical problems. What we are interested in are its applications in cosmology. In the general picture of structure formation, radiation comes in rather late, obviously not before the formation of the first sources of light. Still, once those sources are formed, radiative transfer becomes indispensable for understanding the further evolution of the universe. Thanks to quasar absorption line data, it is a well known fact that the universe was mostly neutral up to a certain point and was then rapidly ionized, reaching the state of very high ionization in which we observe it today. That period is called reionization, and it is closely linked to the emergence of the first light sources in the universe. Yet, apart from a rough estimate of when reionization happened, not much more is known. There are still a lot of open questions, such as: what exactly were the sources of reionization, quasars or the first stars formed in early galaxies? What effects did it have on the formation of structures in the universe, a process well described by gravity alone up to the point of reionization? The interest in a computational approach to these problems has been ignited only in the last 5 to 10 years, first because of the rise in the computing power needed to do radiative transfer along with gravity and hydrodynamics, and second because a number of new observational experiments, such as LOFAR, which will measure the 21cm neutral hydrogen line out to very high redshifts, promise to give us unprecedented insight into this period in the history of our universe.
Picture 4. An AMR grid distributed between two processors
The next step
As a first project we have chosen to look at the evaporation of mini haloes during reionization. In theory, the first light can inject enough heat into smaller dark matter haloes to effectively stop them from forming galaxies. That effect could explain the discrepancy between the observed cutoff at the low-luminosity end of the galaxy distribution (the so-called luminosity function) and cold dark matter gravity-only simulations, which predict a much larger number of small galaxies than is observed. Simulating this problem requires a code that can handle gravity, hydrodynamics and radiative transfer all together, which is what our code is made to do. We are hoping for the first runs to be complete by the end of this summer.
Picture 6. Expansion of ionized bubbles during
reionization. Blue regions are ionized, while
green are neutral (Mellema et al. 2006)
Picture 7. Position-redshift slice that
represents the large scale geometry of
reionization as it should be seen by future 21-cm
surveys such as LOFAR (Mellema et al. 2006)
Picture 8. Density plot of the evaporation of a dense spherical clump by a point source. It is an idealized version of mini halo evaporation.
Background: a snapshot of the Millennium simulation at z = 5.7 (Millennium was run by the Virgo Consortium)