Title: Integrating high performance computing with data infrastructure using the GEON grid
1. A community modeling environment: geodynamic integration of multi-scale geoscience data
Mian Liu (1), Huai Zhang (1,2), Youqing Yang (1), Qingsong Li (1), Yaolin Shi (2)
(1) University of Missouri-Columbia; (2) Computational Geodynamic Lab, CAS, China
2. Motivation 1
- Exponential increase of multi-scale observational data that need to be integrated and interpreted within a self-consistent geodynamic framework
3. EarthScope Instruments
4. EarthScope Annual Data Volume
- Data volumes over the next 10 years:
  - GPS: 7.7 TB
  - BSM/LSM: 10.5 TB
  - Seismic: 120 TB
5. Multiple timescales of geoscience data
6. Motivation 2
- Advances in computer hardware (especially PC clusters and grid computers) and software engineering have provided unprecedented computing power
- Data infrastructures have made integrating multi-scale data both easy and necessary
7. So we built the data cyberinfrastructures, now what?
(Diagram: GEON Data Grid - Internet - Physical model - HPCC)
8. Free scientists from coding to do science, or whatever they do best
(Diagram: Data Grid - Internet - Physical model - HPCC)
9. Some current efforts on geodynamic computations
- Earth Simulator / GeoFEM project
- GeoFramework
- QuakeSim
- SCEC Community Modeling Environment
- CIG (Computational Infrastructure for Geodynamics)
10. More than one way to do it
- Develop a specific type of model (e.g., mantle convection)
- Use plug-in modules in a general system to generate specific types of models (wave, fluid, structure, etc.)
11. Example: Earth Simulator / GeoFEM project
http://geofem.tokyo.rist.or.jp
A multi-purpose/multi-physics parallel FE simulator/platform for the solid Earth.
Different finite element models can be plugged into this system.
12. Wouldn't it be nice if we could have a general, flexible community modeling system?
- Not all geological needs fit into the pigeonholes
- Need integration with the data cyberinfrastructure
- Scalable for parallel and grid computation
Wouldn't it be nice if all (or most) of these could be automated?
13. Examples of commercial FE code generation systems
- PDE2D (http://members.aol.com/pde2d)
- FEPG (Finite Element Program Generator) (http://www.fegensoft.com/english/index.htm)
14. The devil is in the details
The PDE2FEM system provides:
- Dynamic load balancing across the nodes of a parallel computer
- LMDDM and LMDDA algorithm kernel subroutines
- GES, PDE, CDE, SDE, etc. element subroutine generators
- GCN, NFE, etc. nonlinear algorithm generators
- Libraries for PDEs, shape functions, and other software packages
- Theoretical and application documents for users
15. Automated Code Generator, Step 1: From PDE expression to Fortran segments
PDE expression (the =, +, and * operators, lost in extraction, are reconstructed):

  disp u v
  coor x y
  func funa funb func
  shap 1 2
  gaus 3
  mass 1
  load fu fv
  c6 pe=prmt(1)
  c6 pv=prmt(2)
  c6 fu=prmt(3)
  c6 fv=prmt(4)
  c6 fact=pe/(1.+pv)/(1.-2.*pv)
  func
  funa=u/x
  funb=v/y
  func=u/y+v/x
  stif
  dist=funa*funa*fact*(1.-pv)+funa*funb*fact*pv
      +funb*funa*fact*pv+funb*funb*fact*(1.-pv)
      +func*func*fact*(0.5-pv)

Segment 1 (variable declarations):
  es,em,ef,Estifn,Estifv, ...

Segment 2 (element routine dispatch):
  es(k,k),em(k),ef(k),Estifn(k,k),Estifv(kk), ...
  goto (1,2), ityp
1 call seuq4g2(r,coef,prmt,es,em,ec,ef,ne)
  goto 3
2 call seugl2g2(r,coef,prmt,es,em,ec,ef,ne)
  goto 3
3 continue

Segment 3 (passing material parameters):
  DO J=1,NMATE
    PRMT(J)=EMATE((IMATE-1)*NMATE+J)
  END DO
  PRMT(NMATE+1)=TIME
  PRMT(NMATE+2)=DT
  prmt(nmate+3)=imate
  prmt(nmate+4)=num

Segment 4: Other element matrix computing subroutines

The PDE expression contains the information of the physical model, such as the variables and equations, used to generate the element stiffness matrix. The Fortran segments are the code that realizes the physical model at the element level.
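As a sketch of what Step 1 does, the toy emitter below (hypothetical; not FEPG's actual generator) turns a dictionary of derived-function definitions and a list of stiffness terms into Fortran-like statements of the same shape as the dist expression above:

```python
# Hypothetical sketch of Step 1: translate a compact weak-form description
# into source-code fragments. The names echo the FEPG-style input above;
# emit_element_stiffness is our own illustrative helper, not part of FEPG.

def emit_element_stiffness(derived_funcs, stif_terms):
    """derived_funcs: {name: expression}, e.g. {"funa": "u/x"}.
    stif_terms: list of (trial, test, coefficient) triples.
    Returns Fortran-like statements for one integration point."""
    lines = []
    # Emit the derived strain-like functions first.
    for name, expr in derived_funcs.items():
        lines.append(f"{name} = {expr}")
    # The stiffness integrand is a sum of trial*test*coefficient products.
    rhs = " + ".join(f"{a}*{b}*{c}" for a, b, c in stif_terms)
    lines.append(f"dist = {rhs}")
    return lines

code = emit_element_stiffness(
    {"funa": "u/x", "funb": "v/y", "func": "u/y + v/x"},
    [("funa", "funa", "fact*(1.-pv)"),
     ("funa", "funb", "fact*pv"),
     ("funb", "funa", "fact*pv"),
     ("funb", "funb", "fact*(1.-pv)"),
     ("func", "func", "fact*(0.5-pv)")],
)
print("\n".join(code))
```

The point is that the scientist writes only the compact symbolic form; the tedious per-term expansion is mechanical and can be generated.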
16Step 2 From algorithm expression to Fortran
segments
defi stif S mass M load F type e mdty l step
0 equation matrix S FORCF SOLUTION
U write(s,unod) U end
do i1,k do j1,k estifn(i,j)0.0 end do end
do do i1,k estifn(i,i)estifn(i,i) do j1,k
estifn(i,j)estifn(i,j)es(i,j) end do end do
Stiffness matrix
Segment 5
U(IDGF,NODI)U(IDGF,NODI) ef(i)
Segment 6
Algorithm Expression Contains information for
forming global stiffness matrix for the model.
Fortran Segments codes that realize the physical
model at global level.
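Segments 5 and 6 perform standard finite element assembly: element matrices and load vectors are accumulated into global arrays through each element's global numbering. A minimal sketch of that operation (illustrative only; assemble and dof_map are our names, not FEPG's):

```python
# Illustrative sketch of global assembly, the job of Segments 5 and 6.

def assemble(n_global, elements):
    """elements: list of (dof_map, es, ef), where dof_map[i] is the global
    index of local dof i, es is a k-by-k element matrix (nested lists),
    and ef is the length-k element load vector."""
    K = [[0.0] * n_global for _ in range(n_global)]
    F = [0.0] * n_global
    for dof_map, es, ef in elements:
        k = len(dof_map)
        for i in range(k):
            gi = dof_map[i]
            F[gi] += ef[i]                      # cf. Segment 6: U(...)=U(...)+ef(i)
            for j in range(k):
                K[gi][dof_map[j]] += es[i][j]   # cf. Segment 5: estifn=estifn+es
    return K, F

# Two 1D "bar" elements sharing node 1: the shared diagonal entry sums to 2.
es = [[1.0, -1.0], [-1.0, 1.0]]
K, F = assemble(3, [([0, 1], es, [0.0, 0.0]), ([1, 2], es, [0.0, 1.0])])
```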
17. Step 3: Plug Fortran segments into a stencil, forming the final FE program
Program stencil (the *.sub lines mark insertion points; reconstructed from the garbled listing):

      SUBROUTINE ETSUB(KNODE,KDGOF,IT,KCOOR,KELEM,K,KK,
     * NUMEL,ITYP,NCOOR,NUM,TIME,DT,NODVAR,COOR,NODE,
SUBET.sub
     * U)
      implicit double precision (a-h,o-z)
      DIMENSION NODVAR(KDGOF,KNODE),COOR(KCOOR,KNODE),
     * U(KDGOF,KNODE),EMATE(300),
SUBDIM.sub
     * R(500),PRMT(500),COEF(500),LM(500)
SUBFORT.sub
ELEM.sub
C     WRITE(*,*) 'ES EM EF '
C     WRITE(*,18) (EF(I),I=1,K)
MATRIX.sub
      L=0
      M=0
      I=0
      DO 700 INOD=1,NNE
      U(IDGF,NODI)=U(IDGF,NODI)
LVL.sub
      DO 500 JNOD=1,NNE
500   CONTINUE
700   CONTINUE
      return
      end

The generated Fortran segments (Segments 1, 2, 3, 4, 5, 6, ...) are inserted at these points to form the final FE program.
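A minimal sketch of Step 3, assuming that lines ending in .sub (as in SUBET.sub, ELEM.sub above) flag the insertion points; the fill_stencil function and the toy stencil are hypothetical illustrations, not the actual PDE2FEM code:

```python
# Hypothetical sketch of Step 3: replace each MARKER.sub line in the
# program stencil with the corresponding generated code segment.

def fill_stencil(stencil_text, segments):
    """segments maps marker names (e.g. 'ELEM') to lists of code lines."""
    out = []
    for line in stencil_text.splitlines():
        name = line.strip()
        if name.endswith(".sub") and name[:-4] in segments:
            out.extend(segments[name[:-4]])  # splice the generated segment in
        else:
            out.append(line)                 # keep stencil lines verbatim
    return "\n".join(out)

stencil = """      SUBROUTINE ETSUB(KNODE,KDGOF,K)
SUBDIM.sub
ELEM.sub
      return
      end"""
program = fill_stencil(stencil, {
    "SUBDIM": ["      DIMENSION ES(500,500)"],
    "ELEM":   ["      CALL SEUQ4G2(R,COEF,PRMT,ES,EM,EC,EF,NE)"],
})
```

Because the stencil carries all the boilerplate (declarations, loops over elements and nodes), only the physics-specific segments change from model to model.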
18. Examples
- Western US tectonics
- Deformation of the Asian continent
- Stress evolution and strain localization on the San Andreas Fault
19. A Preliminary Finite Element Model of Active Crustal Deformation in the Western US
21. The Power of a GEON Cluster Node
Original serial model (single CPU):
- Fewer than 3,000 elements
- Three layers in the R-direction
Preliminary parallel model (16 nodes, 32 CPUs):
- More than 800,000 unstructured elements
- Major faults and more deformation zones
- Subduction of the Juan de Fuca slab
- 21 layers in the R-direction
- About 2 min per time step
(x40 vertical topographic exaggeration)
22. Automatic domain decomposition for parallel computing
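One simple way to automate such a decomposition is recursive coordinate bisection: split the elements at the median centroid coordinate, alternating axes, until each processor has a near-equal share. Production systems typically use graph partitioners for better load balance and lower communication; the sketch below (our own illustration) only shows the idea.

```python
# Recursive coordinate bisection: a minimal automatic domain decomposition.

def bisect(element_ids, centroids, n_parts, axis=0):
    """Recursively split element ids into n_parts groups of near-equal size,
    cutting at the median of the centroid coordinate along `axis`."""
    if n_parts == 1:
        return [list(element_ids)]
    ids = sorted(element_ids, key=lambda e: centroids[e][axis])
    half = len(ids) // 2
    nxt = (axis + 1) % len(centroids[ids[0]])  # alternate the cutting axis
    return (bisect(ids[:half], centroids, n_parts // 2, nxt)
            + bisect(ids[half:], centroids, n_parts - n_parts // 2, nxt))

# A 4x4 grid of element centroids split among 4 processors.
cents = {i: (i % 4, i // 4) for i in range(16)}
parts = bisect(range(16), cents, 4)
```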
24-25. The model now allows simulation of large-scale continental deformation with unprecedented detail
26. Ongoing effort: toward a new 3D model of continental deformation in Asia
27. Predicted vertical velocity
28. Predicted surface shear stress
29. Loading the San Andreas Fault by relative Pacific-North America (PA-NA) plate motion
- Fully 3D
- Dynamic
- Plastic-viscoelastic
- Co-seismic/interseismic cycles
- Timescales from seconds to 10^4 years
- Parallel computing on PC clusters
30. Comparison of predicted surface velocity and GPS data
31. Predicted maximum shear stress
33. Predicted rate of plastic strain energy release outside the SAF
34. Dream on
- Integrating the community modeling environment with the geoscience data cyberinfrastructure
- Grid computation and data integration
- Automated (optimized?) workflow management (the Kepler system?)
35. Data -> ???
Data Grid (GEON and others) -> Physical model -> PDEs -> FEM modeling language -> Automatic source code generator -> HPCC -> Model results

FEM modeling language example (operators reconstructed):

  func
  funa=u/x
  funf=u/y+v/x
  dist=funa*funa*d(1,1)+funa*funb*d(1,2)+funa*func*d(1,3)
      +funb*funa*d(2,1)+funb*funb*d(2,2)+funb*func*d(2,3)
      +func*funa*d(3,1)+func*funb*d(3,2)+func*func*d(3,3)
      +fund*fund*d(4,4)+fune*fune*d(5,5)+funf*funf*d(6,6)
  load=u*fu+v*fv+w*fw-funa*f(1)-funb*f(2)-func*f(3)
      -fund*f(4)-fune*f(5)-funf*f(6)
36. Thank you!