Integrating high performance computing with data infrastructure using the GEON grid

Transcript and Presenter's Notes
1
A community modeling environment: geodynamic
integration of multi-scale geoscience data
Mian Liu (1), Huai Zhang (1,2), Youqing Yang (1), Qingsong Li (1), Yaolin Shi (2)
(1) University of Missouri-Columbia
(2) Computational Geodynamic Lab, CAS, China
2
Motivation 1
  • Exponential increase of multi-scale
    observational data that need to be integrated and
    interpreted within a self-consistent geodynamic
    framework

3
EarthScope Instruments
4
EarthScope Annual Data Volume
  • Data volumes over the next 10 years:
    GPS      7.7 TB
    BSM/LSM  10.5 TB
    Seismic  120 TB
5
Multi-timescales of geoscience data
6
Motivation 2
  • Advances in computer hardware (especially PC
    clusters and grid computers) and software
    engineering have provided unprecedented computing
    power
  • Data infrastructure has made integrating
    multi-scale data both easy and necessary.

7
So we built the data cyberinfrastructures - now what?
[Diagram: GEON Data Grid and the Internet linking the physical model to HPCC]
8
Free scientists from coding to do science, or whatever they do best
[Diagram: physical model linked through the Internet to the Data Grid and HPCC]
9
Some current efforts on geodynamic computations
  • Earth Simulator - GeoFEM project
  • GeoFramework
  • QuakeSim
  • SCEC Community Modeling Environment
  • CIG (Computational Infrastructure for Geodynamics)

10
More than one way to do it
  • Develop specific types of models (e.g., mantle
    convection)
  • Use plug-in modules in a general system to
    generate specific types of models (wave, fluid,
    structure, etc.)

11
Example: Earth Simulator - GeoFEM project
http://geofem.tokyo.rist.or.jp
Multi-Purpose/Multi-physics Parallel FE
Simulator/Platform for Solid Earth
Different finite element models can be plugged
into this system
12
Won't it be nice if we could have a general,
flexible community modeling system?
  • Not all geological needs can fit into the
    pigeonholes
  • Need integration with the data CI
  • Scalable for parallel and grid computation

Wouldn't it be nice if all (or most) of these
could be automated?
13
Examples of commercial FE code generation systems
  • PDE2D (http://members.aol.com/pde2d)
  • FEPG (Finite Element Program Generator)
    (http://www.fegensoft.com/english/index.htm)

14
The devil is in the details
PDE2FEM system components:
  • Dynamic load-balancing of each node in the parallel
    computer
  • LMDDM and LMDDA algorithm kernel subroutines
  • GES, PDE, CDE, SDE, etc. element subroutine
    generators
  • GCN, NFE, etc. nonlinear algorithm generators
  • Libraries for PDEs, shape functions, and other
    software packages
  • Theoretical and application documents for users
15
Automated Code Generator - Step 1: From PDE expression to Fortran segments

PDE expression:

    disp u v
    coor x y
    func funa funb func
    shap 1 2
    gaus 3
    mass 1
    load fu fv
    c6 pe = prmt(1)
    c6 pv = prmt(2)
    c6 fu = prmt(3)
    c6 fv = prmt(4)
    c6 fact = pe/(1.+pv)/(1.-2.*pv)
    func
      funa = u/x
      funb = v/y
      func = u/y + v/x
    stif
      dist = funa*funa*fact*(1.-pv) + funa*funb*fact*(pv)
           + funb*funa*fact*(pv) + funb*funb*fact*(1.-pv)
           + func*func*fact*(0.5-pv)

Segment 1 (variables):
    es, em, ef, Estifn, Estifv, ...

Segment 2 (array dimensions):
    es(k,k), em(k), ef(k), Estifn(k,k), Estifv(kk), ...

Segment 3 (branch to the element subroutines):
      goto (1,2), ityp
    1 call seuq4g2(r,coef,prmt,es,em,ec,ef,ne)
      goto 3
    2 call seugl2g2(r,coef,prmt,es,em,ec,ef,ne)
      goto 3
    3 continue

Segment 4 (material parameters per element):
      DO J=1,NMATE
        PRMT(J) = EMATE((IMATE-1)*NMATE+J)
      END DO
      PRMT(NMATE+1) = TIME
      PRMT(NMATE+2) = DT
      prmt(nmate+3) = imate
      prmt(nmate+4) = num

Other element matrix computing subs ...

PDE expression: contains information of the physical model, such as variables and equations, for generating the element stiffness matrix.
Fortran segments: code that realizes the physical model at the element level (a sketch of this element-level computation follows).
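To make the element-level step concrete, here is a minimal Python sketch (an illustration only; the generator's real output is Fortran) of what a generated element routine such as seuq4g2 evaluates for this PDE expression: the plane-strain bilinear form written above as dist, integrated over a 4-node quadrilateral with a 2x2 Gauss rule. The element geometry and material values are assumed for the example.

    # Minimal sketch of the element stiffness that the generated routine computes.
    import numpy as np

    def quad4_stiffness(xy, pe, pv):
        """Element stiffness K (8x8) for one 4-node quad; xy is (4, 2) node coords."""
        fact = pe / (1.0 + pv) / (1.0 - 2.0 * pv)      # same 'fact' as in the PDE expression
        gp = np.array([-1.0, 1.0]) / np.sqrt(3.0)      # 2-point Gauss rule per direction
        K = np.zeros((8, 8))                           # 2 dof (u, v) per node
        for xi in gp:
            for eta in gp:
                # derivatives of the 4 bilinear shape functions w.r.t. (xi, eta)
                dN = 0.25 * np.array([[-(1 - eta), -(1 - xi)],
                                      [ (1 - eta), -(1 + xi)],
                                      [ (1 + eta),  (1 + xi)],
                                      [-(1 + eta),  (1 - xi)]])
                J = dN.T @ xy                          # Jacobian of the element map
                detJ = np.linalg.det(J)
                dNdx = dN @ np.linalg.inv(J).T         # derivatives w.r.t. x and y
                # funa = du/dx, funb = dv/dy, func = du/dy + dv/dx; each bilinear
                # term of "dist" contributes one entry pattern of K.
                for i in range(4):
                    ax, ay = dNdx[i]
                    for j in range(4):
                        bx, by = dNdx[j]
                        K[2*i,   2*j  ] += fact * ((1 - pv)*ax*bx + (0.5 - pv)*ay*by) * detJ
                        K[2*i,   2*j+1] += fact * (pv*ax*by + (0.5 - pv)*ay*bx) * detJ
                        K[2*i+1, 2*j  ] += fact * (pv*ay*bx + (0.5 - pv)*ax*by) * detJ
                        K[2*i+1, 2*j+1] += fact * ((1 - pv)*ay*by + (0.5 - pv)*ax*bx) * detJ
        return K

    K = quad4_stiffness(np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.]]), pe=1.0, pv=0.25)
    print(K.shape, np.allclose(K, K.T))                # (8, 8) True: symmetric, as expected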
16
Step 2: From algorithm expression to Fortran segments

Algorithm expression:

    defi
    stif s
    mass m
    load f
    type e
    mdty l
    step 0
    equation
    matrix = s
    forc = f
    solution u
    write(s,unod) u
    end

Segment 5 (stiffness matrix: accumulate the element matrix into the global array):
      do i=1,k
        do j=1,k
          estifn(i,j)=0.0
        end do
      end do
      do i=1,k
        estifn(i,i)=estifn(i,i)
        do j=1,k
          estifn(i,j)=estifn(i,j)+es(i,j)
        end do
      end do

Segment 6 (add the element load into the global solution array):
      U(IDGF,NODI)=U(IDGF,NODI)+ef(i)

Algorithm expression: contains the information for forming the global stiffness matrix for the model.
Fortran segments: code that realizes the physical model at the global level (a sketch of this assembly step follows).
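For concreteness, below is a minimal Python sketch of the assembly that these global-level segments perform: each element matrix es and element load ef is scatter-added into the global system using the element's degree-of-freedom map. The tiny mesh, dof numbering, and function name are assumptions for illustration, not the generated code.

    # Minimal sketch of global assembly from element contributions.
    import numpy as np

    def assemble(n_dof, elements, element_matrices, element_loads):
        """elements[e] lists the global dof numbers of element e's local dofs."""
        K = np.zeros((n_dof, n_dof))          # global stiffness (the assembled MATRIX = S)
        F = np.zeros(n_dof)                   # global load (FORC = F)
        for dofs, es, ef in zip(elements, element_matrices, element_loads):
            for i, gi in enumerate(dofs):
                F[gi] += ef[i]                         # cf. U(IDGF,NODI) = U(IDGF,NODI) + ef(i)
                for j, gj in enumerate(dofs):
                    K[gi, gj] += es[i, j]              # cf. estifn(i,j) = estifn(i,j) + es(i,j)
        return K, F

    # two 1-D "elements" sharing dof 1, each with the same 2x2 local matrix
    elements = [[0, 1], [1, 2]]
    es = np.array([[1.0, -1.0], [-1.0, 1.0]])
    K, F = assemble(3, elements, [es, es], [np.array([0.5, 0.5])] * 2)
    print(K)                                  # the shared dof accumulates both contributions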
17
Step 3: Plug Fortran segments into a stencil, forming the final FE program

Program stencil (placeholders such as SUBET.sub, SUBDIM.sub, SUBFORT.sub, ELEM.sub, MATRIX.sub, and LVL.sub mark where the generated segments are inserted):

      SUBROUTINE ETSUB(KNODE,KDGOF,IT,KCOOR,KELEM,K,KK,
        NUMEL,ITYP,NCOOR,NUM,TIME,DT,NODVAR,COOR,NODE,
        SUBET.sub
        U)
      implicit double precision (a-h,o-z)
      DIMENSION NODVAR(KDGOF,KNODE),COOR(KCOOR,KNODE),
        U(KDGOF,KNODE),EMATE(300),
        SUBDIM.sub
        R(500),PRMT(500),COEF(500),LM(500)
      SUBFORT.sub
      ELEM.sub
C     WRITE(*,*) 'ES EM EF '
C     WRITE(*,18) (EF(I),I=1,K)
      MATRIX.sub
      L=0
      M=0
      I=0
      DO 700 INOD=1,NNE
      U(IDGF,NODI)=U(IDGF,NODI)
      LVL.sub
      DO 500 JNOD=1,NNE
  500 CONTINUE
  700 CONTINUE
      return
      end

The Fortran segments generated in Steps 1 and 2 (Segments 1-6) are spliced in at these placeholders to produce the complete finite element program (a sketch of this substitution follows).
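Below is a minimal Python sketch of the splicing step, assuming a plain text-template mechanism: each "*.sub" placeholder line in the stencil is replaced by its generated Fortran segment. The placeholder names follow the slide, but the shortened stencil, the segment bodies, and this substitution code are illustrative, not the actual generator.

    # Minimal sketch: splice generated segments into the program stencil.
    STENCIL = """\
          SUBROUTINE ETSUB(...)
          implicit double precision (a-h,o-z)
    SUBDIM.sub
    SUBFORT.sub
    ELEM.sub
    MATRIX.sub
          return
          end
    """

    SEGMENTS = {
        "SUBDIM.sub":  "      DIMENSION es(k,k),em(k),ef(k),Estifn(k,k),Estifv(kk)",
        "SUBFORT.sub": "C     (Segment 3: branch to the element subroutine by type)",
        "ELEM.sub":    "      call seuq4g2(r,coef,prmt,es,em,ec,ef,ne)",
        "MATRIX.sub":  "C     (Segment 5: accumulate es into estifn)",
    }

    def expand(stencil, segments):
        """Replace each placeholder line with its generated code segment."""
        out = []
        for line in stencil.splitlines():
            out.append(segments.get(line.strip(), line))   # keep non-placeholder lines as-is
        return "\n".join(out)

    print(expand(STENCIL, SEGMENTS))    # prints the stencil with the segments spliced in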
18
Examples
  • Western US tectonics
  • Deformation of Asian continent
  • Stress evolution and strain localization in the
    San Andreas Fault

19
A Preliminary Finite Element Model of Active
Crustal Deformation in the Western US
20
(No Transcript)
21
The Power of a GEON Cluster Node
(x40 vertical topographic exaggeration)

Preliminary parallel model (16 nodes, 32 CPUs):
  • More than 800,000 unstructured elements
  • Major faults and more deformation zones
  • Subduction of the Juan de Fuca slab
  • 21 layers in R-direction

Original serial model (single CPU):
  • Less than 3,000 elements
  • Three layers in R-direction
  • About 2 min per time step
22
Automatic domain decomposition for parallel
computing
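As a rough illustration of the idea (not the project's actual partitioner), the Python sketch below splits the element list into nearly equal blocks, one per CPU, so each cluster node carries a balanced share of the FE work; a production run would instead use a mesh/graph partitioner (e.g., METIS) that also minimizes the interfaces between subdomains.

    # Minimal sketch of balanced element partitioning across CPUs.
    def decompose(n_elements, n_parts):
        """Assign element ids 0..n_elements-1 to n_parts contiguous blocks."""
        base, extra = divmod(n_elements, n_parts)
        parts, start = [], 0
        for p in range(n_parts):
            size = base + (1 if p < extra else 0)   # spread any remainder over the first parts
            parts.append(list(range(start, start + size)))
            start += size
        return parts

    parts = decompose(n_elements=800_000, n_parts=32)   # e.g., 800k elements over 32 CPUs
    print([len(p) for p in parts][:4])                  # 25,000 elements per CPU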
23
(No Transcript)
24
The model now allows simulation of large scale
continental deformation with unprecedented detail
25
The model now allows simulation of large scale
continental deformation with unprecedented detail
26
Ongoing Effort: Toward a new 3D model of
continental deformation in Asia
27
Predicted vertical velocity
28
Predicted surface shear stress
29
Loading the San Andreas Fault by relative PA-NA
motion
  • Fully 3D
  • Dynamic
  • Plastic-viscoelastic
  • Co-seismic/interseismic cycles, from seconds to 10^4 years
  • Parallel computing on PC clusters

30
Comparison of predicted surface velocity and GPS
data
31
Predicted maximum shear stress
32
(No Transcript)
33
Predicted rate of plastic strain energy release
outside the SAF
34
Dream on
  • Integrating the community modeling environment
    with the geoscience data cyberinfrastructure
  • Grid computation and data integration
  • Automated (optimized?) workflow management (the
    Kepler system?)

35
Data -> ???

Pipeline shown on the slide: Data Grid (GEON and others) -> Physical model -> PDEs -> FEM Modeling Language -> Automatic source code generator -> Model results, running on HPCC (a sketch of this pipeline follows the language example below).

FEM modeling language (3-D example):

    func
      funa = u/x
      ...
      funf = u/y + v/x
    dist = funa*funa*d(1,1) + funa*funb*d(1,2) + funa*func*d(1,3)
         + funb*funa*d(2,1) + funb*funb*d(2,2) + funb*func*d(2,3)
         + func*funa*d(3,1) + func*funb*d(3,2) + func*func*d(3,3)
         + fund*fund*d(4,4) + fune*fune*d(5,5) + funf*funf*d(6,6)
    load = u*fu + v*fv + w*fw - funa*f(1) - funb*f(2) - func*f(3)
         - fund*f(4) - fune*f(5) - funf*f(6)
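Read as code, the pipeline on this slide might look like the Python sketch below. Every function here is a hypothetical stand-in for a component of the proposed system (data-grid access, model and PDE specification, automatic code generation as in Steps 1-3, and execution on HPCC), not an existing API.

    # Minimal, hypothetical sketch of the envisioned data-to-results pipeline.
    def fetch_data(dataset):
        return {"dataset": dataset}                   # pull observations from GEON / other data grids

    def build_model(data):
        return "disp u v w ..."                       # physical model and PDEs in the modeling language

    def generate_code(model_text):
        return "      SUBROUTINE ETSUB(...)"          # automatic source code generator output (Fortran)

    def run_on_hpcc(source):
        return "model results"                        # compile and run on the cluster / grid

    print(run_on_hpcc(generate_code(build_model(fetch_data("western_us")))))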
36
Thank you!