Title: The PRISM infrastructure
1. The PRISM infrastructure for Earth system models
Eric Guilyardi, CGAM/IPSL and the PRISM Team
- Background and drivers
- PRISM project achievements
- The future
2. Why a common software infrastructure?
- Earth system modelling expertise is widely distributed geographically
3. Why a common software infrastructure?
- Earth system modelling expertise is widely distributed
- Scientific motivation: facilitate sharing of scientific expertise and of models
- Technical motivation: the technical challenges are large compared with the available effort
- Need to keep scientific diversity while increasing efficiency, both scientific and technical
- Need for a concerted effort in view of initiatives elsewhere:
  - The Frontier Project, Japan
  - The Earth System Modelling Framework, US
4PRISM concept
- Â Share Earth System Modelling software
infrastructure - across communityÂ
- To
- share development, maintenance and support
- aid performance on a variety of platforms
- standardize model software environment
- ease use of different climate model components
5. Expected benefits
- High-performance ESM software, developed by dedicated IT experts, available to institutes/teams at low cost:
  - helps scientists focus on science
  - helps keep scientific diversity (survival of smaller groups)
- Easier to assemble ESMs based on community models
- Shared infrastructure, hence increased scientific exchanges
- Computer manufacturers inclined to contribute:
  - efficiency (porting, optimisation) on a variety of platforms
  - next-generation platforms optimized for ESM needs
  - easier procurements and benchmarking
  - reduced computing costs
6. Software structure of an Earth System Model
- Coupling infrastructure
- Scientific core
- Supporting software
7. The long term view: towards standard ESM support library(ies)
[Diagram: the stack supporting climate science work today, from the modeller and the IT expert through the Earth System model (science support environment) and the Fortran compiler down to the hardware]
8. The PRISM project
- Program for Integrated Earth System Modelling
- 22 partners
- 3 years, Dec 2001 to Nov 2004
- 5 million € funding under the EC's FP5 (80 person-years)
- Coordinators: G. Brasseur and G. Komen
9. System specifications
- The modelers/users: requirements, beta testing, feedback
- The science: general principles, constraints from physical interfaces
- The community models: atmosphere, atmospheric chemistry, ocean, ocean biogeochemistry, sea-ice, land surface
- The technical developments: coupler and I/O, compile/run environment, GUI, visualisation and diagnostics
- The PRISM infrastructure, which all of the above feed into
Let's NOT re-invent the wheel!
10. System specifications - the people
- Reinhard Budich - MPI, Hamburg
- Andrea Carril - INGV, Bologna
- Mick Carter - Hadley Centre, Exeter
- Patrice Constanza - MPI/MD, Hamburg
- Jérôme Cuny - UCL, Louvain-la-Neuve
- Damien Declat - CERFACS, Toulouse
- Ralf Döscher - SMHI, Stockholm
- Thierry Fichefet - UCL, Louvain-la-Neuve
- Marie-Alice Foujols - IPSL, Paris
- Veronika Gayler - MPI/MD, Hamburg
- Eric Guilyardi - CGAM, Reading and LSCE
- Rosalyn Hatcher - Hadley Centre, Exeter
- Miles Kastowsky - MPI/BGC, Jena
- Luis Kornblueh - MPI, Hamburg
- Claes Larsson - ECMWF, Reading
- Stefanie Legutke - MPI/MD, Hamburg
- Corinne Le Quéré - MPI/BGC, Jena
- Angelo Mangili - CSCS, Zurich
- Anne de Montety - UCL, Louvain-la-Neuve
- Serge Planton - Météo-France, Toulouse
- Jan Polcher - LMD/IPSL, Paris
- René Redler - NEC CCRLE, Sankt Augustin
- Martin Stendel - DMI, Copenhagen
- Sophie Valcke - CERFACS, Toulouse
- Peter van Velthoven - KNMI, De Bilt
- Reiner Vogelsang - SGI, Grasbrunn
- Nils Wedi - ECMWF, Reading
11. PRISM achievements (so far)
- Software environment (the tool box):
  - a standard coupler and I/O software, OASIS3 (CERFACS) and OASIS4 (a conceptual sketch of the coupler's role follows this list)
  - a standard compiling environment (SCE) at the scripting level
  - a standard running environment (SRE) at the scripting level
  - a Graphical User Interface (GUI) to the SCE (PrepIFS, ECMWF)
  - a GUI to the SRE for monitoring the coupled model run (SMS, ECMWF)
  - standard diagnostic and visualisation tools
- Adaptation of community Earth System component models (GCMs) and demonstration coupled configurations
- A well co-ordinated network of expertise
- Community buy-in and trust-building
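To make the coupler's role concrete, the sketch below shows, in deliberately simplified Python, two components exchanging fields through a coupler at fixed coupling intervals. Everything in it (the Coupler class, the put/get/regrid names, the SST and TAUX fields, the coupling period) is invented for illustration; the actual OASIS coupler is separate software with its own Fortran interface library and performs real interpolation between component grids.

```python
# Purely illustrative sketch, not the OASIS/PSMILe API: two model components
# exchange fields through a coupler at fixed coupling intervals. All names here
# (Coupler, put, get, regrid, SST, TAUX, the coupling period) are invented.
import numpy as np

class Coupler:
    """Toy coupler: buffers fields 'put' by one component and hands them to the
    other at coupling steps, where a real coupler would also regrid the data."""
    def __init__(self, coupling_period):
        self.coupling_period = coupling_period
        self.buffer = {}

    def put(self, name, step, field):
        if step % self.coupling_period == 0:       # exchange only at coupling steps
            self.buffer[name] = self.regrid(field)

    def get(self, name, step, previous):
        if step % self.coupling_period == 0 and name in self.buffer:
            return self.buffer[name]
        return previous                            # between exchanges, reuse last value

    def regrid(self, field):
        return field.copy()                        # stand-in for source-to-target interpolation

coupler = Coupler(coupling_period=4)
sst = np.full((10, 10), 290.0)                     # ocean surface temperature (K)
taux = np.zeros((10, 10))                          # wind stress seen by the ocean (N m-2)

for step in range(24):
    # "atmosphere": receives SST, sends wind stress
    sst_seen_by_atm = coupler.get("SST", step, previous=sst)
    coupler.put("TAUX", step, 0.1 * np.ones_like(sst_seen_by_atm))
    # "ocean": receives wind stress, sends updated SST
    taux = coupler.get("TAUX", step, previous=taux)
    coupler.put("SST", step, sst + 0.01 * step)
```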
12. The PRISM shells
13. Adapting Earth System Components to PRISM
[Diagram: a component model is adapted by linking it to the PRISM Model Interface Library and by providing a Potential Model I/O Description; a loose illustration of the latter follows]
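As a loose illustration of what a Potential Model I/O Description might declare for one component, the hypothetical structure below lists the fields a toy ocean could export and import, together with grid and coupling-frequency metadata. The keys, field names and values are invented for this sketch; the actual PRISM description is an XML document with its own schema.

```python
# Hypothetical sketch only: the real PRISM Potential Model I/O Description is an
# XML file with its own schema; keys, field names and values below are invented.
toy_ocean_io_description = {
    "component": "toy_ocean",
    "grid": {"name": "toy_grid", "nx": 182, "ny": 149},
    "exports": [  # fields the component can provide to the coupler
        {"name": "SST", "units": "K",
         "standard_name": "sea_surface_temperature"},
    ],
    "imports": [  # fields the component expects from the coupler
        {"name": "TAUX", "units": "N m-2",
         "standard_name": "surface_downward_eastward_stress"},
    ],
    "coupling_period_seconds": 86400,
}
```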
14. Configuration management and deployment
15. PRISM GUI remote functionality
[Diagram: the user connects over the Internet to the PRISM repositories (CSCS, MPI); configurations are exchanged through the GUI]
16. Standard scripting environments
- Standard Compiling Environment (SCE), whose idea is sketched below
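The value of a standard compiling environment is that platform-specific compiler settings live in one shared place, so the same component source builds consistently on every target machine. The fragment below only illustrates that idea schematically; the platform names, compilers, flags and helper function are invented examples and do not reflect the actual SCE scripts.

```python
# Schematic illustration of the idea behind a standard compiling environment:
# per-platform build settings kept in one place, shared by all components.
# Platform names, compilers and flags are invented examples, not the real SCE.
PLATFORM_SETTINGS = {
    "nec_sx":    {"fc": "sxf90", "fflags": "-C hopt"},
    "ibm_power": {"fc": "xlf90", "fflags": "-O3 -qrealsize=8"},
    "linux_x86": {"fc": "ifort", "fflags": "-O2 -r8"},
}

def compile_command(platform, source_file):
    """Build the compile line for one Fortran source file on the chosen platform."""
    s = PLATFORM_SETTINGS[platform]
    return f'{s["fc"]} {s["fflags"]} -c {source_file}'

print(compile_command("linux_x86", "ocean_model.f90"))
# -> ifort -O2 -r8 -c ocean_model.f90
```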
17. Data processing and visualisation
18. Demonstration experiments
[Table: the assembled coupled models and the platforms on which they were run]
19. Development coordinators
- The coupler and I/O: Sophie Valcke (CERFACS)
- The standard environments: Stefanie Legutke (MPI)
- The user interface and web services: Claes Larsson (ECMWF)
- Analysis and visualisation: Mick Carter (Hadley Centre)
- The assembled models: Stefanie Legutke (MPI)
- The demonstration experiments: Andrea Carril (INGV)
20. Community buy-in
- Growing!
- Workshops and seminars
- 15 pioneer models adapted (institutes' involvement)
- 9 test supercomputers instrumented
- Models distributed under the PRISM environment (ECHAM5, OPA 9.0)
- Community programmes relying on the PRISM framework (ENSEMBLES, COSMOS, MERSEA, GMES, NERC, ...)
- To go further:
  - PRISM perspective: maintain and develop the tool box
  - Institute perspective: get the timing and involvement in the next steps right
21. Collaborations
- Active collaborations:
  - ESMF (supporting software, PMIOD, MOM4)
  - FLUME (PRISM software)
  - PCMDI (visualisation, PMIOD)
  - CF group (CF names)
  - NERC (BADC, CGAM) (metadata, PMIOD)
  - MD, MPI (data)
  - Earth Simulator (install PRISM system V.0)
- PRISM has put Europe in the loop for community-wide convergence on basic standards in Earth System modelling
22. PRISM Final Project Meeting, De Bilt, October 7-8, 2004
23. The future
- PRISM has delivered a tool box, a network of expertise and demonstrations
- Community buy-in is growing
- Key need for sustainability of:
  - tool box maintenance and development (new features)
  - the network of expertise
PRISM sustained initiative