The PRISM infrastructure
for Earth system models
Eric Guilyardi, CGAM/IPSL, and the PRISM Team
• Background and drivers
• PRISM project achievements
• The future
Why a common software infrastructure?
• Earth system modelling expertise widely distributed
– Geographically
– Thematically
Why a common software infrastructure?
• Earth system modelling expertise widely distributed
– Scientific motivation = facilitate sharing of scientific expertise and of models
– Technical motivation = the technical challenges are large compared with the available effort
• Need to keep scientific diversity while increasing efficiency, both scientific and technical
• Need for concerted effort in view of initiatives elsewhere:
– The Frontier Project, Japan
– The Earth System Modelling Framework, US
PRISM concept
« Share Earth System Modelling software infrastructure across the community »
To:
• share development, maintenance and support
• aid performance on a variety of platforms
• standardize the model software environment
• ease use of different climate model components
Expected benefits
• high-performance ESM software, developed by dedicated IT experts, available to institutes/teams at low cost:
– helps scientists focus on science
– helps keep scientific diversity (survival of smaller groups)
• Easier to assemble ESMs based on community models
• shared infrastructure = increased scientific exchanges
• computer manufacturers inclined to contribute:
– efficiency (porting, optimisation) on a variety of platforms
– next-generation platforms optimized for ESM needs
– easier procurements and benchmarking
– reduced computing costs
Software structure of an Earth System Model
[Diagram] Layers of an Earth System Model: running environment, coupling infrastructure, scientific core, supporting software — the layers around the scientific core are candidates to share (i.e. f90).
The long term view
[Diagram] Today: the modeller and the IT expert each face the full stack — hardware, Fortran compiler, and an Earth System model mixing science, support and environment code.
Tomorrow: the modeller focuses on climate science work in an Earth System model (science only), built on a standard support library (incl. environment) sitting on the Fortran compiler and hardware.
Towards standard ESM support library(ies)
The PRISM project
• Program for Integrated Earth System Modelling
– 22 partners
– 3 years, Dec 2001 – Nov 2004
– 5 M€ funding, FP5 of the EC (~80 person-years)
– Coordinators: G. Brasseur and G. Komen
PRISM infrastructure
The science:
– general principles
– constraints from physical interfaces, …
The technical developments:
– coupler and I/O
– compile/run environment
– GUI
– visualisation and diagnostics
The modelers/users:
– requirements
– beta testing
– feedback
The community models:
– atmosphere, atmos. chemistry, ocean, ocean biogeochemistry, sea-ice, land surface, …
Let's NOT re-invent the wheel!
System specifications
System specifications - the people
Reinhard Budich - MPI, Hamburg
Andrea Carril - INGV, Bologna
Mick Carter - Hadley Center, Exeter
Patrice Constanza - MPI/M&D, Hamburg
Jérome Cuny - UCL, Louvain-la-Neuve
Damien Declat - CERFACS, Toulouse
Ralf Döscher - SMHI, Stockholm
Thierry Fichefet - UCL, Louvain-la-Neuve
Marie-Alice Foujols - IPSL, Paris
Veronika Gayler - MPI/M&D, Hamburg
Eric Guilyardi* - CGAM, Reading and LSCE
Rosalyn Hatcher - Hadley Center, Exeter
Miles Kastowsky - MPI/BGC, Jena
Luis Kornblueh - MPI, Hamburg
Claes Larsson - ECMWF, Reading
Stefanie Legutke - MPI/M&D, Hamburg
Corinne Le Quéré - MPI/BGC, Jena
Angelo Mangili - CSCS, Zurich
Anne de Montety - UCL, Louvain-la-Neuve
Serge Planton - Météo-France, Toulouse
Jan Polcher - LMD/IPSL, Paris
René Redler - NEC CCRLE, Sankt Augustin
Martin Stendel - DMI, Copenhagen
Sophie Valcke - CERFACS, Toulouse
Peter van Velthoven - KNMI, De Bilt
Reiner Vogelsang - SGI, Grasbrunn
Nils Wedi - ECMWF, Reading
* Chair
PRISM achievements (so far):
• A well co-ordinated network of expertise
• Community buy-in and trust-building
• Adaptation of community Earth System component models (GCMs) and demonstration coupled configurations
• Software environment (the tool box):
1. a standard coupler and I/O software, OASIS3 (CERFACS) and OASIS4
2. a standard compiling environment (SCE) at the scripting level
3. a standard running environment (SRE) at the scripting level
4. a Graphical User Interface (GUI) to the SCE (PrepIFS, ECMWF)
5. a GUI to the SRE for monitoring the coupled model run (SMS, ECMWF)
6. standard diagnostic and visualisation tools
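The coupler's central job is to exchange fields between components that live on different grids, interpolating (regridding) on the way. A purely illustrative Python sketch of that idea — this is not the OASIS API, and the function and variable names are invented for the example; real couplers offer conservative, bilinear and other schemes rather than the nearest-neighbour remap used here for brevity:

```python
# Toy illustration of the regridding step a coupler performs: take a
# field defined on a source grid and remap it onto a target grid before
# handing it to the receiving component.

def regrid_nearest(src_points, src_values, dst_points):
    """Remap src_values (defined at src_points) onto dst_points,
    using nearest-neighbour lookup on a 1-D grid."""
    remapped = []
    for x in dst_points:
        # index of the source point closest to this destination point
        nearest = min(range(len(src_points)),
                      key=lambda i: abs(src_points[i] - x))
        remapped.append(src_values[nearest])
    return remapped

# Ocean SSTs on a coarse 1-D grid, passed to an atmosphere on a finer grid
ocean_grid = [0.0, 2.0, 4.0, 6.0]
sst = [300.0, 301.0, 302.0, 303.0]       # stand-in ocean field (K)
atmos_grid = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0]

sst_on_atmos = regrid_nearest(ocean_grid, sst, atmos_grid)
```

The point is only the division of labour: the components produce and consume fields on their own grids, and the shared coupler owns the remapping in between.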
The PRISM shells
[Diagram] Outer shells: Standard Running Environment and Standard Compile Environment; inner shell: PSMILe (coupling and I/O) alongside historic I/O; at the centre, the scientific core.
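The inner shell fixes the lifecycle a component follows to talk to the rest of the system: initialise, declare the fields it exchanges, put/get at coupling timesteps, terminate. A hedged Python mock of that pattern — the class and method names here are invented for illustration; the real PSMILe is a Fortran library with its own calling conventions:

```python
# Illustrative mock of a PSMILe-style coupling layer. The point is the
# fixed call sequence the inner shell imposes on every component:
# init -> declare fields -> exchange inside the time loop -> terminate.

class CouplingLayer:
    def __init__(self, component_name):
        self.component = component_name
        self.outbox = {}   # fields this component exports
        self.inbox = {}    # fields it imports

    def declare_export(self, field):
        self.outbox[field] = None

    def declare_import(self, field):
        self.inbox[field] = None

    def put(self, field, data, step):
        # a real coupler would hand the data to the partner component here
        self.outbox[field] = (step, data)

    def get(self, field, step):
        # a real coupler would block until the partner's data arrives
        return self.inbox.get(field)

    def terminate(self):
        self.outbox.clear()
        self.inbox.clear()

# An ocean component's time loop, coupling at every step
ocean = CouplingLayer("ocean")
ocean.declare_export("sst")
ocean.declare_import("wind_stress")

for step in range(3):
    sst = [300.0 + step] * 4           # stand-in for the scientific core
    ocean.put("sst", sst, step)        # boundary exchange via the shell
    forcing = ocean.get("wind_stress", step)

ocean.terminate()
```

Because every component goes through the same narrow interface, the scientific core stays untouched when the coupler or platform changes — which is exactly the shell separation the diagram expresses.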
Adapting Earth System Components to PRISM
[Diagram] Levels of adaptation: user interface + SCE + SRE, plus PSMILe + PMIOD.
PSMILe = PRISM Model Interface Library; PMIOD = Potential Model I/O Description.
Configuration management and deployment
[Diagram] The user interface ties together the SCE (producing the binary executables) and the SRE (running the driver and transformer, "Transf.", and writing to disks).
PRISM GUI remote functionality
[Diagram] Over the Internet, the user reaches the PRISM repositories (CSCS, MPI) through PrepIFS/SMS web services; configurations are deployed to instrumented sites, each running a driver and transformer on its own architecture (A, B, C).
Standard scripting environments
• Standard Compiling Environment (SCE)
• Standard Runtime Environment (SRE)
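A standard compiling environment hides platform differences behind one scripted entry point, so every model builds the same way on every machine. A minimal Python sketch of the idea — the platform names and compiler flags below are made up for illustration; the actual SCE is a set of shell scripts:

```python
# Toy sketch of a standard compiling environment: a single entry point
# that selects platform-specific compiler settings. The platform table
# is invented for illustration only.

PLATFORMS = {
    "nec_sx":   {"fc": "sxf90", "flags": ["-C", "hopt"]},
    "linux_pc": {"fc": "f90",   "flags": ["-O2"]},
}

def compile_command(platform, sources, executable):
    """Assemble the compile command line for a given platform."""
    cfg = PLATFORMS[platform]
    return [cfg["fc"], *cfg["flags"], "-o", executable, *sources]

cmd = compile_command("linux_pc", ["ocean.f90", "coupler.f90"], "model.x")
# e.g. ['f90', '-O2', '-o', 'model.x', 'ocean.f90', 'coupler.f90']
```

The model source never mentions a platform; adding a new machine means adding one table entry, not editing every model's build script.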
Demonstration experiments
[Diagram] Assembled coupled models run across the set of target platforms.
CGAM contribution (Jeff Cole)
Development coordinators
• The coupler and I/O - Sophie Valcke (CERFACS)
• The standard environments - Stefanie Legutke (MPI)
• The user interface and web services - Claes Larsson (ECMWF)
• Analysis and visualisation - Mick Carter (Hadley Centre)
• The assembled models - Stefanie Legutke (MPI)
• The demonstration experiments - Andrea Carril (INGV)
Community buy-in
• Growing!
– Workshops and seminars
– 15 pioneer models adapted (institutes' involvement)
– 9 test super-computers instrumented
– Models distributed under the PRISM env. (ECHAM5, OPA 9.0)
– Community programmes relying on the PRISM framework (ENSEMBLES, COSMOS, MERSEA, GMES, NERC, …)
• To go further:
– PRISM perspective: maintain and develop the tool box
– Institute perspective: get the timing and involvement in the next steps right
Collaborations
Active collaborations:
• ESMF (supporting software, PMIOD, MOM4)
• FLUME (PRISM software)
• PCMDI (visualisation, PMIOD)
• CF group (CF names)
• NERC (BADC & CGAM) (meta-data, PMIOD)
• M&D, MPI (data)
• Earth Simulator (install PRISM system V.0)
PRISM has put Europe in the loop for community-wide convergence on basic standards in ES modelling.
The future
• PRISM has delivered a tool box, a network of expertise and demonstrations
• Community buy-in growing
• Key need for sustainability of:
– tool box maintenance/development (new features)
– network of expertise
PRISM sustained initiative
Set-up meeting held in Paris, Oct 27 2004