Sophie Valcke, CERFACS, and the PRISM team across Europe
ENES and PRISM: A European approach to Earth System modelling
Computing in Atmospheric Sciences Workshop 2003 Slide 2
Outline
• ENES
• The PRISM project:
  • partners
  • goals
  • model components
  • standard physical interfaces
  • architecture and GUI
• PRISM first coupler: Oasis3 (08/2003):
  • configuration
  • communication
  • interpolations/transformations
• PRISM final coupler (12/2004):
  • configuration
  • communication
  • interpolations/transformations
ENES
Climate research in Europe:
• Societal/political needs in Europe are high (IPCC, mitigation, …)
• Recognised excellence; scientific diversity (models, approaches, …)

How to organise Earth System modelling in Europe?
• The «one-big-centre-does-it-all» approach is not suitable:
  - expertise lies within national centres
  - diversity is key to research
• Need for shared infrastructures:
  - software (PRISM)
  - hardware
ENES
ENES: European Network for Earth System modelling
• «Think tank» to organize, plan and seek funding for efficient distributed Earth System modelling in Europe
• Follows a EuroClivar recommendation
• Open to any institute/industry (MoU)
• Coordinated by Guy Brasseur (MPI, Hamburg)
• 50 members so far (http://enes.org)
A long-term strategy:
1. Jointly develop shared software infrastructure for Earth System modelling
2. Provide European integrated service to access and use this infrastructure
3. Provide and manage hyper-computing access by 2010
The PRISM project
• PRISM: PRogram for Integrated Earth System Modelling
• A European project, started December 2001, funded for 3 years by the European Commission (4.8 M€)
• Coordinators:
• Guy Brasseur (MPI, Hamburg)
• Gerbrand Komen (KNMI, Amsterdam)
• PRISM Director: Reinhard Budich (MPI)
=> 22 partners: leading climate research institutes and computer vendors
• MPG-IMET, Germany
• KNMI, Netherlands
• MPI-MAD, Germany
• Met Office, UK
• UREADMY, UK
• IPSL, France
• Météo-France, France
• CERFACS, France
• DMI, Denmark
• SMHI, Sweden
• NERSC, Norway
• ETH Zurich, Switzerland
• INGV, Italy
• MPI-BGC, Germany
• PIK, Germany
• ECMWF, Europe
• UCL-ASTR, Belgium
• NEC Europe
• FECIT/Fujitsu
• SGI Europe
• SUN Europe
PRISM partners
PRISM goals

Help climate modellers spend more time on science:

Provide software infrastructure to:
• easily assemble Earth System coupled models based on existing state-of-the-art European component models
• launch/monitor complex/ensemble Earth System simulations
PRISM goals

Define and promote technical and scientific standards for ESM:

Scientific standards:
• Physical interfaces between model components
• Global Earth System parameters

Technical standards:
• Compiling, running, post-processing environment
• Architecture and Graphical User Interface
• Coupler and I/O software
• Data and grid formats
• Coding and quality

Interaction with other groups (ESMF, ESG/NOMADS, CF, RPN?, ...)
PRISM model components

• Atmosphere: Météo-France (ARPEGE), MPG-IMET (ECHAM), IPSL (LMDZ), Met Office (Unified Model), UREADMY, INGV
• Atmospheric chemistry: MPG-IMET, UREADMY, IPSL, Met Office, Météo-France, KNMI
• Land surface: IPSL (Orchidée), Met Office, MPG-IMET, UREADMY, Météo-France (ISBA)
• Sea ice: NERSC, UCL-ASTR, Met Office, IPSL, MPG-IMET
• Ocean biogeochemistry: MPI-BGC, IPSL, MPG-IMET, Met Office
• Ocean: UREADMY, Met Office (FOAM), MPI-M (HOPE), IPSL (OPA/ORCA)
• Regional climate: SMHI, DMI, Met Office
• Coupler: CERFACS, NEC, CCRLE, FECIT, SGI, MPI-MAD
A proposal for PRISM standard O-A-SI physical interfaces:

(Diagram: exchanges between the atmosphere model, land surface model, sea ice model, wave model, surface layer turbulence scheme, ocean surface module and ocean model.)

Atmosphere -> surface:
1- Rainfall + internal energy
2- Snowfall + internal energy
3- Incoming solar radiation
4- Solar zenith angle
5- Fraction of diffuse solar radiation
6- Downward infrared radiation
7- Sensitivity of atmospheric temperature and humidity to surface fluxes

Surface -> atmosphere:
1*- Sensible heat flux
2*- Surface emissivity
3*- Albedo, direct
4*- Albedo, diffuse
5*- Surface radiative temperature
6*- Evaporation + internal energy [+ Qlat]
7*- Wind stress
8- Subgrid fractions

Atmosphere -> surface layer turbulence:
1- Surface pressure
2-4- Air temperature, humidity and wind
5- Wind module
6- Height of these 4 variables

Surface layer turbulence -> surface:
1*- Cd
2*- Ce
3*- Ch

Surface -> ocean:
1x- Non-solar heat flux
2x- Solar radiation
3x- Fresh water flux
4x- Salt flux
5x- Wind stress
6x- U^3
7x- Mass of snow and ice
8- Subgrid fractions

Surface -> surface layer turbulence:
1*- Surface temperature
2*- Surface roughness
3*- Displacement height
4x- Surface velocity

Land surface -> ocean:
1- Continental runoff + internal energy

Ocean -> surface:
1-2- Temperature/salinity at the sea-ice base
3- Sea surface temperature
4- Surface radiative temperature
5- Surface ocean current
6- Sea surface salinity
7- Surface height
8- Absorbed solar radiation (in the first oceanic layer)

Iceberg parameters

Note on subgrid fraction dependence:
<>x- Sea ice categories (incl. open ocean)
<>*- Sea ice or land surface categories
PRISM architecture and GUI:

• PRISM central server + PRISM local sites
• GUI: adaptation of ECMWF prepIFS and of the SMS scheduler

(Diagram: the user/developer accesses the PRISM central server and the PRISM local sites through the Web.)
PRISM first coupler: Oasis3

• Based on Oasis, developed at CERFACS since 1991 to couple existing GCMs developed independently at the time:
  • models at relatively low resolution (~10,000-20,000 points)
  • small number of 2D coupling fields (~10)
  • low coupling frequency (~once per day)
  => flexibility was very important, efficiency not so much!
• Performs:
  • synchronisation of the component models
  • coupling field exchange and interpolation
  • I/O actions
• Tested on VPP5000, NEC SX5, SGI Octane and O3000, Compaq Alpha cluster, Linux PC cluster (MPI-Globus)
PRISM project first coupler: Oasis3

Oasis regular users:
• CERFACS
• Météo-France (France)
• IPSL - LODYC, LMD, LSCE (France)
• ECMWF (UK)
• UCL (Belgium)
• MPI - M&D (Germany)
• SMHI (Sweden)
• BMRC (Australia)
• IRI (USA)

… and:
• AWI (Germany)
• PIK (Germany)
• Met Office (UK)
• UGAMP (UK)
• KNMI (Netherlands)
• CSIRO (Australia)
• FSU/COAPS (USA)
• LASG (China)
• JAMSTEC (Japan)
• …
PRISM project first coupler: Oasis3
Oasis3 configuration:
• In the text file namcouple, read by Oasis3 at the beginning of the run, e.g.:
  • total run time
  • number and names of component models
  • number and names of coupling fields; for each field:
    • source and target symbolic names
    • coupling and/or I/O status
    • coupling or I/O period
    • transformations/interpolations
    • …
• Component model grids (longitudes, latitudes, masks, mesh surfaces, mesh corner locations) must be available in binary or NetCDF files.
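The kind of per-field information read from the configuration can be pictured with a small sketch. The text format below is a simplified, hypothetical stand-in, not the real namcouple syntax: each line names a source symbolic name, a target symbolic name, a coupling period and a transformation.

```python
# Parse a simplified, namcouple-like configuration (hypothetical syntax,
# not the actual Oasis3 format): one line per coupling field.
CONFIG = """
# source   target    period  transformation
SOSSTSST   SISUTESU  86400   BILINEAR
COTAUXU    SOZOTAUX  21600   CONSERV
"""

def parse_fields(text):
    """Return one dict per coupling field described in the text."""
    fields = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        src, tgt, period, transform = line.split()
        fields.append({"source": src, "target": tgt,
                       "period": int(period), "transform": transform})
    return fields

fields = parse_fields(CONFIG)
print(fields[0])  # → {'source': 'SOSSTSST', 'target': 'SISUTESU', 'period': 86400, 'transform': 'BILINEAR'}
```

The coupler would then use such records to decide, at each model time step, whether a given field is due for exchange and which transformation to apply.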
PRISM project first coupler: Oasis3
Oasis3 communication:
• New PRISM System Model Interface library (PSMILe) based on MPI1 or MPI2 message passing
• Parallel communication between parallel models and the Oasis3 interpolation process
(Diagram: parallel processes of models A and B exchange coupling fields directly, through files, or through the Oasis3 interpolation process.)
• Direct communication between models with the same grid and partitioning
• I/O functionality (automatic switch between coupled and forced mode)
• Modularity: at each model time step, the exchange is performed or not depending on the user's specifications in namcouple.
• Automatic time integration depending on the user's specifications
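The "automatic time integration" idea above can be sketched as follows: the model calls a put-like routine every time step, and the library only performs an exchange when the coupling period is reached, sending the time average. This is an illustrative Python mock with invented names, not the actual PSMILe code.

```python
class CouplingField:
    """Mock of a coupler-side buffer: values put at every model time step
    are accumulated, and an exchange happens only once per coupling
    period, carrying the time average."""
    def __init__(self, dt, coupling_period):
        self.dt = dt                            # model time step (s)
        self.coupling_period = coupling_period  # coupling period (s)
        self.total = 0.0
        self.nsteps = 0
        self.exchanged = []                     # values actually sent

    def put(self, value):
        # Called every model time step, as prism_put is in Oasis3.
        self.total += value
        self.nsteps += 1
        if self.nsteps * self.dt >= self.coupling_period:
            # Coupling period reached: send the average and reset.
            self.exchanged.append(self.total / self.nsteps)
            self.total, self.nsteps = 0.0, 0

field = CouplingField(dt=3600, coupling_period=86400)
for step in range(48):            # two model days of hourly steps
    field.put(float(step % 24))   # hourly values 0..23, repeated
print(field.exchanged)            # → [11.5, 11.5]: one daily average per day
```

The point of this design is that the component model code stays unchanged whatever the coupling period: only the configuration decides when data actually moves.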
PRISM project first coupler: Oasis3

Oasis3 interpolations/transformations:
=> performed by a separate sequential process
=> on 2D scalar fields only

• Interfacing with the RPN Fast Scalar INTerpolator package:
  • nearest-neighbour, bilinear, bicubic for regular lat-lon grids
• Interfacing with the SCRIP 1.4 library (Los Alamos Software Release LACC 98-45):
  • nearest-neighbour, 1st- and 2nd-order conservative remapping for all grids
  • bilinear and bicubic interpolation for «logically-rectangular» grids
• Bilinear and bicubic interpolation for reduced atmospheric grids
• Other spatial transformations: flux correction, merging, etc.
• General algebraic operations
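As a rough illustration of what the bilinear option computes, here is a minimal sketch of bilinear interpolation within one cell of a regular lat-lon grid. It is a toy re-implementation for illustration only, unrelated to the actual RPN or SCRIP code.

```python
def bilinear(x, y, x0, x1, y0, y1, f00, f10, f01, f11):
    """Bilinear interpolation of a field known at the four cell corners
    (x0,y0), (x1,y0), (x0,y1), (x1,y1) to the point (x, y) inside it."""
    tx = (x - x0) / (x1 - x0)   # normalised position along x
    ty = (y - y0) / (y1 - y0)   # normalised position along y
    return (f00 * (1 - tx) * (1 - ty) + f10 * tx * (1 - ty)
            + f01 * (1 - tx) * ty + f11 * tx * ty)

# Interpolate a field defined at the corners of a unit cell to its centre.
val = bilinear(0.5, 0.5, 0.0, 1.0, 0.0, 1.0,
               f00=10.0, f10=20.0, f01=30.0, f11=40.0)
print(val)  # → 25.0, the average of the four corner values
```

The conservative remappings in SCRIP are more involved (they weight by overlapping cell areas so that area integrals are preserved), but the per-target-point structure, a weighted sum over a few source points, is the same.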
PRISM project final coupler
• Higher resolution, parallel and scalable models
• Higher coupling frequencies desirable
• Higher number of models and (3D) coupling fields

=> Need to optimise and parallelise the coupler

The final PRISM coupler will be composed of:
• a Driver
• a Transformer
• a new PRISM System Model Interface Library
PRISM project final coupler
Final coupler configuration (XML files):
• The user chooses the models through the GUI.
• Each component model comes with:
  • an Application Description (AD)
  • a Potential Model Input and Output Description (PMIOD)
• The user configures a particular coupled run through the GUI:
  • total run time, etc.
  • for each field described in the PMIOD:
    • source or target
    • coupling or I/O status
    • coupling or I/O period
    • transformations/interpolations, etc.
• Based on the user's choices, the GUI produces the XML configuration files.
• At run time the Driver reads and distributes the configuration information.
• The PSMILes and the Transformer act according to the user's specifications.
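The GUI-to-XML step can be pictured with a short sketch that serialises a run configuration as XML. The element and attribute names below are invented for illustration; they are not the real PRISM schemas.

```python
import xml.etree.ElementTree as ET

def build_run_config(run_time, fields):
    """Build a hypothetical XML run configuration: the total run time
    plus one <field> element per coupling field chosen by the user."""
    root = ET.Element("coupled_run", runtime=str(run_time))
    for f in fields:
        ET.SubElement(root, "field",
                      source=f["source"], target=f["target"],
                      period=str(f["period"]), transform=f["transform"])
    return ET.tostring(root, encoding="unicode")

xml = build_run_config(
    86400 * 30,  # a 30-day run
    [{"source": "SST_ocean", "target": "SST_atmos",
      "period": 86400, "transform": "bilinear"}],
)
print(xml)
```

At run time, a Driver-like component would parse such a file and hand each model and the Transformer the subset of the configuration that concerns it.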
PRISM project final coupler
Final coupler communication:
• More elaborate PSMILe based on MPI1 or MPI2 (grid definition transferred through the PSMILe API)
• Modularity as for Oasis3: at each model time step, the exchange is performed or not depending on the user's specifications.
• As for Oasis3, automatic time integration
• As for Oasis3, I/O functionality (automatic switch between coupled and forced mode)
• Parallel communication: as for Oasis3, plus repartitioning
• Parallel calculation of interpolation weights and addresses in the source PSMILe
• Extraction of the useful part of the source field only
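The "extraction of the useful part of the source field" can be sketched like this: once the interpolation weights and source addresses are known, only the source points actually referenced by some target point need to be communicated. Illustrative Python with made-up data, not the PSMILe implementation.

```python
def useful_source_points(weights):
    """Given sparse interpolation weights as (target_idx, source_idx, w)
    triples, return the sorted set of source indices actually needed."""
    return sorted({src for _tgt, src, _w in weights})

# Made-up weights: three target points, each depending on a few source points.
weights = [(0, 10, 0.7), (0, 11, 0.3),
           (1, 11, 0.5), (1, 12, 0.5),
           (2, 40, 1.0)]

needed = useful_source_points(weights)
print(needed)  # → [10, 11, 12, 40]: only these source points are sent
```

For a high-resolution global source grid feeding a regional target grid, this subset can be far smaller than the full field, which is where the communication saving comes from.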
The PRISM project final coupler

• Final coupler interpolations/transformations => as for Oasis3, plus:
  • support of vector fields
  • support of 3D fields
  • more flexibility for field combination/merging, etc.
Conclusions
• ENES and PRISM
• PRISM first coupler: Oasis3, now available
• PRISM final coupler prototype due 11/2003
• PRISM final coupler due 12/2004

… and after PRISM?
• Follow-on project to be re-submitted at the next EU call in 2004 (CAPRI rejected)
• International interaction and collaboration essential in all cases!

http://www.enes.org ; http://www.cerfacs.fr/PRISM/prism.html
valcke@cerfacs.fr
PRISM project first coupler: Oasis3

Oasis3 communication; PSMILe API:
• Initialization:
    call prism_init_comp(…)
• Retrieval of the component model local communicator:
    call prism_get_localcomm(…)
• Coupling or I/O field declarations (name, type, shape, local partition, …):
    call prism_def_var(field_idx, …)
• End of definition phase:
    call prism_enddef(…)
• In the model time-stepping loop, coupling or I/O field exchange:
    call prism_put(field_id1, time, field_array1, ierror)
    call prism_get(field_id2, time, field_array2, ierror)
  => Automatic averaging/accumulation, coupling exchange, and/or I/O depending on the time argument and the user's specifications in namcouple
• Termination:
    call prism_terminate(…)
PRISM project final coupler
Final coupler communication; PSMILe API:

As for the Oasis3 PSMILe, plus:
• Definition of the grid (1D, 2D, 3D):
    call prism_def_grid(…)
    call prism_set_corners(…)
    call prism_set_mask(…)
• Definition of the grid for vector and bundle fields:
    call prism_set_vector(…)
    call prism_set_subgrid(…)
• Coupling or I/O field declarations support vector, bundle, 1D, 2D and 3D fields
• Extraction of SCC and SMIOC information:
    call prism_get_persist(…)
(Diagrams: configuration phases of the final coupler, first for an ATM/OCE/LAND example, then for generic models Mi, Mj, Mk coupled through the Transformer T under the Driver.)

• Definition phase: each component model comes with its AD and its PMIOD, listing its potential input and output fields with their metadata (e.g. ATM PMIOD: V1 in, V2 out, V3 out; OCE PMIOD: V1 out, V2 in; LAND PMIOD: V3 in, V4 in).
• Composition phase: starting from the PMIODs, the user produces the SCC, describing the exchanges between models (e.g. V1: OCE -> ATM with transformation T1; V2: ATM -> OCE with transformation T2), and, for each model, an SMIOC describing where each of its fields comes from or goes to, i.e. another model or a file (e.g. LAND SMIOC: V3 from ATM; V4 from fileV4).
• Deployment phase: the Driver reads the SCC and the SMIOCs and distributes the configuration; the models, the Transformer T and the files are connected according to these specifications.

AD: Application Description
PMIOD: Potential Model Input and Output Description
SMIOC: Specific Model Input and Output Configuration
SCC: Specific Coupling Configuration
Mi: Model i; T: Transformer
Software structure of an Earth System Model

(Diagram: the software layers of an Earth System Model: scientific code, coupling infrastructure, supporting software and running environment, with the layers other than the scientific code marked as candidates to share.)
Ongoing PRISM / ESMF collaboration

(Diagram: the same layered view of an Earth System Model: user code, coupling infrastructure, supporting software and running environment, with the parts addressed by PRISM and by ESMF indicated.)