
Page 1:

Intensive computation at LUTH

TOWARDS GRAND CHALLENGE SIMULATIONS

Yann Rasera

17/03/09 AERES LUTH

Page 2:

Intensive computation

I. Resources of LUTH

II. Distributed computing: EGEE 3

III. Towards Massively Parallel Processing

IV. Grand Challenge simulations


Page 3:

INTENSIVE COMPUTATION AT LUTH

• Physics and astrophysics theory: gravity, plasma physics, galaxy formation, interstellar-medium chemistry, solar-wind MHD
• Numerical algorithm development: spectral methods, Poisson solver, radiative transfer, chemistry solver
• Simulations and analysis on supercomputers and grids: massively parallel runs, hybrid simulations, distributed computing
• Framework for the interpretation of observational data: HESS, ALMA, Herschel, Planck, CoRoT, Gaia


Page 4:

I. RESOURCES OF LUTH

• Local computing resources: 222 cores and 33 TB of storage
• Heavy use of the SIO mesocenter
• Use of and active participation in the development of the EGEE grid
• Many parallel codes written or developed locally
• Allocations at the three main supercomputing centers in France (ranked 14th, 16th, and 48th in the Top500)
• A scientific-computing engineer, expert in code parallelization


Page 5:

II. DISTRIBUTED COMPUTING: EGEE 3

• PHYSICS AND CHEMISTRY OF THE INTERSTELLAR MEDIUM
• Meudon PDR code (F. Le Petit, J. Le Bourlot, E. Roueff)
• UV radiative transfer, chemistry, thermal processes
• Detailed observations provide strong constraints
• Exploration of the parameter space (density, CR flux, dust…)
• Hundreds of models in 3 days (instead of several months); see the sketch below
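As an illustration of how such a parameter-space exploration maps onto the grid, here is a minimal, hedged sketch (not the Meudon PDR code's actual interface): it enumerates assumed (density, CR flux, dust) grids and writes one self-contained job directory per model, each of which can then be submitted as an independent grid job.

import itertools, pathlib

# Assumed parameter grids -- illustrative values, not the PDR code's real inputs.
densities   = [1e2, 1e3, 1e4, 1e5]     # gas density, cm^-3
cr_fluxes   = [1e-17, 1e-16, 1e-15]    # cosmic-ray ionization rate, s^-1
dust_scales = [0.5, 1.0, 2.0]          # dust abundance relative to a reference

for i, (n_h, zeta, dust) in enumerate(
        itertools.product(densities, cr_fluxes, dust_scales)):
    job = pathlib.Path(f"jobs/model_{i:04d}")
    job.mkdir(parents=True, exist_ok=True)
    # One parameter file per model; each directory becomes one independent
    # grid job, so many models run concurrently instead of sequentially.
    (job / "params.txt").write_text(f"density={n_h}\ncr_flux={zeta}\ndust={dust}\n")

With independent jobs, the wall-clock time is set by the slowest single model rather than by the sum over all models, which is why hundreds of models fit in a few days.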

• VERY HIGH ENERGY γ-RAY EMISSION FROM AGN
• SSC code (K. Katarzynski, J-P. Lenain, H. Sol)
• Synchrotron Self-Compton emission
• HESS observations
• 25 parameters
• 30,000 jobs (60,000 single-CPU hours) in only three months

ACTIVE PARTICIPATION IN AND USE OF THE A&A CLUSTER


Page 6:

III. TOWARDS MASSIVELY PARALLEL PROCESSING

• SUPERNOVA REMNANTS AND JETS FROM YOUNG STARS
• HYDRO-MUSCL (C. Nguyen, C. Cavet, C. Michaut)
• Hydrodynamics and cooling
• Finite-volume method with a Riemann solver (see the sketch below)
• Parallelization with MPI
• Radiative transfer (under development)
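For readers unfamiliar with the method, the sketch below shows a basic finite-volume update on a 1D Euler problem, using a Rusanov (local Lax-Friedrichs) approximate Riemann flux. It is a generic illustration under those simplifying choices, not the HYDRO-MUSCL scheme itself; the MUSCL reconstruction, cooling, and MPI decomposition of the real code are omitted.

import numpy as np

GAMMA = 1.4  # assumed adiabatic index

def primitives(U):
    rho, mom, E = U
    v = mom / rho
    p = (GAMMA - 1.0) * (E - 0.5 * rho * v**2)
    return rho, v, p

def flux(U):
    rho, v, p = primitives(U)
    return np.array([rho * v, rho * v**2 + p, (U[2] + p) * v])

def rusanov_flux(UL, UR):
    # The largest signal speed of the two states sets the numerical dissipation.
    rhoL, vL, pL = primitives(UL); rhoR, vR, pR = primitives(UR)
    cL = np.sqrt(GAMMA * pL / rhoL); cR = np.sqrt(GAMMA * pR / rhoR)
    smax = np.maximum(np.abs(vL) + cL, np.abs(vR) + cR)
    return 0.5 * (flux(UL) + flux(UR)) - 0.5 * smax * (UR - UL)

def step(U, dx, cfl=0.4):
    # U has shape (3, N): conserved density, momentum, total energy per cell.
    rho, v, p = primitives(U)
    dt = cfl * dx / np.max(np.abs(v) + np.sqrt(GAMMA * p / rho))
    F = rusanov_flux(U[:, :-1], U[:, 1:])            # interface fluxes
    U[:, 1:-1] -= dt / dx * (F[:, 1:] - F[:, :-1])   # update interior cells only
    return dt

# Usage example: a Sod shock tube on 400 cells.
N, dx = 400, 1.0 / 400
rho = np.where(np.arange(N) < N // 2, 1.0, 0.125)
p   = np.where(np.arange(N) < N // 2, 1.0, 0.1)
U = np.array([rho, np.zeros(N), p / (GAMMA - 1.0)])
t = 0.0
while t < 0.2:
    t += step(U, dx)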

• BINARY BLACK HOLE ORBITS
• KADATH (P. Grandclément, E. Gourgoulhon, J. Novak)
• General relativity: spectral solver
• Decomposition on Chebyshev polynomials
• Parallel computation of the Jacobian, column by column (see the sketch below)
• Inversion of the Jacobian matrix (200,000 × 200,000)
• Uses the MUMPS and SuperLU parallel libraries
• Parallel version under development: scaling is promising
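A hedged sketch of the column-by-column idea on a toy nonlinear system: each MPI rank fills a subset of Jacobian columns by finite differences, and the assembled matrix is factorized with SciPy's SuperLU-based sparse LU as a stand-in for the MUMPS/SuperLU solvers. The residual function and problem size are illustrative, not KADATH's.

import numpy as np
from mpi4py import MPI
from scipy.sparse import csc_matrix
from scipy.sparse.linalg import splu

def residual(u):
    # Toy nonlinear system: discrete -u'' + u^3 = 1 with u = 0 at both ends.
    r = np.empty_like(u)
    r[0], r[-1] = u[0], u[-1]
    r[1:-1] = -(u[2:] - 2 * u[1:-1] + u[:-2]) + u[1:-1]**3 - 1.0
    return r

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n, eps = 64, 1e-7
u = np.zeros(n)
F0 = residual(u)

# Each rank computes its own subset of Jacobian columns by finite differences.
cols = np.zeros((n, n))
for j in range(rank, n, size):
    du = np.zeros(n); du[j] = eps
    cols[:, j] = (residual(u + du) - F0) / eps

# Gather the full Jacobian on every rank (a sketch; real codes keep it distributed).
J = np.empty((n, n))
comm.Allreduce(cols, J, op=MPI.SUM)

# One Newton step: factorize with the sparse LU (SuperLU under the hood) and solve.
lu = splu(csc_matrix(J))
u -= lu.solve(F0)
if rank == 0:
    print("residual norm after one Newton step:", np.linalg.norm(residual(u)))

Run for instance with "mpirun -n 4 python newton_sketch.py" (file name hypothetical); the column loop is embarrassingly parallel, which is what makes the per-column strategy attractive.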


Page 7:

• GALAXY FORMATION
• COSMO3D (J-M Alimi, S. Courty, F. Roy, R. Teyssier, J-P Chièze, E. Audit)
• Poisson, hydrodynamics, and chemistry solvers
• Domain decomposition (see the sketch below)
• Runs on hundreds of processors
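A minimal sketch of the domain-decomposition pattern, assuming a 1D slab split with one ghost cell on each side (mpi4py; field contents and sizes are illustrative, not COSMO3D's actual decomposition):

import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

nx_local = 16                                            # cells owned by this rank
field = np.full(nx_local + 2, rank, dtype=np.float64)    # +2 ghost cells

left  = (rank - 1) % size                                # periodic neighbours
right = (rank + 1) % size

# Send the rightmost owned cell to the right neighbour, receive into the left ghost.
comm.Sendrecv(sendbuf=field[-2:-1], dest=right, recvbuf=field[0:1], source=left)
# Send the leftmost owned cell to the left neighbour, receive into the right ghost.
comm.Sendrecv(sendbuf=field[1:2], dest=left, recvbuf=field[-1:], source=right)

# With the ghosts filled, a stencil update (e.g. a Laplacian term) is purely local.
lap = field[2:] - 2 * field[1:-1] + field[:-2]
print(f"rank {rank}: ghost cells = ({field[0]:.0f}, {field[-1]:.0f})")

Each rank only ever talks to its neighbours, so the communication volume per step stays fixed as the number of processes grows, which is what allows runs on hundreds of processors.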

• MAGNETIC STELLAR ATMOSPHERES
• CARATSTRAT (G. Alecian, M.J. Stift)
• Polarized transfer, atomic diffusion, abundance stratification
• Radiative transfer and diffusion equation solvers
• Hybrid MPI/Ada version under development
• Up to 128 processes at CINES


Page 8:

IV. GRAND CHALLENGE SIMULATIONS

AN EXAMPLE: THE DARK ENERGY UNIVERSE SIMULATION SERIES
J-M Alimi, Y. Rasera, F. Roy, J. Courtin, P-S Corasaniti, A. Füzfa, V. Boucher, F. Fraschetti, R. Teyssier

GOAL: Imprints of DARK ENERGY on COSMIC STRUCTURE FORMATION


Page 9:

N-BODY SIMULATION SERIES

• THREE DARK ENERGY COSMOLOGIES
• ΛCDM (standard model)
• Quintessence with the Ratra-Peebles potential (RPCDM)
• Quintessence with the SUGRA potential (SUCDM)

Calibrated on the latest WMAP CMB data and the UNION SNIa data
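For reference (a reminder added here, not taken from the slides), the two quintessence potentials named above are usually written as

V_RP(φ) = M^(4+α) φ^(-α),        V_SUGRA(φ) = M^(4+α) φ^(-α) exp(4πG φ²),

with M and α free parameters fixed by the CMB and SNIa calibration.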

• THREE BOX LENGTHS
• 3.6 Gpc: good statistics on clusters
• 900 Mpc: good statistics on Milky-Way-size halos; internal structure of clusters
• 225 Mpc: small halos, internal structure, redshift evolution

Probe from cosmological to subgalactic scales

NINE SIMULATIONS WITH 1 BILLION PARTICLES EACH
• Up to 7 billion resolution elements
• Resolve scales from 4 kpc to 4 Gpc
• Resolve halos from 3×10^10 Msun to 8×10^15 Msun
• Up to 500,000 resolved halos per simulation
• Up to 3,000,000 particles per halo

LARGEST DARK ENERGY SIMULATION SERIES TO DATE

Page 10:

SCALABILITY AND RUNS

• Initial conditions: MPGRAFIC (S. Prunet, C. Pichon) + QUINT (Y. Rasera)
• N-body solver: RAMSES (R. Teyssier) + QUINT (Y. Rasera)
• Quick power spectrum for tests: POWERGRID (S. Prunet) + parallel version (Y. Rasera); a quick-look sketch follows below
• Analysis: parallel Friends-of-Friends halo finder (F. Roy), developed for this run!

NEEDS A SUITE OF PARALLEL CODES WITH GOOD SCALABILITY

4096 processes, with only 512 MB of memory per process
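The "quick power spectrum for tests" step listed above can be pictured with this hedged sketch: FFT a gridded density-contrast field and average |δ_k|² in shells of k. The grid size, the binning, and the P(k) = V⟨|δ_k|²⟩ convention are assumptions of the sketch, not POWERGRID's actual conventions.

import numpy as np

def power_spectrum(delta, boxsize, nbins=20):
    n = delta.shape[0]
    delta_k = np.fft.rfftn(delta) / n**3                  # normalized DFT of the contrast field
    kx = 2 * np.pi * np.fft.fftfreq(n, d=boxsize / n)     # wavenumbers along the full axes
    kz = 2 * np.pi * np.fft.rfftfreq(n, d=boxsize / n)    # half axis from the real FFT
    kmag = np.sqrt(kx[:, None, None]**2 + kx[None, :, None]**2 + kz[None, None, :]**2)
    power = boxsize**3 * np.abs(delta_k)**2               # per-mode power estimate
    bins = np.linspace(kmag[kmag > 0].min(), kmag.max(), nbins + 1)
    idx = np.digitize(kmag, bins)
    pk = np.array([power[idx == i].mean() if np.any(idx == i) else np.nan
                   for i in range(1, nbins + 1)])
    return 0.5 * (bins[:-1] + bins[1:]), pk               # bin centres and P(k)

# Usage example on a white-noise test field in an assumed 100 Mpc/h box.
delta = np.random.default_rng(1).standard_normal((64, 64, 64))
k, pk = power_spectrum(delta, boxsize=100.0)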

NEEDS A LOT OF CPU TIME: 5,000,000 single-CPU hours (600 years) on Babel (IDRIS)

• Allocation made possible thanks to the Horizon Project
• First to use Babel with up to 24,576 processors
• Found an I/O-node problem and an MPI bug
• Many crashes of the supercomputer: efficient restarts were essential
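For scale, simple arithmetic (with ideal scaling assumed on the 4096 processes):

5×10^6 single-CPU hours ÷ (24 × 365 h/yr) ≈ 570 yr, the "600 years" quoted above;
5×10^6 h ÷ 4096 processes ≈ 1.2×10^3 wall-clock hours ≈ 51 days of continuous running.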


Page 11:

LARGE-VOLUME DATA AND POST-PROCESSING

• The LUTH computer room moved to a gigabit connection
• Recently purchased: a backup system (10 TB) + the horizon 2 server (7 TB)
• 13 TB stored locally + 5 TB of additional copies

NEEDS A GOOD NETWORK AND BACKUP SYSTEM: 216 snapshots + 6 lightcones + 3 samples = 40 TB

NEEDS TO ANALYSE AND ORGANIZE THE DATA: creation of a parallel halo finder (F. Roy)

• Parallel version using domain decomposition
• Tested up to 2048³ particles and 4096 processes
• Particles are sorted on a per-region or per-halo basis
• Subsequent analysis is therefore communication-free (see the sketch below)
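A hedged, single-node sketch of the Friends-of-Friends idea behind the halo finder: particles closer than b times the mean interparticle separation are linked, then grouped with union-find. It illustrates the algorithm only, not F. Roy's parallel, domain-decomposed implementation.

import numpy as np
from scipy.spatial import cKDTree

def fof(positions, boxsize, b=0.2):
    n = len(positions)
    link = b * boxsize / n ** (1.0 / 3.0)                 # linking length
    pairs = cKDTree(positions, boxsize=boxsize).query_pairs(link)  # periodic neighbours

    parent = np.arange(n)                                 # union-find with path compression
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i, j in pairs:                                    # link each close pair into one group
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[rj] = ri
    return np.array([find(i) for i in range(n)])          # halo label per particle

# Usage: random particles in a periodic box of 100 (illustrative length units).
rng = np.random.default_rng(0)
pos = rng.random((5000, 3)) * 100.0
labels = fof(pos, boxsize=100.0)
print("largest group size:", np.bincount(labels).max())

Once particles are sorted by halo label (or by region), per-halo quantities such as masses or profiles can be computed independently, which is why the subsequent analysis needs no communication.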

NEEDS TO DISSEMINATE THE DATA: a Dark Energy Universe virtual observatory

• Project: a website, the « Dark Energy Virtual Observatory », with the Horizon collaboration

Page 12:

[Figure panels: ΛCDM, SUGRA, and Ratra-Peebles at scales of 225 Mpc, 56 Mpc, and 14 Mpc.]

Page 13:

FIRST RESULTS

• Unprecedented range of masses and scales for dark energy simulations
• Dark energy mass functions and power spectra with unprecedented accuracy
• Differences between cosmologies help break degeneracies between dark energy models
• Differences with analytical predictions help extend analytical models


[Figure panels: ΛCDM, SUGRA, and Ratra-Peebles at z = 0, z = 1, and z = 2.3.]

Page 14:

CONCLUSION

• Intensive computation is a strong component at LUTH

• LUTH is an active participant in the EGEE III grid for astrophysics
• A leading actor in the « A&A cluster »
• Grid usage of up to 30,000 jobs in 3 months

• LUTH is moving towards Massively Parallel Processing
• Several parallel applications running on up to 120 processes
• Development and scalability tests under way to move to higher process counts

• LUTH has already performed one Grand Challenge simulation
• Up to 4096 processes
• 5 million single-CPU hours

• LUTH is preparing for petaflop computing