
OpenFOAM at FM– LTH

Erdzan Hodzic

Division of Fluid Mechanics, Department of Energy Sciences, Lund University

FM Seminars: 16 March 2016

This offering is not approved or endorsed by ESI® Group, ESI-OpenCFD® or the OpenFOAM® Foundation, the producer of the OpenFOAM® software and owner of the OpenFOAM® trademark.

Outline

• History and overview of OF

• OF at Fluid Mechanics LTH

• Why OF

• Cluster usage

• Best practice

• Proposal for upcoming OF seminars

• Conclusions

History and Overview of OF:

• Foam: Imperial College, 1980s: H. Weller & H. Jasak

• OpenFOAM 1.0

• H. Weller = OpenCFD & ESI (e.g. OpenFOAM-3.0.1, OpenFOAM-3.0.x, OpenFOAM-v3.0+)

• H. Jasak = Wikki Ltd & OpenFOAM-extend (e.g. OpenFOAM-3.2-ext)

Why OpenFOAM:

H. Jasak et al., OpenFOAM: A C++ Library for Complex Physics Simulations, International Workshop on Coupled Methods in Numerical Dynamics, IUC, Dubrovnik, Croatia, 19-21 September 2007

• Expressive and versatile syntax

• Extensive capabilities

• Open architecture and open source (open-source development)

• Object orientation:

– Data encapsulation: grouping data and functions together, e.g. the sparse-matrix class

– Operator overloading: high-level functionality, i.e. identical function names with different arguments

– Object families and run-time selection

– Generic programming: independence of the actual data type

Why not OpenFOAM:

• Little official support

• Little coherent training material; many courses/presentations/reports/forum posts are needed for an answer.

• Overwhelming number of controllable parameters (in the beginning)

OF at Fluid Mechanics LTH: Which fork/version/variant

[Slide figure: local directory tree (/home/, /srobi/, /soft/, /opt/) showing where the OpenFOAM and foam installations live]


OF at Fluid Mechanics LTH: Clusters

• High-level representation of the physical level → parallelization, tensor algebra and mesh manipulation are automatically integrated (hidden) and common to all solvers.

• HPC performance is not always addressed; it is cluster- and case-dependent.

• OF parallelization is performed by MPI → the whole communication system sits in one single library (OpenMPI).

• Well suited for a wide range of MPI versions, hardware and operating systems.

• Still, scalability and efficiency are not well understood (ongoing debate).

• Several previous studies (1e6-22e6 cells) → linear scalability up to 1024 cores

O. Riviera and K. Furlinger, Parallel Aspects of OpenFOAM with Large Eddy Simulations, 2011 IEEE International Conference on High Performance Computing and Communications

OF at Fluid Mechanics LTH: Clusters

T. Lucchini, Running OpenFOAM in parallel

• system/decomposeParDict (e.g. the scotch method)
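A minimal `system/decomposeParDict` for the scotch method looks roughly as follows; this is a sketch, and the subdomain count of 8 is only an example that should match the number of MPI ranks used:

```
// system/decomposeParDict -- illustrative example
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains  8;      // one subdomain per MPI rank
method              scotch; // graph-based partitioning, no extra coefficients needed
```

With this in place, run `decomposePar` in the case directory, start the solver with `mpirun -np 8 <solver> -parallel`, and merge the results afterwards with `reconstructPar`.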

OF at Fluid Mechanics LTH: Clusters

• Example: pisoFoam, 2.15e6 cells, METIS decomposition

O. Riviera and K. Furlinger, Parallel Aspects of OpenFOAM with Large Eddy Simulations, 2011 IEEE International Conference on High Performance Computing and Communications

OF at Fluid Mechanics LTH: Clusters

• Results:

– GAMG vs BiCG

– bottlenecks

O. Riviera and K. Furlinger, Parallel Aspects of OpenFOAM with Large Eddy Simulations, 2011 IEEE International Conference on High Performance Computing and Communications

OF at Fluid Mechanics LTH: Clusters

• Other studies, same geometry:

– No difference between Intel and GNU

– Regardless of OF version

M. Culpo, Current Bottlenecks in the scalability of OpenFOAM on Massively Parallel Clusters, www.prace-ri.eu

OF at Fluid Mechanics LTH: Clusters

• Other studies: dieselFoam

Y. Liu, Hybrid Parallel Computation of OpenFOAM Solver on Multi-Core Cluster Systems

Local test: using MPI_* commands

OF at Fluid Mechanics LTH:


OF best practice:

https://github.com/wyldckat/wyldckat.github.io/wiki/OF23_IOPerformance_Analysis_2

http://www.cfd-online.com/Forums/openfoam/136983-binary-gives-significant-performance-advantage-mesh-solve.html

https://github.com/OSCCAR-PFM/OSCCAR-DOC-PUBLIC/blob/master/openFoamUserManual_PFM.pdf
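The I/O advice discussed in the links above comes down to a couple of `controlDict` entries. A sketch of just the relevant lines (not a complete controlDict):

```
// system/controlDict -- I/O-related entries only
writeFormat       binary;  // avoids slow ASCII formatting and round-off on restart
writeCompression  off;     // uncompressed files are cheaper to write and read back
```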

OF best practice: GPU and hyper-threading

S. Keough, Optimizing the Parallelization of OpenFOAM Simulation, Defence Science and Technology Organisation, Australian Department of Defence

"...increased overhead will outweigh any benefit gained from maximizing CPU utilization through the use of virtual cores."


OF at Fluid Mechanics LTH: Clusters

• Other studies at FM: P. Iudiciani, M. Selewski, etc.

– ~100,000 cells per CPU

Cluster usage: Possible improvements


Proposal for FM-User exchange

• Jonny /4FM/

– fieldToVector & vectorToField

– outputMesh & outputVolume

– MyChemkinInputs = all CHEMKIN *.inp/*.therm/*.tran files

– postAverage-org = use the OF libSampling utilities

Proposal for upcoming OF subjects

• Five main objects of OF

• BCs => best practice

• Numerical schemes => best practice

• Spray modelling in OF

• Dynamic meshes in OF

• Combustion models in OF

Useful links:

• The OpenFOAM User Guide: http://www.openfoam.org/docs/user/

• The CFD Online Forum: http://www.cfd-online.com/Forums/openfoam/

• The OpenFOAM Wiki: http://openfoamwiki.net/index.php/Main_Page

• The CoCoons Project: http://www.cocoons-project.org/

• Chalmers University OF course: http://www.tfd.chalmers.se/~hani/kurser/OS_CFD/

• 101 Things to read when starting with OF: http://www.sourceflux.de/blog/101-things-read-starting-openfoam/

Summary:

• Local installation of OF forks

• 50,000-150,000 cells per CPU

• Use METIS or Scotch

• Save data in binary/uncompressed format

• Wait with GPU

• Read "101 Things to read when starting with OF"

Thank you for your attention!