
  • 8/2/2019 Element Analysis for Ocean Tidal

    1/23

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN FLUIDS, VOL. 24, 1371-1389 (1997)

PARALLEL FINITE ELEMENT METHODS FOR LARGE-SCALE COMPUTATION OF STORM SURGES AND TIDAL FLOWS

KAZUO KASHIYAMA,1 KATSUYA SAITOH,1 MAREK BEHR2 AND TAYFUN E. TEZDUYAR2

1 Department of Civil Engineering, Chuo University, 1-13-27 Kasuga, Bunkyo-ku, Tokyo 112, Japan
2 AEM/AHPCRC, University of Minnesota, 1100 Washington Avenue South, Minneapolis, MN 55415, U.S.A.

SUMMARY

Massively parallel finite element methods for large-scale computation of storm surges and tidal flows are discussed here. The finite element computations, carried out using unstructured grids, are based on a three-step explicit formulation and on an implicit space-time formulation. Parallel implementations of these unstructured-grid-based formulations are carried out on the Fujitsu Highly Parallel Computer AP1000 and on the Thinking Machines CM-5. Simulations of the storm surge accompanying the Ise-Bay typhoon in 1959 and of the tidal flow in Tokyo Bay serve as numerical examples. The impact of parallelization on this type of simulation is also investigated. The present methods are shown to be useful and powerful tools for the analysis of storm surges and tidal flows. © 1997 by John Wiley & Sons, Ltd.

Int. J. Numer. Meth. Fluids, 24: 1371-1389 (1997). No. of Figures: 22. No. of Tables: 0. No. of References: 26.

KEY WORDS: parallel finite element method; three-step explicit formulation; implicit space-time formulation; storm surge; tidal flow

1. INTRODUCTION

Storm surge is a phenomenon in which the sea level in a near-shore area rises significantly because of the passage of a typhoon or low atmospheric pressure. This can cause enormous damage in major bays and harbours. Tidal flows in ocean bays are less violent, yet their understanding is also important to the design of shoreline and offshore structures. For the study of storm surge, computations were carried out in the past by some of the present authors and also other researchers.1,2 Tidal flow simulations were previously reported in References 3 and 4. The finite element method is a powerful tool in such simulations, since it is applicable to complicated water and land configurations and is able to represent such configurations accurately. In practical computations, especially in the case of storm surge analysis, the computational domain is large and the computations need to be carried out over long time periods. Therefore this type of problem becomes quite large-scale and it is essential to use methods which are as efficient and fast as the available hardware allows.

In recent years, massively parallel finite element computations have been successfully applied to several large-scale flow problems.3,5 These computations demonstrated the availability of a new level of finite element capability to solve practical flow problems. With the need for a high-performance computing environment to carry out simulations for practical problems in storm surge analysis, in this

*Correspondence to: K. Kashiyama, Department of Civil Engineering, Chuo University, 1-13-27 Kasuga, Bunkyo-ku, Tokyo 112, Japan

CCC 0271-2091/97/121371-19 © 1997 by John Wiley & Sons, Ltd.

Received 15 May 1996
Revised 12 July 1996


paper we present and employ a parallel explicit finite element method for computations based on unstructured grids. The finite element computations are based on a three-step explicit formulation16 of the governing equations. In these computations we use the selective lumping technique for numerical stabilization. Parallel implementation of this unstructured-grid-based formulation is carried out on the Fujitsu Highly Parallel Computer AP1000. As a test problem, we carry out simulation of the storm surge accompanying the Ise-Bay typhoon in 1959. The computed results are compared with the observed results. The effect of parallelization on the efficiency of the computations is also examined.

The computation of the second class of problems, involving tidal flows, is accomplished here with a stabilized implicit finite element method based on the conservation variables. This stabilization method is based on the streamline-upwind/Petrov-Galerkin (SUPG) formulation for compressible flows, which was originally introduced in Reference 6 for incompressible flows and in Reference 7 for the Euler equations of compressible flows. This methodology was later supplemented with a discontinuity-capturing term in References 8 and 9 and then extended in References 10 and 11 to the Navier-Stokes equations of compressible flows. The time-dependent governing equations are discretized using a space-time formulation developed for fixed domains in References 12 and 13 and for deforming domains in Reference 14. The present data-parallel implementation makes no assumptions about the structure of the computational grid and is written for the Thinking Machines CM-5 supercomputer. As a test problem, simulation of the tidal flow in Tokyo Bay is carried out.

2. GOVERNING EQUATIONS

The storm surge phenomena can be modelled using the shallow water equations, which are obtained from the conservation of momentum and mass, vertically integrated, assuming a hydrostatic pressure distribution:

    \dot{u}_i + u_j u_{i,j} + g(\zeta - \zeta_0)_{,i} - \frac{(\tau_s)_i}{\rho(h+\zeta)} + \frac{(\tau_b)_i}{\rho(h+\zeta)} - \nu (u_{i,j} + u_{j,i})_{,j} = 0,   (1)

    \dot{\zeta} + [(h+\zeta) u_i]_{,i} = 0,   (2)

where u_i is the mean horizontal velocity, ζ is the water elevation, h is the water depth, g is the gravitational acceleration, ζ_0 is the increase in water elevation corresponding to the atmospheric pressure drop, (τ_s)_i is the surface shear stress, (τ_b)_i is the bottom shear stress and ν is the eddy viscosity. The increase ζ_0 can be given by Fujita's formula15 as

    \zeta_0 = \frac{\Delta p}{\rho g \sqrt{1 + (r/r_0)^2}},   (3)

where Δp is the pressure drop at the centre of the typhoon, ρ is the density of fluid, r is the distance from the centre of the typhoon and r_0 is the radius of the typhoon. The surface shear stress can be given as

    (\tau_s)_i = \rho_a \gamma w_i \sqrt{w_k w_k},   (4)

where ρ_a is the density of air, γ is the drag coefficient and w_i is the wind velocity 10 m above the water surface. The wind velocity can be evaluated using the expressions

    w_1 = -\frac{C_1 V_g}{r} \{ \sin\theta [x_1 - (x_1)_c] + \cos\theta [x_2 - (x_2)_c] \} + C_2 V_1 e^{-(r/R)\pi},   (5)

    w_2 = \frac{C_1 V_g}{r} \{ -\cos\theta [x_1 - (x_1)_c] + \sin\theta [x_2 - (x_2)_c] \} + C_2 V_2 e^{-(r/R)\pi},   (6)


where V_g is the gradient wind velocity, V_1 and V_2 denote the velocity of the typhoon, (x_1)_c and (x_2)_c denote the position of the typhoon, θ is the gradient wind angle and R, C_1 and C_2 are constants. The gradient wind velocity is defined as

    V_g = -\frac{fr}{2} + \sqrt{ \Big(\frac{fr}{2}\Big)^2 + \frac{r}{\rho_a} \frac{\partial p}{\partial r} },   (7)

where f is the Coriolis coefficient. The bottom shear stress can be given as

    (\tau_b)_i = \frac{\rho g n^2 u_i \sqrt{u_k u_k}}{(h+\zeta)^{1/3}},   (8)

where n is the Manning coefficient.
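As a pointwise illustration of the forcing terms (3), (4), (7) and (8), the sketch below evaluates them for given inputs. This is not code from the paper; the drag coefficient default and the gradient-wind balance used for (7) are assumptions stated in the comments.

```python
import math

G = 9.81            # gravitational acceleration [m/s^2]
RHO_WATER = 1000.0  # water density [kg/m^3]
RHO_AIR = 1.2       # air density [kg/m^3]

def fujita_elevation(dp, r, r0):
    """Pressure-induced elevation increase zeta_0, Fujita's formula (3).
    dp: central pressure drop [Pa], r: distance from centre [m], r0: typhoon radius [m]."""
    return dp / (RHO_WATER * G * math.sqrt(1.0 + (r / r0) ** 2))

def surface_shear_stress(w1, w2, gamma=0.0013):
    """Surface shear stress (4): (tau_s)_i = rho_a * gamma * w_i * |w|.
    The default gamma is a typical literature value, not taken from the paper."""
    speed = math.hypot(w1, w2)
    return RHO_AIR * gamma * w1 * speed, RHO_AIR * gamma * w2 * speed

def gradient_wind(f, r, dpdr):
    """Gradient wind speed, assuming the standard balance V^2/r + f V = (1/rho_a) dp/dr,
    solved for V as in (7)."""
    a = 0.5 * f * r
    return -a + math.sqrt(a * a + (r / RHO_AIR) * dpdr)

def bottom_shear_stress(u1, u2, total_depth, n):
    """Manning-type bottom stress (8): rho g n^2 u_i |u| / (h+zeta)^(1/3)."""
    c = RHO_WATER * G * n * n * math.hypot(u1, u2) / total_depth ** (1.0 / 3.0)
    return c * u1, c * u2
```

At the typhoon centre (r = 0) formula (3) reduces to Δp/(ρg), i.e. roughly 1 cm of set-up per hPa of pressure drop.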

3. VARIATIONAL FORMULATIONS

We present two finite element formulations of the shallow water equations which have been implemented on parallel architectures. The first method is a three-step explicit method for fixed domains. The second method is an implicit stabilized space-time formulation. Although the examples presented in this paper involve fixed domains only, the latter (space-time) formulation is seen as a step towards solving an important class of problems which involve deforming domains. With the two formulations included in this section addressing different classes of problems, a cost/accuracy comparison is not performed; however, it is expected that the explicit method for a given time step size will be more economical than the space-time formulation if the domain is fixed. The implicit space-time formulation, on the other hand, does not involve as much time step size restriction (due to numerical stability) as the explicit method.

3.1. Three-Step Explicit Finite Element Method

For the finite element spatial discretization of the governing equations the standard Galerkin method is used. The weak form of the governing equations can then be written as

    \int_\Omega u_i^* \Big[ \dot{u}_i + u_j u_{i,j} + g(\zeta - \zeta_0)_{,i} - \frac{(\tau_s)_i}{\rho(h+\zeta)} + \frac{(\tau_b)_i}{\rho(h+\zeta)} \Big] d\Omega + \int_\Omega u_{i,j}^* \, \nu (u_{i,j} + u_{j,i}) \, d\Omega - \int_\Gamma u_i^* t_i \, d\Gamma = 0,   (9)

    \int_\Omega \zeta^* \{ \dot{\zeta} + [(h+\zeta) u_i]_{,i} \} \, d\Omega = 0,   (10)

where u_i^* and ζ^* denote the weighting functions and t_i represents boundary terms.

Using the three-node linear triangular elements for the spatial discretization, the following finite element equations can be obtained:

    M_{\alpha\beta} \dot{u}_{\beta i} + K_{\alpha\beta\gamma j} u_{\beta j} u_{\gamma i} + H_{\alpha\beta i} (\zeta_\beta - \zeta_{0\beta}) - T_{\alpha\beta} \Big( \frac{(\tau_s)_i}{\rho(h+\zeta)} \Big)_\beta + T_{\alpha\beta} \Big( \frac{(\tau_b)_i}{\rho(h+\zeta)} \Big)_\beta + S_{\alpha\beta ij} u_{\beta j} = 0,   (11)

    M_{\alpha\beta} \dot{\zeta}_\beta + B_{\alpha\beta\gamma i} u_{\beta i} (h_\gamma + \zeta_\gamma) + C_{\alpha\beta\gamma i} u_{\beta i} (h_\gamma + \zeta_\gamma) = 0.   (12)


The coefficient matrices can be expressed as

    M_{\alpha\beta} = \int_\Omega \Phi_\alpha \Phi_\beta \, d\Omega,   K_{\alpha\beta\gamma j} = \int_\Omega \Phi_\alpha \Phi_\beta \Phi_{\gamma,j} \, d\Omega,

    H_{\alpha\beta i} = g \int_\Omega \Phi_\alpha \Phi_{\beta,i} \, d\Omega,   T_{\alpha\beta} = \int_\Omega \Phi_\alpha \Phi_\beta \, d\Omega,

    S_{\alpha\beta ij} = \nu \int_\Omega \Phi_{\alpha,i} \Phi_{\beta,j} \, d\Omega + \nu \int_\Omega \Phi_{\alpha,j} \Phi_{\beta,i} \, d\Omega,

    B_{\alpha\beta\gamma i} = \int_\Omega \Phi_\alpha \Phi_{\beta,i} \Phi_\gamma \, d\Omega,   C_{\alpha\beta\gamma i} = \int_\Omega \Phi_\alpha \Phi_\beta \Phi_{\gamma,i} \, d\Omega,

where Φ denotes the shape function. The bottom stress term is linearized and the water depth is interpolated using linear interpolation.

For discretization in time the three-step explicit time integration scheme is employed, using the Taylor series expansion

    F(t+\Delta t) = F(t) + \Delta t \frac{\partial F(t)}{\partial t} + \frac{\Delta t^2}{2} \frac{\partial^2 F(t)}{\partial t^2} + \frac{\Delta t^3}{6} \frac{\partial^3 F(t)}{\partial t^3} + O(\Delta t^4),   (13)

where F is an arbitrary function and Δt is the time increment. Using the approximate equation up to third-order accuracy, the following three-step scheme can be obtained:16

    F(t + \Delta t/3) = F(t) + \frac{\Delta t}{3} \frac{\partial F(t)}{\partial t},

    F(t + \Delta t/2) = F(t) + \frac{\Delta t}{2} \frac{\partial F(t + \Delta t/3)}{\partial t},   (14)

    F(t + \Delta t) = F(t) + \Delta t \frac{\partial F(t + \Delta t/2)}{\partial t}.

Equation (14) is equivalent to equation (13) and the method is referred to as the three-step Taylor-Galerkin method. The stability limit of the method is 1.5 times larger than that of the conventional two-step scheme.4,17 Details of this method are given in Reference 16.
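The accuracy of the scheme (14) can be seen on the scalar model problem dF/dt = λF; the following is a minimal sketch of the time-stepping idea only, not the authors' code.

```python
import math

def three_step(F, rhs, dt):
    """One step of the three-step scheme (14)."""
    F_13 = F + (dt / 3.0) * rhs(F)      # F(t + dt/3)
    F_12 = F + (dt / 2.0) * rhs(F_13)   # F(t + dt/2)
    return F + dt * rhs(F_12)           # F(t + dt)

lam = -1.0                              # model problem dF/dt = lam * F
F, dt = 1.0, 0.1
for _ in range(10):                     # integrate to t = 1
    F = three_step(F, lambda v: lam * v, dt)
# F is now close to exp(-1); the local error per step is O(dt^4)
```

For the linear model problem the scheme reproduces the amplification factor 1 + z + z²/2 + z³/6 with z = λΔt, i.e. exactly the third-order Taylor polynomial of e^z, which is the sense in which (14) is equivalent to (13).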

Applying this scheme to the finite element equations, the following discretized equations in time can be obtained:

Step 1

    \bar{M}_{\alpha\beta} u_{\beta i}^{n+1/3} = \tilde{M}_{\alpha\beta} u_{\beta i}^n - \frac{\Delta t}{3} \Big[ K_{\alpha\beta\gamma j} u_{\beta j}^n u_{\gamma i}^n + H_{\alpha\beta i} (\zeta_\beta^n - \zeta_{0\beta}) - T_{\alpha\beta} \Big( \frac{(\tau_s)_i}{\rho(h+\zeta^n)} \Big)_\beta + T_{\alpha\beta} \Big( \frac{(\tau_b)_i^n}{\rho(h+\zeta^n)} \Big)_\beta + S_{\alpha\beta ij} u_{\beta j}^n \Big],   (15)

    \bar{M}_{\alpha\beta} \zeta_\beta^{n+1/3} = \tilde{M}_{\alpha\beta} \zeta_\beta^n - \frac{\Delta t}{3} \big[ B_{\alpha\beta\gamma i} u_{\beta i}^n (h_\gamma + \zeta_\gamma^n) + C_{\alpha\beta\gamma i} u_{\beta i}^n (h_\gamma + \zeta_\gamma^n) \big],   (16)

Step 2

    \bar{M}_{\alpha\beta} u_{\beta i}^{n+1/2} = \tilde{M}_{\alpha\beta} u_{\beta i}^n - \frac{\Delta t}{2} \Big[ K_{\alpha\beta\gamma j} u_{\beta j}^{n+1/3} u_{\gamma i}^{n+1/3} + H_{\alpha\beta i} (\zeta_\beta^{n+1/3} - \zeta_{0\beta}) - T_{\alpha\beta} \Big( \frac{(\tau_s)_i}{\rho(h+\zeta^{n+1/3})} \Big)_\beta + T_{\alpha\beta} \Big( \frac{(\tau_b)_i^{n+1/3}}{\rho(h+\zeta^{n+1/3})} \Big)_\beta + S_{\alpha\beta ij} u_{\beta j}^{n+1/3} \Big],   (17)

    \bar{M}_{\alpha\beta} \zeta_\beta^{n+1/2} = \tilde{M}_{\alpha\beta} \zeta_\beta^n - \frac{\Delta t}{2} \big[ B_{\alpha\beta\gamma i} u_{\beta i}^{n+1/3} (h_\gamma + \zeta_\gamma^{n+1/3}) + C_{\alpha\beta\gamma i} u_{\beta i}^{n+1/3} (h_\gamma + \zeta_\gamma^{n+1/3}) \big],   (18)

Step 3

    \bar{M}_{\alpha\beta} u_{\beta i}^{n+1} = \tilde{M}_{\alpha\beta} u_{\beta i}^n - \Delta t \Big[ K_{\alpha\beta\gamma j} u_{\beta j}^{n+1/2} u_{\gamma i}^{n+1/2} + H_{\alpha\beta i} (\zeta_\beta^{n+1/2} - \zeta_{0\beta}) - T_{\alpha\beta} \Big( \frac{(\tau_s)_i}{\rho(h+\zeta^{n+1/2})} \Big)_\beta + T_{\alpha\beta} \Big( \frac{(\tau_b)_i^{n+1/2}}{\rho(h+\zeta^{n+1/2})} \Big)_\beta + S_{\alpha\beta ij} u_{\beta j}^{n+1/2} \Big],   (19)

    \bar{M}_{\alpha\beta} \zeta_\beta^{n+1} = \tilde{M}_{\alpha\beta} \zeta_\beta^n - \Delta t \big[ B_{\alpha\beta\gamma i} u_{\beta i}^{n+1/2} (h_\gamma + \zeta_\gamma^{n+1/2}) + C_{\alpha\beta\gamma i} u_{\beta i}^{n+1/2} (h_\gamma + \zeta_\gamma^{n+1/2}) \big],   (20)

where superscript n denotes the value computed at the nth time point and Δt is the time increment between the nth and the (n+1)th step. The coefficient M̄_{αβ} expresses the lumped coefficient and M̃_{αβ} is the selective lumping coefficient given by

    \tilde{M}_{\alpha\beta} = e \bar{M}_{\alpha\beta} + (1-e) M_{\alpha\beta},   (21)

where e is the selective lumping parameter.

3.2. Space-Time Implicit Finite Element Method

In the implicit implementation a stabilized space-time finite element method is used. Using the conservation variables defined as

    U = \begin{pmatrix} H \\ U_1 \\ U_2 \end{pmatrix} = \begin{pmatrix} H \\ H u_1 \\ H u_2 \end{pmatrix},

where H = h + ζ, the variational formulation of (1) and (2) is written as

    \int_{Q_n} U^* \cdot \Big( \frac{\partial U}{\partial t} + A_i \frac{\partial U}{\partial x_i} \Big) dQ + \int_{Q_n} \frac{\partial U^*}{\partial x_i} \cdot \Big( K_{ij} \frac{\partial U}{\partial x_j} \Big) dQ + \int_{\Omega_n} (U^*)_n^+ \cdot (U_n^+ - U_n^-) \, d\Omega

    + \sum_e \int_{Q_n^e} \tau \Big( A_k^T \frac{\partial U^*}{\partial x_k} \Big) \cdot \Big( \frac{\partial U}{\partial t} + A_i \frac{\partial U}{\partial x_i} - R \Big) dQ + \sum_e \int_{Q_n^e} \delta \, \frac{\partial U^*}{\partial x_i} \cdot \frac{\partial U}{\partial x_i} \, dQ

    = \int_{Q_n} U^* \cdot R \, dQ + \int_{P_n} U^* \cdot H \, dP.   (22)

    See note on next page.


Here U* denotes the weighting function and the integration takes place over the space-time domain (or its subset referred to as slab) Q_n, its lateral boundary P_n and its lower spatial boundary Ω_n. The space-time terminology is explained in more detail in Reference 18. A_i and K_ij are the coefficient matrices of the advective-diffusive system, defined as

    A_1 = \begin{pmatrix} 0 & 1 & 0 \\ gH - (U_1/H)^2 & 2U_1/H & 0 \\ -U_1 U_2/H^2 & U_2/H & U_1/H \end{pmatrix},   A_2 = \begin{pmatrix} 0 & 0 & 1 \\ -U_1 U_2/H^2 & U_2/H & U_1/H \\ gH - (U_2/H)^2 & 0 & 2U_2/H \end{pmatrix},   (23)

    K_{11} = \nu \begin{pmatrix} 0 & 0 & 0 \\ -2U_1/H & 2 & 0 \\ -U_2/H & 0 & 1 \end{pmatrix},   K_{12} = \nu \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ -U_1/H & 1 & 0 \end{pmatrix},

    K_{21} = \nu \begin{pmatrix} 0 & 0 & 0 \\ -U_2/H & 0 & 1 \\ 0 & 0 & 0 \end{pmatrix},   K_{22} = \nu \begin{pmatrix} 0 & 0 & 0 \\ -U_1/H & 1 & 0 \\ -2U_2/H & 0 & 2 \end{pmatrix}.

R denotes the right-hand-side vector

    R = \begin{pmatrix} 0 \\ gH \, \partial(h+\zeta_0)/\partial x_1 + (\tau_s)_1/\rho - (\tau_b)_1/\rho \\ gH \, \partial(h+\zeta_0)/\partial x_2 + (\tau_s)_2/\rho - (\tau_b)_2/\rho \end{pmatrix}   (24)

and H is the natural boundary condition term defined on the subset of the lateral boundary P_n. The notation (...)_n^+ and (...)_n^- indicates the values of a discontinuous variable as the time t approaches the temporal slab boundary t_n from above and below respectively.
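The advective matrices of the shallow-water system are the flux Jacobians of the conservation form. As a sanity check on the standard form (which is assumed here; the printed matrices in this scan are corrupted), the sketch below compares A_1 against a finite-difference Jacobian of the x_1-direction flux.

```python
import numpy as np

G = 9.81

def flux_x1(U):
    """x1-direction flux of the conservation form: (U1, U1^2/H + g H^2/2, U1*U2/H)."""
    H, U1, U2 = U
    return np.array([U1, U1 ** 2 / H + 0.5 * G * H ** 2, U1 * U2 / H])

def A1(U):
    """Advective matrix A1 = dF1/dU for U = (H, U1, U2); standard form, assumed here."""
    H, U1, U2 = U
    u1, u2 = U1 / H, U2 / H
    return np.array([[0.0,            1.0,      0.0],
                     [G * H - u1**2,  2.0 * u1, 0.0],
                     [-u1 * u2,       u2,       u1]])

U = np.array([2.0, 1.0, 0.5])   # arbitrary test state: H = 2 m, U1 = 1, U2 = 0.5
eps = 1e-6
J_fd = np.column_stack([(flux_x1(U + eps * e) - flux_x1(U - eps * e)) / (2 * eps)
                        for e in np.eye(3)])
# J_fd agrees with A1(U) to finite-difference accuracy
```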

The first two left-hand-side terms and the entire right-hand side of equation (22) constitute the Galerkin form of the shallow water equations (1) and (2). The third term enforces weakly the continuity of the solution across the time levels t_n. The fourth and fifth terms are the SUPG stabilization and discontinuity-capturing terms respectively. For the derivation of the stabilization coefficients τ and δ for multidimensional advection-diffusion systems see e.g. Reference 11. The stabilization terms are integrated over the interior of the space-time elements Q_n^e.

The variables and weighting functions are discretized using piecewise linear (in both space and time) interpolation function spaces for all fields. The resulting non-linear equation system is solved using the Newton-Raphson algorithm, where at each Newton-Raphson step a coupled linear equation system is solved iteratively using the GMRES update technique.

4. PARALLEL IMPLEMENTATION

For the explicit algorithm a data-parallel implementation is performed on the Fujitsu AP1000, which is a distributed-memory, highly parallel computer that supports the communication mechanism. Figure 1 shows the configuration of the AP1000 system. The AP1000 consists of 1024 processing elements which are called cells, a Sun workstation which is called the host and three independent networks which are called the T-net, B-net and S-net. Each cell possesses memory of 16 MB. Using 1024 cells, the peak computational speed reaches 8.53 Gflops. The cells perform parallel computation, synchronizing all cells and transferring boundary node data to neighbouring cells. The host performs initialization of the cells' environment, creation of tasks, transfer of data and observation

[Note: For the correct forms of these six matrices and Eqs. (22) and (24), see the following four pages extracted from T. E. Tezduyar, 'Finite Element Methods for Flow Problems with Moving Boundaries and Interfaces', Archives of Computational Methods in Engineering, 8 (2001) 83-130.]


Figure 1. AP1000 system

of the cells' condition. All cells are connected by the T-net (torus network) for one-to-one communication between cells. The host and cells are connected by the B-net (broadcasting network) for broadcast communication, distribution and collection of data and by the S-net (synchronization network) for barrier synchronization. The communication and synchronization mentioned above can be realized using the vendor-supplied parallel library.19

To minimize the amount of interprocessor communication, the automatic mesh decomposer presented by Farhat20 is employed. For each subdomain the processor associated with that subdomain carries out computations independently, exchanging only the subdomain boundary data with the other processors.

The finite element equation can be expressed as

    M X = F,   (25)

where M is the lumped mass matrix, X is the unknown vector and F is the known vector. Figure 2 shows an example mesh, with the broken line denoting the boundary of a subdomain. Elements (1)-(4) belong to subdomain 1 (processor 1) and elements (5) and (6) belong to subdomain 2 (processor 2). The unknown values X are solved by

    X = F / M.   (26)

No interprocessor communication is needed to compute the unknown values of a node which is located in the subdomain interior, such as node A. However, in the case of node B, which is located on the boundary of subdomains, interprocessor communication is needed and the following procedure is applied. First the following values are computed in each processor:

    M_{B1} = M_{B(1)} + M_{B(4)},   F_{B1} = F_{B(1)} + F_{B(4)}   (processor 1),   (27)

    M_{B2} = M_{B(5)} + M_{B(6)},   F_{B2} = F_{B(5)} + F_{B(6)}   (processor 2).   (28)

Next these values are gathered using the communication library, then the unknown values of node B can be obtained by

    X_B = (F_{B1} + F_{B2}) / (M_{B1} + M_{B2}).   (29)
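The boundary-node procedure (27)-(29) amounts to summing partial lumped-mass and right-hand-side contributions across processors; a sketch with made-up element contributions (the numeric values are illustrative, not data from the paper):

```python
# Partial sums for boundary node B, eqs. (27)-(28).
M_B1 = 0.4 + 0.3   # processor 1: M_B(1) + M_B(4)
F_B1 = 1.0 + 0.5   # processor 1: F_B(1) + F_B(4)
M_B2 = 0.2 + 0.1   # processor 2: M_B(5) + M_B(6)
F_B2 = 0.3 + 0.2   # processor 2: F_B(5) + F_B(6)

# After the gather step, either processor can complete the nodal solve (29):
X_B = (F_B1 + F_B2) / (M_B1 + M_B2)
```

Because M is lumped (diagonal), one gather of two scalars per shared node replaces a global linear solve, which is what makes the explicit scheme communicate so little.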

Data transfer is performed at every time step (see Figure 2). As the lumped mass matrix M remains constant throughout all time steps, the data transfer of that matrix is required only once.

The implicit algorithm is implemented on the Connection Machine CM-5. Similarly to the Fujitsu AP1000, the CM-5 is also a distributed-memory parallel machine, with a single partition size of up to





Figure 2. Parallel implementation

512 processing elements (PEs) and a Sun multiprocessor host machine. The PEs are interconnected through fat-tree data, control and diagnostic networks. Each PE manages 32 MB of memory and has a peak processing speed of 128 Mflops, for a total peak of over 65 Gflops. As on the AP1000, highly optimized communication utilities are available, grouped in the Connection Machine Scientific Software Library (CMSSL). The implementation of the implicit algorithm described in Section 3 follows closely the finite element implementation of the Navier-Stokes equations which has been described in References 21 and 22.

5. NUMERICAL EXAMPLES

As an application of the three-step explicit algorithm, simulation of the storm surge in Ise-Bay, Japan, accompanying the Ise-Bay typhoon in 1959 is carried out. This typhoon occurred on 22 September 1959 and was the greatest disaster ever to hit the Ise-Bay district. Over 5000 people were killed because of this storm surge. Figure 3 shows the configuration of the domain and the path of the typhoon. Figure 4 shows the finite element discretization used. The total numbers of elements and nodes are 206,977 and 106,577 respectively. This mesh is designed to keep the element Courant number constant in the entire domain.23,24 Figure 5 shows the water depth diagram. From Figures 4

    Figure 3. Computational domain and path of Ise-Bay typhoon




    Figure 4. Finite elementdiscretization

Figure 5. Water depth diagram (contours are evenly spaced at 500 m intervals)




    Figure 6. Finite element discretization around Ise-Bay

and 5 it can be seen that an appropriate mesh in accordance with the variation in water depth is realized. Figures 6 and 7 show the finite element discretization and water depth diagram around Ise-Bay respectively. A fine mesh which represents the geometry accurately is employed. Figure 8 shows the mesh partitioning for 512 processors. The typhoon data such as its position, speed and power are given at 1 h intervals. Using these data, the wind velocity can be computed at every time step. Linear interpolation is used for the data interpolation. For the boundary conditions the no-slip condition is applied to the coastline and the open-boundary condition is applied to the open boundary. The following numerical data are used: n = 0.03, ν = 10 m² s⁻¹, C1 = C2 = 0.6, R = 500 km, r0 = 60 km. The selective lumping parameter and the time increment are taken to be 0.9 and 6 s respectively. Figure 9 shows the path of the typhoon; the numerals denote the time and position of the typhoon. Figure 10 shows the computed water elevation

Figure 7. Water depth diagram around Ise-Bay (contours are evenly spaced at 10 m intervals)




Figure 8. Mesh partitioning for 512 processors

at times 17:00 and 24:00. Figure 11 shows the computed water elevation at 1 h intervals. It can be seen that the water elevation varies according to the movement of the typhoon. Figure 12 shows the computed current velocity at time 22:00 and the complicated flow pattern. Figure 13 shows the comparison of water elevation between the computed and observed results25 at Nagoya. It can be seen that the computed results are in good agreement with the observed results.


    Figure 9. Path of typhoon




Figure 10. Computed water elevation (contours are evenly spaced at 0.1 m intervals)

In order to check the performance of the parallelization, three finite element meshes are employed: mesh L with 206,977 elements and 106,577 nodes, mesh M with 133,546 elements and 69,295 nodes and mesh S with 76,497 elements and 40,197 nodes. Figures 14 and 15 show the relation between the number of processors and the speed-up ratio and efficiency of parallelization respectively. In these figures the speed-up ratio and efficiency are defined as

    speed-up ratio = (computational time for one PE) / (computational time for N PEs),   (30)

    efficiency = (speed-up ratio) / N,   (31)

where N denotes the total number of processors. From these figures it can be seen that the performance is improved in accordance with an increase in the degrees of freedom and the efficiency is decreased in accordance with an increase in processors. In the case of the computation using mesh



Figure 11. Computed water elevation at 1 h intervals (contours are evenly spaced at 0.1 m intervals)

L and 512 processors, it can be seen that the speed-up ratio and efficiency reach approximately 400 and 80% respectively.

As an application of the stabilized space-time formulation, the tidal flow in Tokyo Bay has been simulated. This problem was analysed earlier using the three-step explicit scheme described in Section 3.26 Here we carry out the simulation using the implicit formulation introduced in Section 3. The mesh used in the computation consists of 56,893 elements and 60,210 space-time nodes, as shown in Figure 16. The mesh has been decomposed into 256 subdomains (which are assigned to the individual CM-5 vector units) using a recursive spectral bisection algorithm, as shown in Figure 17. The mesh refinement is related to the water depth, shown magnified 100-fold in Figure 18. In this simulation a time step size of 60 s is chosen and the total duration is 1600 time steps, approximating



Figure 12. Computed current velocity at time 22:00 (arrow scale: 10 m s⁻¹)

Figure 13. Comparison between computed (solid line) and observed (dashed line) water elevation at Nagoya


Figure 14. Comparison of speed-up ratios
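Definitions (30)-(31) in code form; the timing numbers below are hypothetical, chosen only to match the order of magnitude quoted for mesh L (about a 400-fold speed-up and about 80% efficiency on 512 processors).

```python
def speed_up(t_1pe, t_npe):
    """Speed-up ratio (30): computational time on one PE over time on N PEs."""
    return t_1pe / t_npe

def efficiency(t_1pe, t_npe, n):
    """Parallel efficiency (31): speed-up ratio divided by the number of PEs."""
    return speed_up(t_1pe, t_npe) / n

# hypothetical timings: 100 h on 1 PE vs 0.25 h on 512 PEs
s = speed_up(100.0, 0.25)        # 400x
e = efficiency(100.0, 0.25, 512) # 0.78125, i.e. about 78%
```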




Figure 15. Comparison of efficiencies

Figure 16. Finite element discretization of Tokyo Bay

Figure 17. Mesh partitioning for 256 processors




Figure 18. Water depth view of Tokyo Bay

Figure 19. Computed water elevation at t = 15:00 h

Figure 20. Computed water elevation at t = 18:00 h



Figure 21. Computed water elevation at t = 21:00 h

Figure 22. Computed water elevation at t = 24:00 h

one 24 h period. At the ocean boundary a diurnal tidal wave is imposed with an amplitude of 0.5 m and a period of 12 h. The following parameters are used: n = 0.03, ν = 5 m² s⁻¹, C1 = C2 = 0. The storm surge term (3) is ignored in this problem. The resulting elevation is shown in Figures 19-22, magnified 50,000 times with respect to the horizontal dimensions, at times t = 15:00, 18:00, 21:00 and 24:00 h into the simulation respectively. The simulation was performed on a 64-node CM-5 with 256 vector units and took 8.5 h of computer time to complete.
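The open-boundary forcing for this example (amplitude 0.5 m, period 12 h, sampled at the 60 s time step size) can be sketched as follows; the sinusoidal form is an assumption, since the text only states the amplitude and period.

```python
import math

AMPLITUDE = 0.5          # imposed wave amplitude [m]
PERIOD = 12.0 * 3600.0   # wave period [s]
DT = 60.0                # time step size [s]

def tidal_elevation(step):
    """Imposed water elevation at the ocean boundary after `step` time steps
    (assumed sinusoidal)."""
    return AMPLITUDE * math.sin(2.0 * math.pi * step * DT / PERIOD)

# 1600 steps of 60 s cover 96,000 s, somewhat more than one 24 h period
```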

6. CONCLUDING REMARKS

A three-step explicit finite element solver and an implicit stabilized space-time formulation of the shallow water equations, applicable to unstructured-mesh computations of storm surges and tidal flows, have been successfully implemented on the massively parallel supercomputers AP1000 and CM-5 respectively. The explicit method has been applied to the analysis of the storm surge accompanying the Ise-Bay typhoon in 1959. The efficiency of the parallelization has been investigated and the computed results have been compared with the observed results. The performance and efficiency were observed to improve in accordance with an increase in the number of degrees of freedom. The implicit method has been used to compute the tidal flow in Tokyo


Bay. From the results obtained in this paper, it can be concluded that the presented methods can be successfully applied to large-scale computations of storm surges and tidal flows.

ACKNOWLEDGEMENTS

We are grateful to the Parallel Computing Research Center of Fujitsu Laboratory for providing us with access to the AP1000 resources. The last two authors were supported by ARPA and by the Army High Performance Computing Research Center under the auspices of the Department of the Army, Army Research Laboratory co-operative agreement number DAAH04-95-2-0003/contract number DAAH04-95-C-0008. The content does not necessarily reflect the position or the policy of the U.S. Government, and no official endorsement should be inferred. The CRAY C90 time was provided by the Minnesota Supercomputer Institute.

REFERENCES

1. J. J. Westerink, R. A. Luettich, A. M. Baptista, N. W. Scheffner and P. Farrar, 'Tide and storm surge predictions using finite element model', J. Hydraul. Eng., 118, 1373-1390 (1992).
2. K. Kashiyama, K. Saito and S. Yoshikawa, 'Massively parallel finite element method for large-scale computation of storm surge', Proc. 11th Int. Conf. on Computational Methods in Water Resources, Cancun, 1996.
3. K. Kashiyama, H. Ito, M. Behr and T. Tezduyar, 'Three-step explicit finite element computation of shallow water flows on a massively parallel computer', Int. j. numer. methods fluids, 21, 885-900 (1995).
4. M. Kawahara, H. Hirano, K. Tsubota and K. Inagaki, 'Selective lumping finite element method for shallow water flow', Int. j. numer. methods fluids, 2, 89-112 (1982).
5. T. E. Tezduyar, M. Behr, S. Mittal and A. A. Johnson, 'Computation of unsteady incompressible flows with the finite element methods - space-time formulations, iterative strategies and massively parallel implementations', in P. Smolinski, W. K. Liu, G. Hulbert and K. Tamma (eds), New Methods in Transient Analysis, AMD Vol. 143, ASME, New York, 1992, pp. 1-24.
6. T. J. R. Hughes and A. N. Brooks, 'A multi-dimensional upwind scheme with no cross-wind diffusion', in T. J. R. Hughes (ed.), Finite Element Methods for Convection Dominated Flows, AMD Vol. 34, ASME, New York, 1979, pp. 19-35.
7. T. E. Tezduyar and T. J. R. Hughes, 'Finite element formulations for convection dominated flows with particular emphasis on the compressible Euler equations', AIAA Paper 83-0125, 1983.
8. G. J. Le Beau and T. E. Tezduyar, 'Finite element computation of compressible flows with the SUPG formulation', in M. N. Dhaubhadel, M. S. Engelman and J. N. Reddy (eds), Advances in Finite Element Analysis in Fluid Dynamics, FED Vol. 123, ASME, New York, 1991, pp. 21-27.
9. G. J. Le Beau, S. E. Ray, S. K. Aliabadi and T. E. Tezduyar, 'SUPG finite element computation of compressible flows with the entropy and conservation variables formulations', Comput. Methods Appl. Mech. Eng., 104, 397-422 (1993).
10. S. Aliabadi, S. E. Ray and T. E. Tezduyar, 'SUPG finite element computation of viscous compressible flows based on the conservation and entropy variables formulations', Comput. Mech., 11, 300-312 (1993).
11. S. Aliabadi and T. E. Tezduyar, 'Space-time finite element computation of compressible flows involving moving boundaries and interfaces', Comput. Methods Appl. Mech. Eng., 107, 209-224 (1993).
12. C. Johnson, U. Nävert and J. Pitkäranta, 'Finite element methods for linear hyperbolic problems', Comput. Methods Appl. Mech. Eng., 45, 285-312 (1984).
13. T. J. R. Hughes and G. M. Hulbert, 'Space-time finite element methods for elastodynamics: formulations and error estimates', Comput. Methods Appl. Mech. Eng., 66, 339-363 (1988).
14. T. E. Tezduyar, M. Behr and J. Liou, 'A new strategy for finite element computations involving moving boundaries and interfaces - the deforming-spatial-domain/space-time procedure: I. The concept and the preliminary tests', Comput. Methods Appl. Mech. Eng., 94, 339-351 (1992).
15. M. Miyazaki, T. Ueno and S. Unoki, 'Theoretical investigations of typhoon surges along the Japanese coast (II)', Oceanogr. Mag., 13, 103-117 (1962).
16. C. B. Jiang, M. Kawahara, K. Hatanaka and K. Kashiyama, 'A three-step finite element method for convection-dominated incompressible flows', Comput. Fluid Dyn. J., 1, 443-462 (1993).
17. M. Kawahara and K. Kashiyama, 'Selective lumping finite element method for nearshore current', Int. j. numer. methods fluids, 4, 71-97 (1984).
18. M. Behr and T. E. Tezduyar, 'Finite element solution strategies for large-scale flow simulations', Comput. Methods Appl. Mech. Eng., 112, 3-24 (1994).
19. AP1000 Library Manual, Fujitsu Laboratories, Tokyo, 1993.
20. C. Farhat, 'A simple and efficient automatic FEM domain decomposer', Comput. Struct., 28, 579-602 (1988).



21. M. Behr, A. Johnson, J. Kennedy, S. Mittal and T. E. Tezduyar, 'Computation of incompressible flows with implicit finite element implementations on the Connection Machine', Comput. Methods Appl. Mech. Eng., 108, 99-118 (1993).
22. J. G. Kennedy, M. Behr, V. Kalro and T. E. Tezduyar, 'Implementation of implicit finite element methods for incompressible flows on the CM-5', Comput. Methods Appl. Mech. Eng., 119, 95-111 (1994).
23. K. Kashiyama and T. Okada, 'Automatic mesh generation method for shallow water flow analysis', Int. j. numer. methods fluids, 15, 1037-1057 (1992).
24. K. Kashiyama and M. Sakuraba, 'Adaptive boundary-type finite element method for wave diffraction-refraction in harbors', Comput. Methods Appl. Mech. Eng., 112, 185-197 (1994).
25. 'Reports on Ise-Bay typhoon', Tech. Rep. 7, Meteorological Agency, 1961 (in Japanese).
26. K. Kashiyama, H. Ito, M. Behr and T. E. Tezduyar, 'Massively parallel finite element strategies for large-scale computation of shallow water flows and contaminant transport', Extended Abstr. Second Japan-U.S. Symp. on Finite Element Methods in Large-Scale Computational Fluid Dynamics, Tokyo, 1994.
