Transcript
Page 1: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions

Albert-Einstein-Institut www.aei-potsdam.mpg.de

Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions

• Solving Einstein’s Equations, Black Holes, and Gravitational Wave Astronomy

• Cactus, a new community simulation code framework
– Toolkit for many PDE systems

– Suite of solvers for Einstein and astrophysics systems

• Recent Simulations using Cactus
– Black Hole Collisions, Neutron Star Collisions

– Collapse of Gravitational Waves

– Aerospace test project

• Metacomputing for the general user: what a scientist really wants and needs
– Distributed Computing Experiments with Cactus/Globus

Ed Seidel, Albert-Einstein-Institut (MPI-Gravitationsphysik) & NCSA/U of Illinois

Page 2: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Einstein's Equations and Gravitational Waves

• Einstein's General Relativity
– Fundamental theory of physics (gravity)
– Among the most complex equations of physics
• Dozens of coupled, nonlinear hyperbolic-elliptic equations with thousands of terms
• We barely have the capability to solve them after a century
– Predict black holes, gravitational waves, etc.

• Exciting new field about to be born: Gravitational Wave Astronomy
– Fundamentally new information about the Universe
– What are gravitational waves? Ripples in spacetime curvature, caused by matter motion, causing distances to change

• A last major test of Einstein's theory: do they exist?
– Eddington: "Gravitational waves propagate at the speed of thought"
– 1993 Nobel Prize Committee: Hulse-Taylor pulsar (indirect evidence)
– 20xx Nobel Committee: ??? (for actual detection…)

Δs(t): h = Δs/s ~ 10^-22! Colliding BHs and NSs...
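To make the 10^-22 strain figure concrete, here is a short worked estimate (the 4 km arm length of a LIGO-class detector is an assumption added here, not a number from the slide):

    h = \frac{\Delta s}{s} \sim 10^{-22}
    \quad\Rightarrow\quad
    \Delta s \approx h \cdot s \approx 10^{-22} \times 4\,\mathrm{km} = 4 \times 10^{-19}\,\mathrm{m}

That is far smaller than an atomic nucleus, which is why only cataclysmic sources such as colliding black holes and neutron stars are plausible signals.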

Page 3: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


[Diagram: Numerical Relativity requires Teraflop computation, AMR, elliptic-hyperbolic solvers, ???; Waveforms: we want to compute what happens in Nature... (PACS Virtual Machine Room)]

Page 4: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Black Holes: Excellent source of waves

• Need Cosmic Cataclysms to provide strong waves!

• BH’s have very strong gravity, collide near speed of light, have ~3-100+ solar masses!

• May collide "frequently"
– Not very often in any local region of space, but...
– Perhaps ~3 per year within 200 Mpc, the range of detectors…

• Need to have some idea what the signals will look like if we are to detect and understand them…

Page 5: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Einstein Equations: New Formulations, New Capabilities

• Einstein equations: G_{ij} = 8π T_{ij}

• Traditional evolution equations: ADM
– ∂_t g = S(g) (think Maxwell: ∂E/∂t = ∇×B, ∂B/∂t = -∇×E)
• S(g) has thousands of terms (very ugly!)
– 4 nonlinear elliptic constraints (think Maxwell: ∇·B = ∇·E = 0)
– 4 gauge conditions (often elliptic) (think Maxwell: gauge choice for the potentials)
– Numerical methods "ad hoc", not manifestly hyperbolic

• NEW: First-order symmetric hyperbolic formulations
∂_t u + ∂_i F^i(u) = S(u)
– u is a vector of many fields, typically of order 50
– Complete set of eigenfields (under certain conditions…)
– Many variations on these formulations, dozens of papers since 1992
– Elliptic equations still there…
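Spelling out the Maxwell analogy the slide appeals to (vacuum, c = 1), next to the new first-order flux-conservative form of the Einstein system:

    \partial_t \mathbf{E} = \nabla \times \mathbf{B}, \qquad
    \partial_t \mathbf{B} = -\nabla \times \mathbf{E}
    \qquad \text{(evolution equations)}

    \nabla \cdot \mathbf{E} = 0, \qquad \nabla \cdot \mathbf{B} = 0
    \qquad \text{(constraints, preserved by the evolution)}

    \partial_t u + \partial_i F^i(u) = S(u)
    \qquad \text{(Einstein system, } u \text{ a vector of } \sim 50 \text{ fields)}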

Page 6: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Computational Needs for 3D Numerical Relativity

• Explicit finite difference codes
– ~10^4 flops/zone/time step
– ~100 3D arrays

• Require 1000^3 zones or more
– ~1000 GBytes of memory (a worked estimate follows at the end of this slide)
– Double resolution: 8x memory, 16x flops

• TFlop, TByte machine required

• Parallel AMR, I/O essential

• A code that can do this could be useful to other projects (we said this in all our grant proposals)!
– Last 2 years devoted to making this useful across disciplines…
– All tools used for these complex simulations available for other branches of science, engineering...

• Initial data: 4 coupled nonlinear elliptics
• Evolution: hyperbolic evolution coupled with elliptic equations
[Figure: evolution from t=0 to t=100]
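A worked version of the memory and flop estimates above (the 8 bytes per double-precision value is an assumption; the array and zone counts are the slide's own):

    \text{memory} \approx 100 \text{ arrays} \times 1000^3 \text{ zones} \times 8\,\mathrm{B}
    \approx 800\,\mathrm{GB} \approx 1\,\mathrm{TB}

    \text{flops per step} \approx 10^4 \times 1000^3 = 10^{13}

    \text{double the resolution: } 2^3 = 8\times \text{ the zones (and memory)},
    \quad 8 \times 2 = 16\times \text{ the flops, since the time step also halves}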

Page 7: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Any Such Computation Requires Incredible Mix of Varied Technologies and Expertise!

• Many Scientific/Engineering Components
– Physics, astrophysics, CFD, engineering,...

• Many Numerical Algorithm Components
– Finite difference methods? Unstructured meshes?

– Elliptic equations: multigrid, Krylov subspace, preconditioners,...

– Mesh Refinement?

• Many Different Computational Components
– Parallelism (HPF, MPI, PVM, ???)

– Architecture Efficiency (MPP, DSM, Vector, PC Clusters, ???)

– I/O Bottlenecks (generate gigabytes per simulation, checkpointing…)

– Visualization of all that comes out!

• The scientist wants to focus on the top bullet, but all of the above are required for results...

Page 8: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


This is the fundamental question addressed by Cactus.

• Clearly need teams, with huge expertise base to attack such problems...

• In fact, need collections of communities to solve such problems...

• But how can they work together effectively?

• We need a simulation code environment that encourages this...

These are the fundamental issues addressed by Cactus:
• Providing advanced computational science to scientists/engineers
• Providing collaborative infrastructure for large groups

Page 9: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Grand Challenges : NSF Black Hole and NASA Neutron Star Projects

• NSF Black Hole Grand Challenge: University of Texas (Matzner, Browne), NCSA/Illinois/AEI (Seidel, Saylor, Smarr, Shapiro, Saied), North Carolina (Evans, York), Syracuse (G. Fox), Cornell (Teukolsky), Pittsburgh (Winicour), Penn State (Laguna, Finn)

• NASA Neutron Star Grand Challenge: NCSA/Illinois/AEI (Saylor, Seidel, Swesty, Norman), Argonne (Foster), Washington U (Suen), Livermore (Ashby), Stony Brook (Lattimer)

NEW! EU Network

Page 10: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


What we learn from Grand Challenges

• Successful, but also problematic…
– No existing infrastructure to support collaborative HPC

– Many scientists are bad Fortran programmers, and NOT computer scientists (especially physicists…like me…)

– Many sociological issues of large collaborations and different cultures

– Many language barriers...

– Applied mathematicians, computational scientists, and physicists have very different concepts and vocabularies…

– Code fragments, styles, routines often clash

– Successfully merged code (after years) often impossible to transplant into more modern infrastructure (e.g., add AMR or switch to MPI…)

• Many serious problems...

Parlez-vous C? Nein! Nur Fortran! ("Do you speak C?" "No! Only Fortran!")

Page 11: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Cactus: a new concept in community-developed simulation code infrastructure

• Developed as response to needs of these projects

• Numerical/computational infrastructure to solve PDEs
• Freely available, open community source code: in the spirit of GNU/Linux

• Cactus divided into "Flesh" (core) and "Thorns" (modules, or collections of subroutines)
– User applications can be Fortran, C, C++; automated interface between them

– Parallelism abstracted and hidden (if desired) from user

– User specifies flow: when to call thorns; code switches memory on/off

• Many parallel utilities / features enabled by Cactus

• (Nearly) all architectures supported:
– Dec Alpha, SGI Origin 2000, T3E, Linux clusters and laptops, Hitachi, NEC, HP, Windows NT, SP2, Sun

– Code portability, migration to new architectures very easy!

Page 12: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Modularity of Cactus...

[Diagram: Applications 1a, 1b, 2, ... and sub-applications plug into the Cactus Flesh; the user selects desired functionality from interchangeable layers such as AMR (GrACE, etc.), an MPI layer, an I/O layer, remote steering, and Globus metacomputing services]

Page 13: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Computational Toolkit: provides parallel utilities (thorns) for the computational scientist

• Cactus is a framework or middleware for unifying and incorporating code from Thorns developed by the community
– Choice of parallel library layers (native MPI, MPICH, MPICH-G(2), LAM, WMPI, PACX, and HPVM)

– Portable, efficient (T3E, SGI, Dec Alpha, Linux, NT Clusters…)

– 3 mesh refinement schemes: Nested Boxes, GrACE, HLL (coming…)

– Parallel I/O (Panda, FlexIO, HDF5, etc…)

– Parameter Parsing

– Elliptic solvers (Petsc, Multigrid, SOR, etc…)

– Visualization Tools, Remote steering tools, etc…

– Globus (metacomputing/resource management)

– Performance analysis tools (Autopilot, PAPI, etc…)

– INSERT YOUR CS MODULE HERE...

Page 14: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


PAPI

• Standard API for accessing the hardware performance counters on most microprocessors (a minimal usage sketch follows at the end of this slide)

• Useful for tuning, optimization, debugging, benchmarking, etc.

http://icl.cs.utk.edu/projects/papi/
http://www.cactuscode.org/Documentation/HOWTO/Performance-HOWTO
http://www.cactuscode.org/Projects.html

• Java GUI available for monitoring the metrics

• Cactus thorn CactusPerformance/PAPI
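A minimal standalone sketch of the kind of counter measurement PAPI provides, using the PAPI low-level C API; the measured loop is just an illustration (inside Cactus the CactusPerformance/PAPI thorn provides the integration). PAPI_FP_OPS and PAPI_TOT_CYC are standard preset events, though not every processor exposes both:

    #include <stdio.h>
    #include <stdlib.h>
    #include <papi.h>

    int main(void)
    {
      int eventset = PAPI_NULL;
      long long counts[2];
      double x = 0.0;

      /* Initialise the PAPI library and an event set with two preset counters */
      if (PAPI_library_init(PAPI_VER_CURRENT) != PAPI_VER_CURRENT)
        exit(1);
      PAPI_create_eventset(&eventset);
      PAPI_add_event(eventset, PAPI_FP_OPS);   /* floating point operations */
      PAPI_add_event(eventset, PAPI_TOT_CYC);  /* total cycles */

      PAPI_start(eventset);
      for (int i = 0; i < 1000000; i++)        /* work to be measured */
        x += 1.0e-6 * i;
      PAPI_stop(eventset, counts);

      printf("FP ops: %lld  cycles: %lld  (x=%g)\n", counts[0], counts[1], x);
      return 0;
    }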

Page 15: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


GrACE

• Parallel/distributed AMR via C++ library

• Abstracts Grid Hierarchies, Grid Functions and Grid Geometries

• CactusPAGH will include a driver thorn which uses GrACE to provide AMR (KDI ASC Project)

http://www.caip.rutgers.edu/~parashar/TASSL/Projects/GrACE/index.html
http://www.cactuscode.org/Workshops/NCSA99/talk23/index.htm

Page 16: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


How to use Cactus: Avoiding the MONSTER code syndrome...

• [Optional: Develop thorns, according to some rules
– e.g., specify variables through interface.ccl
– specify the calling sequence of the thorns for a given problem and algorithm (schedule.ccl)]

• Specify which thorns are desired for the simulation (Einstein equations + special method 1 + HRSC hydro + wave finder + AMR + live visualization module + remote steering tool…)

• Specified code is then created, with only those modules, those variables, those I/O routines, this MPI layer, that AMR system,…, needed

• Subroutine calling lists generated automatically

• Automatically created for desired computer architecture

• Run it…

Page 17: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Cactus Computational Tool Kit

• Flesh (core) written in C

• Thorns (modules) grouped in packages written in F77, F90, C, C++

• Thorn-Flesh interface fixed in 3 files written in CCL (Cactus Configuration Language):
– interface.ccl: grid functions, arrays, scalars (integer, real, logical, complex)
– param.ccl: parameters and their allowed values
– schedule.ccl: entry points of routines, dynamic memory and communication allocations

• Object oriented features for thorns (public, private, protected variables, implementations, inheritance) for clearer interfaces

• Compilation:
– Perl parses the CCL files and creates the flesh-thorn interface code at compile time
– Particularly important for the Fortran-C interface: Fortran argument lists must be known at compile time, but depend on the thorn list (see the thorn sketch at the end of this slide)
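A minimal sketch of what this generated interface looks like from the C side of a thorn. The thorn name MyWave, the routine name, and the grid function phi are hypothetical (they would be declared in that thorn's interface.ccl and schedule.ccl); the cctk headers and CCTK_* macros are the standard Cactus conventions described above:

    /* Hypothetical thorn "MyWave": the flesh passes grid functions, grid
     * geometry and parameters through the CCTK_ARGUMENTS macro, generated
     * at compile time by the Perl CCL parser. */
    #include "cctk.h"
    #include "cctk_Arguments.h"
    #include "cctk_Parameters.h"

    void MyWave_Evolve(CCTK_ARGUMENTS)
    {
      DECLARE_CCTK_ARGUMENTS;   /* local grid shape cctk_lsh[], grid functions, ... */
      DECLARE_CCTK_PARAMETERS;  /* parameters from param.ccl */

      /* Loop over the processor-local part of the grid; parallelism,
         ghost zones and I/O are handled by driver and I/O thorns. */
      for (int k = 0; k < cctk_lsh[2]; k++)
        for (int j = 0; j < cctk_lsh[1]; j++)
          for (int i = 0; i < cctk_lsh[0]; i++)
          {
            const int idx = CCTK_GFINDEX3D(cctkGH, i, j, k);
            phi[idx] = 0.0;  /* 'phi' assumed declared in interface.ccl */
          }
    }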

Page 18: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


High performance: Full 3D Einstein Equations solved on NCSA NT Supercluster, Origin 2000, T3E

[Figure: Cactus scaling on the T3E-600; total Mflop/s (log scale, 100 to 100,000; data points 192, 760, 5980, 47900) vs. number of processors (log scale, 1 to 1000)]

• Excellent scaling on many architectures
– Origin up to 256 processors
– T3E up to 1024 processors
– NCSA NT cluster up to 128 processors

• Achieved 142 Gflop/s on a 1024-node T3E-1200 (benchmarked for the NASA NS Grand Challenge); a rough per-processor estimate follows at the end of this slide

• But, of course, we want much more… metacomputing, meaning connected computers...
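For scale, the per-processor figure implied by that benchmark (treating the T3E-1200 designation as roughly 1200 Mflop/s nominal peak per PE is an assumption, not a number from the slide):

    \frac{142\,\mathrm{Gflop/s}}{1024\ \mathrm{PEs}} \approx 139\,\mathrm{Mflop/s\ per\ PE}
    \approx 12\%\ \text{of the} \sim 1200\,\mathrm{Mflop/s\ nominal\ peak}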

Page 19: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Cactus Development Projects

[Diagram: project bubbles around Cactus: AEI Cactus Group (Allen), NASA "Round 2" (Saylor), Round 3??, NSF KDI (Suen), EU Network (Seidel), DFN Gigabit (Seidel), "Egrid", Grid Forum, NCSA, DLR, Microsoft, "GRADS"; application areas include Numerical Relativity/Astrophysics and Geophysics]

Page 20: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Applications

• Neutron Stars
– Developing capability to do full GR hydro
– Now can follow full orbits!

• DLR project: working to explore capabilities for aerospace industry

• Black Holes (prime source for GW)
– Increasingly complex collisions: now doing full 3D grazing collisions

• Gravitational Waves
– Study linear waves as testbeds

– Move on to fully nonlinear waves

– Interesting Physics: BH formation in full 3D!

Page 21: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Evolving Pure Gravitational Waves

• Einstein's equations are nonlinear, so low-amplitude waves just propagate away, but large-amplitude waves may…
– Collapse on themselves under their own self-gravity and actually form black holes

• Use numerical relativity to probe GR in the highly nonlinear regime
– Form a BH? Critical phenomena in 3D? Naked singularities?
– … Little is known about generic 3D behavior

• Take a "lump of waves" and evolve it
– Large amplitude: a BH forms!
– Below a critical value: it disperses and can evolve "forever" as the system returns to flat space

• We are seeing hints of critical phenomena, known from nonlinear dynamics

Page 22: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Comparison: sub vs. super-critical solutions

Newman-Penrose Ψ4 (showing gravitational waves) with lapse underneath


Subcritical: no BH forms

Supercritical: BH forms!

Page 23: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Numerical Black Hole Evolutions

• Binary IVP: multiple wormhole model, other models

• Black holes are good candidates for gravitational wave astronomy
– ~3 events per year within 200 Mpc
– But what are the waveforms?

• GW astronomers want to know!

[Diagram: two black holes labelled S1, S2 and P1, P2; movie of the evolution]

Page 24: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Now try the first 3D "grazing collision", a big step: spinning, "orbiting", unequal-mass BHs merging.

Evolution of Ψ4 in the x-z plane (rotation plane of the BHs)

[Movie: horizon merger]

Alcubierre et al. results

384^3 grid, 100 GB simulation; the largest production relativity simulation; run on a 256-processor Origin 2000

Page 25: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Our Team Requires Grid Technologies, Big Machines for Big Runs

[Map: collaboration sites WashU, NCSA, Hong Kong, AEI, ZIB, Thessaloniki, Paris, and the PACS Virtual Machine Room]

How do we:
• Maintain/develop the code?
• Manage computer resources?
• Carry out/monitor simulations?

Page 26: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Aerospace Applications

• Cactus Portal, distributed simulation under active development at NASA-Ames

• Deutsches Luft- und Raumfahrtzentrum (DLR) pilot project
– A CFD code (Navier-Stokes with a turbulence model, or Euler) with a special extension to calculate turbine streams; can be used for "normal" CFD problems as well
– Based on a finite volume discretization on a block-structured regular Cartesian grid
– Currently has a simple MPI parallelization
– Plugging it into Cactus to evaluate

Page 27: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


What we need and want in simulation science: a Portal to provide the following...

• Got an idea? Write a Cactus module, link to other modules, and…

• Find resources
– Where? NCSA, SDSC, Garching, Boeing…???
– How many computers? Distribute the simulation?
– Big jobs: a "Fermilab" at your disposal: must get it right while the beam is on!

• Launch the simulation
– How do we get the executable there?
– How to store data?
– What are the local queue structure and OS idiosyncrasies?

• Monitor the simulation
– Remote visualization live while running
• Limited bandwidth: compute viz inline with the simulation
• High bandwidth: ship data to be visualized locally
– Visualization server: all privileged users can log in and check status/adjust if necessary
• Are parameters screwed up? Very complex!
• Call in an expert colleague…let her watch it too

• Steer the simulation
– Is memory running low? AMR! What to do? Refine selectively, or acquire additional resources via Globus? Delete unnecessary grids?

• Postprocessing and analysis
– 1 TByte of output at NCSA, research groups in St. Louis and Berlin…how to deal with this?

Page 28: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


A Portal to Computational Science: The Cactus Collaboratory

1. User has a science idea...
2. Composes/builds code components with interface...
3. Selects appropriate resources...
4. Steers simulation, monitors performance...
5. Collaborators log in to monitor...

Cactus Computational Toolkit: science, Autopilot, AMR, PETSc, HDF, MPI, GrACE, Globus, remote steering...

Want to integrate and migrate this technology to the generic user...

Page 29: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Remote Visualization

[Diagram: isosurfaces and geodesics, contour plots (download), and streamed grid functions flow over HDF5 to viz clients such as Amira, LCA Vision, and OpenDX]

Page 30: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Remote Visualization Tools under Development

• Live data streaming from Cactus simulation to viz client
– Clients: OpenDX, Amira, LCA Vision, Xgraph

• Protocols
– Precomputed viz run inline with the simulation:
• Isosurfaces, geodesics

– HTTP:

• Parameters, xgraph data, Jpegs, viewed and controlled from any web browser

– Streaming HDF5: sends raw data from the resident memory of the supercomputer
• HDF5 provides downsampling and hyperslabbing (see the sketch after this list)

• all above data, and all possible HDF5 data (e.g. 2D/3D)

• Two different technologies
– Streaming virtual file driver (I/O rerouted over a network stream)
– XML wrapper (HDF5 calls wrapped and translated into XML)
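A minimal sketch of the hyperslabbing/downsampling idea using the standard HDF5 C API: read every second point of a 3D dataset. The file name output.h5 and dataset name /phi are made up for illustration; the layout of actual streamed Cactus output differs:

    #include <stdio.h>
    #include <stdlib.h>
    #include <hdf5.h>

    int main(void)
    {
      hsize_t dims[3], start[3] = {0, 0, 0}, stride[3] = {2, 2, 2}, count[3];

      /* Open file and dataset (names are illustrative only) */
      hid_t file   = H5Fopen("output.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
      hid_t dset   = H5Dopen(file, "/phi", H5P_DEFAULT);
      hid_t fspace = H5Dget_space(dset);
      H5Sget_simple_extent_dims(fspace, dims, NULL);

      /* Select every second point in each direction: a strided hyperslab */
      for (int d = 0; d < 3; d++)
        count[d] = (dims[d] + 1) / 2;
      H5Sselect_hyperslab(fspace, H5S_SELECT_SET, start, stride, count, NULL);

      /* Matching memory dataspace, then read the downsampled block */
      hid_t mspace = H5Screate_simple(3, count, NULL);
      double *buf = malloc(count[0] * count[1] * count[2] * sizeof *buf);
      H5Dread(dset, H5T_NATIVE_DOUBLE, mspace, fspace, H5P_DEFAULT, buf);

      printf("read %llu x %llu x %llu downsampled block\n",
             (unsigned long long)count[0], (unsigned long long)count[1],
             (unsigned long long)count[2]);

      free(buf);
      H5Sclose(mspace); H5Sclose(fspace); H5Dclose(dset); H5Fclose(file);
      return 0;
    }

Selecting the strided hyperslab on the server side is what keeps the transferred data volume manageable over a slow link.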

Page 31: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Remote Steering

[Diagram: remote viz data and steering commands flow between the running simulation and clients (Amira, any viz client) via XML, HTTP, and HDF5]

Page 32: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Remote Steering

• Stream parameters from Cactus simulation to remote client, which changes parameters (GUI, command line, viz tool), and streams them back to Cactus where they change the state of the simulation.

• Cactus has a special STEERABLE tag for parameters, indicating that it makes sense to change them during a simulation and that there is support for changing them (see the sketch at the end of this slide).

• Example: IO parameters, frequency, fields

• Current protocols:
– XML (HDF5) to a standalone GUI
– HDF5 to viz tools (Amira, OpenDX, LCA Vision, etc.)

– HTTP to Web browser (HTML forms)
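A sketch of why steering works on the thorn side: parameter values are fetched from the flesh each time a routine is called, so a steerable parameter changed remotely takes effect on the next call. The routine and the parameter out_every are hypothetical; the STEERABLE declaration itself lives in the thorn's param.ccl, as described above:

    #include "cctk.h"
    #include "cctk_Arguments.h"
    #include "cctk_Parameters.h"

    void MyIO_OutputStep(CCTK_ARGUMENTS)
    {
      DECLARE_CCTK_ARGUMENTS;
      DECLARE_CCTK_PARAMETERS;  /* fetches current values, including steered ones */

      /* 'out_every' is a hypothetical steerable parameter (declared in
         param.ccl with STEERABLE = ALWAYS).  Because it is re-read here on
         every call, a remote client that changes it alters the output
         frequency of the running job. */
      if (cctk_iteration % out_every == 0)
        CCTK_INFO("writing output for this iteration");
    }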

Page 33: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Remote Offline Visualization

[Diagram: a visualization client (e.g. Amira) in Berlin uses the HDF5 VFD and DataGrid (Globus) to reach a remote data server holding 2 TByte at NCSA, via DPSS, FTP, or web (HTTP) servers, with downsampling and hyperslab selection on the server side]

Page 34: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Metacomputing: harnessing power when and where it is needed

• Einstein equations typical of apps that require extreme memory, speed

• Largest supercomputers too small!

• Networks very fast!
– vBNS, etc. in the US
– DFN Gigabit testbed: 622 Mbit/s Potsdam-Berlin-Garching, connecting multiple supercomputers
– International gigabit networking possible
– Connect workstations to make a supercomputer

• Acquire resources dynamically during the simulation!
– AMR, analysis, etc...

• Seamless computing and visualization from anywhere

• Many metacomputing experiments in progress
– Current ANL/SDSC/NCSA/NERSC/… experiment underway...

Page 35: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Metacomputing the Einstein Equations:Connecting T3E’s in Berlin, Garching, San Diego

Want to migrate this technology to the generic user...

Page 36: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Grand Picture

[Diagram: remote steering and monitoring from an airport; simulations launched from the Cactus Portal (Grid-enabled) onto an Origin at NCSA and a T3E at Garching, with Cactus running on distributed machines; remote viz in St Louis; remote viz and steering from Berlin; viz of data from previous simulations in an SF café; DataGrid/DPSS with downsampling; connected by Globus, HTTP, HDF5, and isosurface streams]

Page 37: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


The Future

• Gravitational wave astronomy almost here: must be able to solve Einstein’s equations in detail to understand the new observations

• New Codes, strong collaborations, bigger computers, new formulations of EE’s: together enabling much new progress.

• Cactus Computational Toolkit developed originally for Einstein's equations, available now for many applications (NOT an astrophysics code!)
– Useful as a parallel toolkit for many applications; provides portability from a laptop to many parallel architectures (e.g. a cluster of iPaqs!)
– Many advanced collaborative tools, a portal for code composition, resource selection, computational steering, and remote viz under development

• Advanced Grid-based metacomputing tools are maturing...

Page 38: Albert-Einstein-Institut  Cactus: Developing Parallel Computational Tools to Study Black Hole, Neutron Star (or Airplane...) Collisions


Further details...

• Cactus

– http://www.cactuscode.org

– http://www.computer.org/computer/articles/einstein_1299_1.htm

• Movies, research overview (needs major updating)
– http://jean-luc.ncsa.uiuc.edu

• Simulation Collaboratory/Portal work:
– http://wugrav.wustl.edu/ASC/mainFrame.html

• Remote Steering, high speed networking– http://www.zib.de/Visual/projects/TIKSL/

– http://jean-luc.ncsa.uiuc.edu/Projects/Gigabit/

• EU Astrophysics Network– http://www.aei-potsdam.mpg.de/research/astro/eu_network/index.html

