High Performance Computing at SCEC


Page 1: High Performance Computing at SCEC

High Performance Computing at SCEC

Scott Callaghan
Southern California Earthquake Center
University of Southern California

Page 2: High Performance Computing at SCEC

Why High Performance Computing?

• What is HPC?
  – Using large machines with many processors to compute quickly
• Why is it important?
  – Only way to perform large-scale simulations
• Two main types of SCEC HPC projects
  – What kind of shaking will this earthquake cause in a region?
  – What kind of shaking will this single location experience?

Page 3: High Performance Computing at SCEC

SCEC Scenario Simulations

• Simulations of individual earthquakes
  – Determine shaking over a region caused by a single event (usually M > 7)

Figure: Peak ground velocities for a Mw 8.0 Wall-to-Wall Scenario on the San Andreas Fault (1 Hz), calculated using AWP-ODC on NICS Kraken.

Page 4: High Performance Computing at SCEC

Simulating Large Events

• Must break up the work into pieces (sketched below)
  – Most commonly, spatially
  – Give work to each processor
  – Run a timestep
  – Communicate with neighbors
  – Repeat
• As the number of processors increases, it is harder to get good performance
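The bullets above describe a halo-exchange pattern. Below is a minimal sketch of that pattern, not SCEC's AWP-ODC code: it assumes mpi4py and an MPI library are installed, uses a simple 1-D spatial decomposition, and substitutes a toy stencil update for the real anelastic wave-propagation kernel; the grid size and timestep count are arbitrary placeholders.

    # Sketch of the decompose / timestep / exchange / repeat loop above.
    # Each MPI rank owns a 1-D slab of the grid plus two ghost cells,
    # advances a timestep, swaps boundary values with its neighbors, repeats.
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    nx_local = 1000                      # grid points owned by this rank (placeholder)
    u = np.zeros(nx_local + 2)           # +2 ghost cells for neighbor boundary values
    left = rank - 1 if rank > 0 else None
    right = rank + 1 if rank < size - 1 else None

    for step in range(100):              # placeholder timestep count
        # communicate with neighbors: exchange boundary ("halo") values
        if left is not None:
            u[0] = comm.sendrecv(u[1], dest=left, source=left)
        if right is not None:
            u[-1] = comm.sendrecv(u[-2], dest=right, source=right)
        # run a timestep: toy 3-point stencil standing in for the real kernel
        u[1:-1] = 0.5 * (u[:-2] + u[2:])

Run with something like mpiexec -n 4 python this_sketch.py. The scaling difficulty in the last bullet shows up even here: as ranks multiply, each holds less work but still pays for the neighbor exchanges, so communication takes a growing share of each timestep.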

Page 5: High Performance Computing at SCEC

Probabilistic Seismic Hazard Analysis

• Builders ask seismologists: "What will the peak ground motion be at my new building in the next 50 years?"
• Seismologists answer this question using Probabilistic Seismic Hazard Analysis (PSHA)
  – PSHA results used in building codes, insurance
  – California building codes impact billions of dollars of construction yearly

Page 6: High Performance Computing at SCEC

PSHA Reporting

• PSHA information is relayed through
  – Hazard curves (for 1 location)
  – Hazard maps (for a region)

Figure: Hazard curve for downtown LA. The vertical axis gives the probability of exceeding a given shaking level in 50 years (e.g. 0.1 g); at the 2%-in-50-years level the curve reads about 0.6 g.
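For reference (this conversion is not on the slide), the "2% in 50 years" level maps to an annual exceedance rate under the Poisson occurrence assumption commonly used in PSHA:

    P = 1 - e^{-\lambda t} \;\Rightarrow\; \lambda = -\frac{\ln(1 - 0.02)}{50\,\text{yr}} \approx 4.0 \times 10^{-4}\,\text{yr}^{-1}, \qquad T = 1/\lambda \approx 2475\,\text{yr}

So the curve's 2%-in-50-years reading at 0.6 g corresponds to a return period of roughly 2,475 years for shaking above 0.6 g at that site.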

Page 7: High Performance Computing at SCEC

PSHA Methodology

1. Pick a location of interest.
2. Define what future earthquakes might happen.
3. Estimate the magnitude and probability for each earthquake, from an earthquake rupture forecast (ERF).
4. Determine the shaking caused by each earthquake at the site of interest.
5. Aggregate the shaking levels with the probabilities to produce a hazard curve (see the sketch below).

Repeat for multiple sites for a hazard map. Typically performed with attenuation relationships.
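Step 5 is the aggregation step; below is a minimal sketch of it, not SCEC's production code. It assumes each rupture contributes one annual rate and one ground-motion value at the site, whereas full PSHA also integrates over ground-motion variability; the rates and shaking values are placeholders.

    # Sketch of step 5: aggregate per-rupture rates and shaking into a hazard
    # curve for one site. Placeholder data, not a real ERF.
    import math

    # (annual_rate, ground_motion_in_g) for each rupture affecting the site
    ruptures = [(0.010, 0.35), (0.002, 0.80), (0.0005, 1.20)]

    def hazard_curve(ruptures, levels, years=50.0):
        """P(exceeding each shaking level at least once in `years` years),
        treating ruptures as independent Poisson processes."""
        curve = []
        for level in levels:
            # total annual rate of ruptures whose shaking exceeds this level
            rate = sum(r for r, gm in ruptures if gm > level)
            curve.append((level, 1.0 - math.exp(-rate * years)))
        return curve

    for level, prob in hazard_curve(ruptures, levels=[0.1, 0.5, 1.0]):
        print(f"P(shaking > {level:.1f} g in 50 years) = {prob:.3f}")

How the shaking in step 4 is obtained is the key choice: attenuation relationships give it from empirical equations, while the CyberShake approach on the next slide computes it with 3-D simulations.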

Page 8: High Performance Computing at SCEC

CyberShake Approach

• Uses physics-based approach
  – 3-D ground motion simulation with anelastic wave propagation
  – Considers ~415,000 rupture variations per site (see the sketch below)
    • 7000 ruptures in ERF
    • <200 km from site of interest
    • Magnitude >6.5
    • Add variability
  – More accurate than traditional attenuation methods
• 100+ sites in Southern California needed to calculate hazard map

Figure: LADT probability of exceedance (SA 3.0). Blue and green – common attenuation relationships; black – CyberShake.
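A minimal, illustrative sketch of the rupture selection described above, not CyberShake's actual workflow code: keep ERF ruptures with magnitude above 6.5 within 200 km of the site, then expand each into rupture variations. The Rupture class, distances, and variation counts are invented placeholders.

    # Illustrative only: filter an ERF down to the ruptures that matter for one
    # site and count the rupture variations to simulate. Placeholder values.
    from dataclasses import dataclass

    @dataclass
    class Rupture:
        magnitude: float
        distance_km: float     # distance from the site of interest
        n_variations: int      # hypocenter/slip variations for this rupture

    erf = [
        Rupture(7.8, 45.0, 120),
        Rupture(6.7, 150.0, 40),
        Rupture(6.2, 30.0, 15),    # dropped: magnitude too small
        Rupture(7.1, 260.0, 80),   # dropped: farther than 200 km
    ]

    selected = [r for r in erf if r.magnitude > 6.5 and r.distance_km < 200.0]
    total = sum(r.n_variations for r in selected)
    print(f"{len(selected)} ruptures kept, {total} rupture variations to simulate")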

Page 9: High Performance Computing at SCEC

Results

Figures: hazard map from attenuation relationships and hazard map from CyberShake.

Page 10: High Performance Computing at SCEC

Results (difference)

Figures: CyberShake map compared to attenuation map; population density.

Page 11: High Performance Computing at SCEC

Some recent numbers

• Wall-to-wall simulation
  – 2 TB output
  – 100,000 processors
• CyberShake
  – Hazard curves for 223 sites
  – 8.5 TB output files
  – 46 PB of file I/O
  – 190 million jobs executed
  – 4500 processors for 54 days
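For scale, a back-of-the-envelope conversion (added here; it assumes all 4500 processors were busy for the full 54 days, which the slide does not state):

    4500 \times 54 \times 24 \approx 5.8 \times 10^{6}\ \text{processor-hours}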