
SCEC: An NSF + USGS Research Center

ShakeAlert CISN Testing Center (CTC) Development

Philip Maechling
Information Technology Architect
Southern California Earthquake Center (SCEC)
14 October 2010

CTC Progress in 2010

1. Operating an algorithm evaluation system with California-based performance reports and raw data available (2008-present).

2. Changed our automated software testing infrastructure from a web-based (Joomla) system to a server-based (CSEP) system.

3. Added a ShakeMap RSS reader to CSEP as a source of authorized observational data for evaluating earthquake parameter and ground motion forecasts.

4. Implemented a prototype EEW forecast evaluation test that plots the PGV values used in ShakeMaps for each event.

5. Began nightly automated retrieval of observational data from the ShakeMap RSS feed and creation of observation-based ground motion maps (see the sketch after this list).

6. Started implementing the ground motion forecast evaluation defined in the 2008 CISN Testing Document.
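Item 5 describes a nightly pull of ShakeMap observations over RSS. A minimal sketch of what such a retrieval job might look like, using only the Python standard library (the feed URL below is a placeholder, not the actual CISN endpoint):

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder feed location; the actual CISN/USGS ShakeMap RSS URL is not
# given in this presentation.
FEED_URL = "https://example.org/shakemap/rss.xml"

def fetch_shakemap_items(url=FEED_URL):
    """Download the ShakeMap RSS feed and return (title, link, pubDate) tuples."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        root = ET.fromstring(resp.read())
    items = []
    for item in root.iter("item"):  # standard RSS 2.0 <item> elements
        items.append((
            item.findtext("title", default=""),
            item.findtext("link", default=""),
            item.findtext("pubDate", default=""),
        ))
    return items

if __name__ == "__main__":
    for title, link, pub in fetch_shakemap_items():
        print(pub, title, link)
```

A cron entry running a script like this once per night would match the cadence described in item 5.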

EEW Testing Center Provides On-going Performance Evaluation

• Performance summaries available through login (www.scec.org/eew)
• Evaluation results for 2010 include 144 M4+ earthquakes in the CA Testing Region
• Cumulative raw summaries (2008-present) posted at scec.usc.edu/scecpedia/Earthquake_Early_Warning


[Figure: CISN Testing Center (CTC) Forecast Evaluation Processing System. The ANSS earthquake catalog is retrieved and filtered into a filtered earthquake catalog. ShakeAlert earthquake parameter forecasts, ShakeAlert ground motion forecasts, and ShakeMap RSS feed ground motion observations feed evaluation tests that compare forecasts against observed ANSS earthquake parameter and ground motion data. Reports are loaded into the CISN EEW Testing Center (CTC) web site, which serves the CISN decision modules and CISN user modules.]
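As a minimal sketch of the catalog-filtering stage in the diagram above (the region bounds and field names are illustrative assumptions, not CISN definitions):

```python
from dataclasses import dataclass

@dataclass
class Event:
    event_id: str
    magnitude: float
    lat: float
    lon: float

# Illustrative bounding box for a California testing region; the real CTC
# region polygon is defined in the CISN testing documents, not here.
REGION = {"lat_min": 31.0, "lat_max": 43.0, "lon_min": -126.0, "lon_max": -113.0}
MIN_MAG = 3.0

def filter_catalog(catalog, region=REGION, min_mag=MIN_MAG):
    """Keep events inside the testing region at or above the magnitude cutoff."""
    return [
        ev for ev in catalog
        if ev.magnitude >= min_mag
        and region["lat_min"] <= ev.lat <= region["lat_max"]
        and region["lon_min"] <= ev.lon <= region["lon_max"]
    ]

# Example: one event inside the region, one below the magnitude cutoff.
catalog = [Event("ci100", 4.2, 34.0, -118.2), Event("ci101", 2.1, 34.1, -118.3)]
print([ev.event_id for ev in filter_catalog(catalog)])  # -> ['ci100']
```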


The current ShakeAlert CTC retrieves ShakeMap RSS data and plots observations for all M 3.0+ earthquakes in the California Testing Region.


The initial CTC evaluation test is defined in the 2008 CISN EEW Testing Document (as updated in July 2010). The previous Algorithm Testing Center did not implement this summary; access to ShakeMap RSS ground motion observations now makes an automated implementation practical.

Scientific and Technical Coordination Issues

1. Prioritization of the forecast evaluation tests to be implemented.

2. SCEC science planning of EEW forecast evaluation experiments.

3. Use of EEW in time-dependent PSHA information.

4. Consider extending the ShakeMap format as a CAP-based forecast exchange format (see the sketch after this list). Send forecast information (and the time of the report) to produce:
– ShakeMap intensity maps
– ShakeMap uncertainty maps

5. Consider ShakeAlert interfaces to support comparative EEW performance tests, providing access to the following information for each trigger:
– Stations used in the trigger
– Stations available when declaring the trigger
– Software version declaring the trigger
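For item 4, a rough sketch of how forecast parameters might be wrapped in a CAP-style XML message. The element names follow the CAP 1.2 vocabulary, but the sender, parameter names, and values are hypothetical, and the result is not a schema-complete CAP document:

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"

def build_forecast_alert(event_id, magnitude, pgv_cm_s):
    """Wrap hypothetical forecast parameters in a CAP 1.2-style <alert> element."""
    q = lambda tag: f"{{{CAP_NS}}}{tag}"  # namespace-qualified tag name
    alert = ET.Element(q("alert"))
    ET.SubElement(alert, q("identifier")).text = event_id
    ET.SubElement(alert, q("sender")).text = "ctc@example.org"  # placeholder sender
    ET.SubElement(alert, q("sent")).text = datetime.now(timezone.utc).isoformat()
    ET.SubElement(alert, q("status")).text = "Test"
    ET.SubElement(alert, q("msgType")).text = "Alert"
    ET.SubElement(alert, q("scope")).text = "Private"
    info = ET.SubElement(alert, q("info"))
    ET.SubElement(info, q("event")).text = "Earthquake early warning forecast"
    # Forecast values travel as generic CAP <parameter> name/value pairs.
    for name, value in (("magnitude", magnitude), ("pgv_cm_s", pgv_cm_s)):
        param = ET.SubElement(info, q("parameter"))
        ET.SubElement(param, q("valueName")).text = name
        ET.SubElement(param, q("value")).text = str(value)
    return ET.tostring(alert, encoding="unicode")

print(build_forecast_alert("ci12345678", 5.2, 8.4))
```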

Proposed CTC Evaluation Tests

Design of an Experiment

Rigorous CISN EEW testing will involve the following definitions (captured as a single configuration object in the sketch after this list):
– Define a forecast
– Define a testing area
– Define the input data used in forecasts
– Define the reference observation data
– Define measures of success for forecasts
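A minimal sketch capturing those five definitions as one configuration object (all names and example values below are illustrative, not CISN-prescribed):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ExperimentDefinition:
    """One forecast-evaluation experiment, pinned down by the five definitions."""
    forecast_name: str                 # which forecast is under test
    testing_area: str                  # named testing region
    input_data: str                    # data streams the forecast may consume
    reference_data: str                # authorized observations used for scoring
    success_measures: tuple[str, ...]  # evaluation tests applied

# Example values are illustrative, not CISN-prescribed.
example = ExperimentDefinition(
    forecast_name="ShakeAlert ground motion forecast",
    testing_area="California Testing Region",
    input_data="real-time CISN waveforms",
    reference_data="ANSS catalog + ShakeMap RSS ground motions",
    success_measures=("Summary 1: Magnitude", "Summary 3: Ground Motion"),
)
print(example)
```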

Proposed Performance Measures

Summary reports are produced for each event with M ≥ M-min. The key document is the 3 March 2008 testing document, which specifies six types of tests:
– Summary 1: Magnitude
– Summary 2: Location
– Summary 3: Ground Motion
– Summary 4: System Performance
– Summary 5: False Triggers
– Summary 6: Missed Triggers

Experiment Design

Summary 1.1: Magnitude X-Y Diagram
Measure of Goodness: Data points fall on the diagonal line
Relevant: T2, T3, T4
Drawbacks: Timeliness is not represented; it is unclear which estimate in the series of magnitude updates should be used in the plot.
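A sketch of how the X-Y diagram could be produced with matplotlib; the magnitude pairs are made-up illustrative values, with the perfect-forecast diagonal drawn for reference:

```python
import matplotlib.pyplot as plt

# Illustrative forecast/observed magnitude pairs; real inputs would come
# from the CTC result database.
observed_mag = [4.0, 4.8, 4.9, 5.5, 6.0]
forecast_mag = [4.1, 4.6, 5.0, 5.3, 6.1]

fig, ax = plt.subplots()
ax.scatter(observed_mag, forecast_mag)
lims = [3.5, 6.5]
ax.plot(lims, lims, linestyle="--")  # perfect-forecast diagonal
ax.set_xlim(lims)
ax.set_ylim(lims)
ax.set_xlabel("Observed (ANSS) magnitude")
ax.set_ylabel("Forecast (ShakeAlert) magnitude")
ax.set_title("Summary 1.1: Magnitude X-Y diagram")
plt.show()
```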


Summary 1.2: Initial Magnitude Error by Magnitude
Measure of Goodness: Data points fall on a horizontal line
Relevant: T2, T3, T4
Drawbacks: Timeliness is not represented.


Summary 1.3: Magnitude Accuracy by Update
Measure of Goodness: Data points fall on a horizontal line
Relevant: T3, T4
Drawbacks: Timeliness is not represented.



Summary 2.1: Cumulative Location Errors
Measure of Goodness: Data points fall on the vertical zero line
Relevant: T3, T4
Drawbacks: Does not consider magnitude accuracy or timeliness.

Summary 2.2: Magnitude and Location Error by Time after Origin
Measure of Goodness: Data points fall on the horizontal zero line
Relevant: T3, T4
Drawbacks: Event-specific, not cumulative.
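Both location summaries need a per-event location error. A sketch of that computation as a great-circle (haversine) distance between the forecast and the observed ANSS epicenter (the coordinates in the example are illustrative):

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def epicentral_error_km(lat_f, lon_f, lat_o, lon_o):
    """Great-circle (haversine) distance between forecast and observed epicenters."""
    dlat = radians(lat_o - lat_f)
    dlon = radians(lon_o - lon_f)
    a = sin(dlat / 2) ** 2 + cos(radians(lat_f)) * cos(radians(lat_o)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# Illustrative coordinates near Los Angeles.
print(f"{epicentral_error_km(34.05, -118.25, 34.10, -118.30):.1f} km")
```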


Summary 3.1: Intensity Map Comparisons
Measure of Goodness: Forecast map matches observed map
Relevant: T4
Drawbacks: Not a quantitative result.

Summary 3.2: Intensity X-Y Diagram
Measure of Goodness: Data points fall on the diagonal line
Relevant: T1, T2, T4
Drawbacks: Timeliness is not represented; it is unclear which estimate in the series of intensity updates should be used in the plots (T3).


Summary 3.3: Intensity Ratio by Magnitude
Measure of Goodness: Data points fall on a horizontal line
Relevant: T1, T2, T4
Drawbacks: Timeliness is not represented; it is unclear which intensity estimate in the series should be used in the plot.


Summary 3.3: Predicted-to-Observed Intensity Ratio by Distance and Magnitude
Measure of Goodness: Data points fall on a horizontal line
Relevant: T1, T2, T4
Drawbacks: Timeliness is not represented; it is unclear which intensity estimate in the series should be used in the plot.

Summary 3.4: Evaluate Conversion from PGV to Intensity
The group has proposed to evaluate algorithms by comparing intensities, and it provides a formula for converting PGV to intensity.
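The slides reference the group's conversion formula but do not reproduce it. As a stand-in for illustration, the widely cited Wald et al. (1999) regression relating PGV (in cm/s) to instrumental Modified Mercalli intensity, valid roughly in the MMI V-VIII range, is sketched below; the group's actual formula may differ:

```python
from math import log10

def mmi_from_pgv(pgv_cm_s):
    """Wald et al. (1999) instrumental-intensity relation for PGV in cm/s.

    A stand-in for the group's conversion formula, which the slides
    reference but do not reproduce; intended for roughly MMI V-VIII.
    """
    if pgv_cm_s <= 0:
        raise ValueError("PGV must be positive")
    return 3.47 * log10(pgv_cm_s) + 2.35

for pgv in (1.0, 10.0, 50.0):
    print(f"PGV {pgv:5.1f} cm/s -> MMI {mmi_from_pgv(pgv):.1f}")
```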


Summary 3.5: Statistical Error Distribution for Magnitude and Intensity
Measure of Goodness: No missed events or false alarms in the testing area
Relevant: T4
Drawbacks: none listed.

Summary 3.6: Mean Time to First Location or Intensity Estimate
Measure of Goodness: Peak of the measures at zero
Relevant: T1, T2, T3, T4
Drawbacks: Cumulative; does not involve the accuracy of the estimates.
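A sketch of how this summary could be computed and plotted; the latency values are made-up placeholders for times that would come from CTC trigger logs:

```python
import matplotlib.pyplot as plt

# Made-up alert latencies: seconds from event origin time to the first
# location or intensity estimate; real values would come from CTC logs.
latency_s = [4.2, 5.1, 6.0, 6.8, 7.5, 8.9, 10.3, 12.7, 15.0, 21.4]

fig, ax = plt.subplots()
ax.hist(latency_s, bins=range(0, 26, 2))
ax.set_xlabel("Time from origin to first estimate (s)")
ax.set_ylabel("Number of events")
ax.set_title("Summary 3.6: Time to first estimate")
print(f"Mean latency: {sum(latency_s) / len(latency_s):.1f} s")
plt.show()
```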


Summary 4.1: Ratio of Reporting versus Non-reporting Stations
No example plots are defined for this System Performance summary.


Summary 5.1: Missed Event and False Alarm Map
Measure of Goodness: No missed events or false alarms in the testing area
Relevant: T3, T4
Drawbacks: Definitions of missed events and false alarms must be developed; does not reflect timeliness.
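The noted drawback, that missed events and false alarms still need definitions, can be made concrete. One possible (illustrative, not CISN-adopted) definition: a trigger is a hit if it matches a catalog event within a time and distance window, a false alarm otherwise; an unmatched catalog event is a miss. A sketch, reusing a great-circle distance function like the one above:

```python
def classify_triggers(triggers, events, dist_km, max_dt_s=30.0, max_dist_km=100.0):
    """Split EEW triggers into hits and false alarms; unmatched events are misses.

    triggers, events: lists of dicts with keys 't' (epoch seconds), 'lat', 'lon'.
    dist_km: great-circle distance function, e.g. the haversine sketched earlier.
    The time/distance windows are illustrative choices, not CISN definitions.
    """
    hits, false_alarms, matched = [], [], set()
    for trig in triggers:
        match = None
        for i, ev in enumerate(events):
            if i in matched:
                continue
            close_in_time = abs(trig["t"] - ev["t"]) <= max_dt_s
            close_in_space = dist_km(trig["lat"], trig["lon"],
                                     ev["lat"], ev["lon"]) <= max_dist_km
            if close_in_time and close_in_space:
                match = i
                break
        if match is None:
            false_alarms.append(trig)
        else:
            matched.add(match)
            hits.append((trig, events[match]))
    misses = [ev for i, ev in enumerate(events) if i not in matched]
    return hits, false_alarms, misses
```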


Summary 5.2: Missed Event and False Alarm Map
Measure of Goodness: No missed events or false alarms in the testing area
Relevant: T3, T4
Drawbacks: Definitions of missed events and false alarms must be developed; does not reflect timeliness.


Summary 6.1: Missed Event Map
Measure of Goodness: No missed events in the testing region
Relevant: T3, T4
Drawbacks: A missed event must be defined; does not indicate timeliness.

End
