Title: Stefan Wiemer
1. ZMAP, OpenSHA, OpenSAF?
Stefan Wiemer, Danijel Schorlemmer
Swiss Seismological Service, ETH Zurich
Major contributions by Edward (Ned) H. Field (USGS)
2. Outline
- ZMAP: a 10-year-old idea/software for seismicity analysis.
- OpenSHA: a new concept in seismic hazard assessment.
- OpenSAF: dreaming on.
3. ZMAP
- Developed since 1993 with the intention of providing GUI-based seismicity analysis software. Mostly a research tool.
- Described in a Seismological Research Letters article in 2001.
- Matlab based, open source (about 100,000 lines of code in 700 scripts).
- About 100-150 users worldwide; used in about 50-70 publications.
4. ZMAP: Capabilities
- Standard tools: maps, histograms, cross-sections, time series, etc.
- Earthquake catalog quality and consistency: magnitude shifts, completeness, blast contamination, etc. Real-time potential.
- Rate-change analysis: mapping of rate changes in space-time, and their significance.
- b-value analysis: mapping of b as a function of space and time.
- Aftershock sequence analysis; time-dependent hazard assessment.
- Stress tensor inversion based on focal mechanism data.
- Time-to-failure analysis.
- Fractal dimension analysis: mapping of D.
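The b-value mapping listed above typically rests on the maximum-likelihood estimator of Aki (1965), with Utsu's half-bin correction for binned magnitudes. A minimal sketch in Python (the function name and interface are illustrative, not ZMAP's actual Matlab API):

```python
import math

def b_value_mle(magnitudes, mc, dm=0.1):
    """Maximum-likelihood b-value (Aki, 1965) for events at or above
    the completeness magnitude mc; dm/2 corrects for magnitude binning."""
    above = [m for m in magnitudes if m >= mc]
    mean_m = sum(above) / len(above)
    # b = log10(e) / (mean(M) - (Mc - dm/2))
    return math.log10(math.e) / (mean_m - (mc - dm / 2.0))
```

Mapping b in space and time then amounts to applying this estimator to the events falling in each grid node or time window.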
7. b-values along the SAF: highly spatially heterogeneous
8. Example: Mc after Landers
The magnitude of completeness in the hours and days after a mainshock is considerably higher. Could this be improved?
9. Example: Spatial variability of Mc
- Completeness is temporally and spatially highly heterogeneous.
- A detailed Mc(x,y,z,t) history should be constructed; maintained by the networks?
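One simple way to populate such an Mc(x,y,z,t) history is the maximum-curvature approach: take the most populated bin of the frequency-magnitude distribution as a first estimate of Mc, usually adding a small empirical correction because this method tends to underestimate completeness. A sketch under those assumptions (names and the 0.2 correction are illustrative):

```python
from collections import Counter

def mc_max_curvature(magnitudes, dm=0.1, correction=0.2):
    """First-order magnitude of completeness: the magnitude bin with
    the highest event count, plus an empirical correction term."""
    counts = Counter(round(m / dm) * dm for m in magnitudes)
    mc_raw = max(counts, key=counts.get)  # most populated bin
    return round(mc_raw + correction, 1)
```

Applied per grid node and time window, this yields the kind of Mc map shown on this slide.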
10. Example: Parkfield magnitude shift?
- What happened around 1995 to the catalog of the Parkfield section of the San Andreas fault?
- Catalogs should be monitored routinely in the future to detect man-made (and natural) transients early on.
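Routine monitoring for such transients could be as simple as a two-sample z-test on mean magnitude before and after a candidate change point, with |z| well above 2 flagging a suspicious shift. This is a sketch of one plausible check, not the test actually applied to the Parkfield catalog:

```python
import math

def magnitude_shift_z(before, after):
    """z statistic for a shift in mean magnitude between two catalog
    windows, assuming roughly independent samples."""
    n1, n2 = len(before), len(after)
    m1, m2 = sum(before) / n1, sum(after) / n2
    v1 = sum((m - m1) ** 2 for m in before) / (n1 - 1)  # sample variances
    v2 = sum((m - m2) ** 2 for m in after) / (n2 - 1)
    return (m1 - m2) / math.sqrt(v1 / n1 + v2 / n2)
```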
11. ZMAP: What worked well
- Matlab based: efficient development, expandable, widely available, largely platform independent.
- Addresses a definite need in the seismological community.
- Nice research tool for those who know how to use it.
12. ZMAP: Limitations
- Too complex. Not stable enough.
- No systematic user support (lately: very limited support).
- No dedicated financial support to develop and maintain the software.
- Difficult to embed other codes (wrappers work, sort of, e.g., stress tensor inversions).
- Does not work in parallel mode.
13. ZMAP: Summary
- Has it reached the end of its lifecycle?
- What would a new-generation seismicity analysis software do?
- Can we make it grid based? (Simulations can take days to weeks.)
- Can we make it object oriented?
14. Creating a Distributed, Community-Modeling Environment in Support of the Working Group for the Development of Regional Earthquake Likelihood Models (RELM)
Edward (Ned) H. Field (USGS), Thomas H. Jordan (USC)
15. OpenSHA: A Developing, Distributed Community-Modeling Environment for Seismic Hazard Analysis
Design criteria: open source, web enabled, object oriented.
Implementation: Java and XML, although the framework is programming-language independent, and some components will be wrapped legacy code (e.g., WG99 Fortran code).
16. Source, Attenuation, Site, Hazard (diagram slide)
17. Seismic Hazard Analysis
(1) Earthquake-Rupture Forecast: probability in time and space of all M ≥ 5 ruptures.
(2) Ground-Motion Model: attenuation relationships; full waveform modeling.
18. OpenSHA
Web site: http://www.OpenSHA.org
SHA framework: SRL submission (Field, Jordan, Cornell)
Design evaluation: SCEC Implementation Interface
Code development: Ned Field, Sid Hellman, Steve Rock, Nitin Gupta, Vipin Gupta
Validation: PEER Working-Group test cases
20. OpenSHA Objects
Desired output is the probability that something of concern will happen over a specified time span.
(Object diagram: a Time Span; an Earthquake-Rupture Forecast that generates rupture Sources, each Source_i holding N Earthquake Ruptures Rup_n,i; a Site; and an Intensity-Measure Relationship with an IM Type and Level, yielding a probability of occurrence.)
21. OpenSHA Objects
Intensity-Measure Type/Level: a specification of what the analyst (e.g., engineer) is worried about.
(Same object diagram as slide 20.)
22. OpenSHA Objects
Site and Prob. Eqk Rupture: the two main physical objects used in the analysis.
(Same object diagram as slide 20.)
23. OpenSHA Objects
Intensity-Measure Relationship: one of the major model components (a variety available or being developed).
(Same object diagram as slide 20.)
24. OpenSHA Objects
Eqk Rupture Forecast: the other main model component (a variety being developed in RELM).
(Same object diagram as slide 20.)
25. Web-Based Tools for SHA
- Time Span
- Earthquake-Rupture Forecast: list of adjustable parameters
- Intensity-Measure Relationship: list of supported intensity-measure types; list of site-related independent parameters
- Site: location; list of site-related parameters
- Intensity-Measure Type and Level (IMT, IML)
- Hazard calculation: Prob(IMT ≥ IML)
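The hazard calculation combines these pieces: the forecast supplies each rupture's occurrence probability over the time span, and the intensity-measure relationship supplies the conditional probability of exceeding the chosen IML at the site. Assuming independent ruptures, the exceedance probability is 1 minus the product of the per-rupture non-exceedance probabilities. A sketch (the dictionary-based interface is illustrative, not the OpenSHA Java API):

```python
def prob_exceedance(ruptures, iml):
    """P(IMT >= IML) over the time span: 1 - product over ruptures of
    (1 - P(rupture occurs) * P(IM >= iml | rupture))."""
    p_no_exceed = 1.0
    for rup in ruptures:
        p_no_exceed *= 1.0 - rup["prob"] * rup["p_exceed"](iml)
    return 1.0 - p_no_exceed
```

Evaluating this over a range of IML values gives the familiar hazard curve for the site.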
26. Source List
- Time Span
- Network earthquake catalog
- Fault activity database
- Earthquake forecast
- GPS data (velocity vectors)
- Historical earthquake catalog
- Community Fault Model
27. OpenSHA
We want the various models and community databases to reside at their geographically distributed host institutions, and to be run-time accessible over the internet. This is an absolute requirement for making the community-modeling environment both usable and manageable.
28. OpenSHA
Building this distributed, community-modeling environment raises several issues that we don't presently know how to deal with:
- The distributed system must be easy to use, which means hiding details as much as possible.
- Analysis results must be reproducible, which means something has to keep track of all those details.
- Computations must be fast, as web-based users aren't going to want to wait an hour for a hazard map or synthetic seismograms.
- We'll need a mechanism for preventing erroneous results due to unwitting users plugging together inappropriate components.
30. The SCEC ITR collaboration is helping (a few examples and lots of ...)
Grid computing: to enable run-time access to whatever high-performance computing resources are available at that moment.
31. The SCEC ITR collaboration is helping (a few examples)
Knowledge Representation and Reasoning (KRR): to keep track of the relationships among components, and to monitor the construction of computational pathways to ensure that compatible elements are plugged together.
32. The SCEC ITR collaboration is helping (a few examples)
KRR and digital libraries: to enable smart eDatabase inquiries (e.g., so code can construct an appropriate probability model for a fault based on the latest information found in the fault activity database).
33. The SCEC ITR collaboration is helping (a few examples)
Digital libraries: to enable version tracking for purposes of reproducibility in an environment of continually evolving models and databases.
34. OpenSHA: A Community-Modeling Environment for Seismic Hazard Analysis (summary)
- An infrastructure for developing and testing arbitrarily complex (physics-based, system-level) SHA components, while putting minimal constraints on (or additional work for) the scientists developing the models.
- Provides a means for the user community to apply the most advanced models to practical problems (which they cannot presently do).
35. OpenSHA
More info available at http://www.OpenSHA.org
37. Back to good old Europe
- What can we learn from OpenSHA for ZMAP?
38. NERIS offered an opportunity
- N6, Task B: Building the foundation for a community-based Seismicity Analysis Framework (OpenSAF).
- The information contained in modern earthquake data sets is currently exploited by seismologists using a variety of independent tools (e.g., SSLib, ZMAP, Wizmap, GMT, Slick, Coulomb 2.2) which have no interoperability or standardization. Better and more efficient exploitation of this information requires an integrated set of modern, interactive, easy-to-use and accessible tools for visualization, quality assessment, data mining, statistical modeling, quantitative hypothesis evaluation and many other tasks. Such integration could be provided by a seismic data analysis framework (OpenSAF): a centralized, Internet-ready platform for accessing visualization and analysis tools. OpenSAF would be designed to interoperate closely with OpenSHA.
39. The Future
- I learned I am more objective-oriented than object-oriented.
- Developing OpenSAF in Java (or similar) would, in our opinion, be a laudable objective; however, it would require a sustained effort and significant financial support. Is it worth it in this case? Or should we stick to a high-level language?
- Where could the support come from? How can one make it a community-supported, sustainable effort?
40. The Future
- The alternative might be a new, modular, Matlab-based research program that avoids the mistakes of the old ZMAP, with the ability to build stand-alone, streamlined modules for specific tasks (monitoring of completeness, rate changes, artifacts, ...). A license fee from users that raises about one man-year of funding might be feasible.

The End