
1. Introduction

Many destructive earthquakes have happened in the past – catastrophes from which we have learned more and more about both their occurrence and the behavior of buildings under shaking. Yet earthquakes in the 21st century have already caused a death toll of almost 600 000 (), secondary effects such as tsunamis excluded. And there is likely to be no end to this, because population keeps flowing into the world's cities, where buildings are constructed at high density and often with poor quality. Viewed globally, this becomes a major problem, since many megacities with rapidly growing populations lie in high seismic risk zones along the plate boundaries ().

On a geological time scale, big earthquakes happen continuously. From a human point of view, however, they are very rare and recur, if at all, over many generations. That is why the cultural memory of risk is lost very easily. This attitude will not change unless seismic hazard is understood properly and conveyed to the people who live in such high-risk areas. Therefore, to take further steps, risk awareness in society has to be raised globally (). Without this, nothing will change regarding the rate of deaths and destruction in the near future.

In Switzerland,

Earthquakes cannot be predicted. Therefore, prevention is a crucial necessity to limit the degree of loss in high-risk areas (Wiemer2013SED).

Such considerations are important for the Basel area, as shown hereafter.

1.1. Basel 1356

In 1356, the most devastating earthquake in the recorded history of central Europe occurred in Basel, Switzerland (Figure 1). According to later studies¹, the Basel earthquake caused a maximum intensity of IX (see Appendix ?? for an interpretation of this intensity) and had a magnitude ranging between Mw 6.7 and 7.1 (). The maximum intensity and magnitude estimates were obtained by applying elaborate techniques to historical documents and descriptions of earthquake observations. Of course, because these reports were created by historians several centuries after the earthquake occurred, some skepticism is in order – the reported effects of an earthquake are often amplified and distorted for religious or political purposes. The following description of the Basel quake is mainly based on the work of ², which rests on approximately 20 contemporary eyewitness reports made shortly after the seismic event.

The mainshock shaking was reported to have damaged chimneys, parts of facades, high sections of the city fortification, the city cathedral, and other churches. Additional damage was caused by fire, which spread easily from open fireplaces due to collapsed straw roofing and destroyed further structures within the city walls that had withstood the shaking (). The alleys were piled high with debris and people were still threatened by the risk of collapsing walls. Although archaeological investigations showed that some buildings must have remained undamaged, the city is reckoned to have been uninhabitable at this point.

¹ The SED revised its earthquake and macroseismic database in the 1990s, resulting in (), the Earthquake Catalog of Switzerland, which contains many of the now known historical earthquakes.
² This book covers all known major historical earthquakes in Switzerland with many comments and investigations.


The number of fatalities is poorly known, but it is assumed that most of the estimated 7 000 inhabitants fled the city because of foreshocks. Moreover, a detailed description of damage is only available for the city of Basel and not for the surrounding regions. Considerable damage was, however, also sustained by churches, castles and monasteries around Basel – in particular, between 46 and 80 seriously damaged castles within a 30 km range³. This is evidenced either by archaeological findings or by documented building work undertaken some years after the event. There is even a note of a felt report from Paris, approximately 400 km away, where church bells rang.

Fig. 1: This portrayal (originally a woodcarving) first appeared in Sebastian Münster's «Cosmographia» (1550) and gives an idea of the damage caused by the 1356 Basel earthquake.

The tremor was the result of active normal faulting in the upper Rhine rift valley (). This fracture is caused by the tectonic stresses induced by the Alpine orogeny, an effect of the African plate drifting north towards the Eurasian plate. As this process is still going on and past earthquakes have already weakened the brittle rocks within the active fault zone, it is thought to be very likely that an earthquake of similar size could occur on the same fault⁴ again; () speaks of a return period of 2 000 to 2 500 years⁵, a comparatively long seismic cycle. These findings may partly rest on the paleoseismic identification (see ) of three ruptures in the past 8 500 years in the area of Basel, which must have created a cumulative vertical displacement of 1.8 m. This corresponds to five to eight 1356-type earthquakes over the last 10 000 years. Hence, the potential for strong shaking, and therefore for damage and losses, is still present, although the Basel region is characterized by low seismicity overall.

After the earthquake, the city was rebuilt rapidly within two decades and returned to daily life surprisingly fast ().

³ During the Middle Ages, the upper Rhine valley was among the regions with the largest number of castles in Europe ().
⁴ This fault is estimated to be 15 to 20 km long. Assuming measured vertical displacements of 0.5 to 0.8 m over the entire seismogenic thickness of 15 km yields a possible Mw 6.5 earthquake ().
⁵ Whereas a similar event anywhere within Switzerland is assumed to occur every 1 000 years ().


This quick recovery may be explained by the city's relative wealth, but also by the active self-initiative of the residents as well as of neighbors and nearby cities, who sent support and aid money. The fact that Basel's neighbors sent aid suggests that the devastation was not widespread. At that time nobody knew the causes of such an earthquake, which is why builders gave no thought to earthquake-resistant design. Instead, natural phenomena were interpreted as signs of divine wrath and punishment. The city even went so far as to enact a collective penance and general abstinence ().

Today we know better. But hazard evaluation for this very area is still difficult, so it behooves us to consider the case of a repeat of the 1356 earthquake.

1.2. The SEISMO-12 Scenario

As stated in (), conducting a disaster scenario exercise can be useful in several respects. Such an exercise prepares government officials, policy makers, responder organizations and the general public to deal with an emergency. Getting involved in such an undertaking helps participants learn not only how to react, but also how to plan for and mitigate this specific hazard (risk education). Regarding an earthquake in the Basel region, residents can neither recall past damage nor imagine the effects of such a disaster. That is why the scenario should be designed so that as many people as possible, even if not directly contributing, are made aware of the potential risk. This can easily be done with supportive media coverage; but feelings associated with hazard experience will not be properly evoked, which is important for raising risk perception (). However, to facilitate useful training, the disaster scenario should follow a genuine and authentic course of action.

Such a scenario, called SEISMO-12, was recently conducted in Basel: In May 2012, the Swiss Federal Office for Civil Protection initiated and managed a major campaign simulating a damaging earthquake comparable to the one that struck Basel in 1356. The reported historical testimonies help to reconstruct the kind of damage we would experience today: According to a damage simulation () carried out specifically for this exercise with an assumed magnitude of Mw 6.6, approximately 6.2 million people would be affected in the region. The losses are assumed to reach up to 6 000 deaths, 18 000 seriously injured, 30 000 missing and 45 000 slightly injured people. More than 1 600 000 people would be temporarily homeless, and 750 000 would be permanently displaced. It is estimated that 750 000 buildings would be damaged, 160 000 of which would be uninhabitable and 12 000 of which would collapse.⁶ A similar disaster happened in Kobe (Japan) in 1995, where complete rebuilding took 5 years and cost 200 billion USD.

To make this exercise realistic, a range of Civil Protection partners such as the Swiss Army and several cantonal executive staffs participated (). Even the neighboring countries Germany and France were involved in order to enhance international cooperation. Their crisis handling was tested by confronting them with various challenges, e. g., outages of response forces, energy, communication and traffic ().

The whole exercise had been in preparation for over two years, with hundreds of competent experts engaged in developing a cutting-edge scenario that required a comprehensive script. The main objective was to exit the crisis towards reconstruction.

⁶ These numbers are to be taken only as orders of magnitude.


Therefore, different events had to be managed, principally by going through a transition from daily life through an emergency phase into a crisis.

The actual command post exercise lasted two days around the clock and focused on effective emergency management after an earthquake. During this time, directors issued tasks and scenario details to their teams from the headquarters (). Confronted with a continuous flow of problems, the teams had to make quick decisions on how to react to buried survivors, power outages and fires. An evaluation of the performance followed. The exercise was completed by a crisis management meeting later that year, during which strategies to recondition the region were developed.

The repetition (and evaluation) of exercises like SEISMO-12 is an absolute requirement for ensuring effective Civil Protection in case of emergency – also because practising joint collaboration between the authorities involved builds trust. In this case, the exercise was helpful not only for Basel but for the whole country, since severe earthquakes are the natural threat with the greatest damage potential in Switzerland (). Consequently, disaster preparedness is an important step towards reducing the damage potential of seismic shaking. The Swiss federal authorities started an earthquake mitigation programme in 2000 () focusing on exactly this subject through prevention measures. It aims to implement earthquake-proof construction and to provide information and tools to cantons and communities.

1.3. Current Situation

Systematic assessment of real-time seismic risk, in the sense of forecasting the damage to society, is still very uncommon ().

However, an a posteriori seismic risk assessment was recently conducted for Basel (), after an earthquake with a local magnitude of ML 3.4 was induced in December 2006⁷. The shaking was widely felt⁸ within the city of Basel and led to increased public awareness. While the geothermal project was on hold, an independent risk analysis study by a consortium of seismologists and engineers was undertaken to investigate whether the project could be continued, in terms of financial loss, number of damaged houses and probability of fatalities. For this purpose, a further stimulation and operational phase was considered, and the study concluded that the risk would be unacceptable. The project was therefore shut down, meaning that the geothermal project ultimately failed. It is worth noting that, according to this study, such geothermal activities would barely influence the probability of large natural events like the 1356 Basel earthquake, because the imparted stress is assumed to be very small.

The SED participated in monitoring the seismicity during the drilling and stimulation phase and made efforts to develop tools for induced seismicity hazard assessment. One example is the introduction of real-time statistical forecast models for geothermal systems (), describing time-dependent seismic hazard.

⁷ The injection – necessary for stimulating the reservoir of the Enhanced Geothermal System (EGS) – had already been stopped several hours before, after a ML 2.7 event took place.
⁸ The earthquake caused slight non-structural damage with an insured amount of 7 million USD ().


More recently, () quantified the seismic risk of induced seismicity by incorporating a well-established loss estimation routine and the available building inventory data of Basel. Such developments can serve as input for an advanced alarm system, which would improve decision making in future EGS projects.

1.4. Purpose

The main objective in this thesis is the development

1.5. Approach

including hazard (section 3.1), exposure and vulnerability (section 3.2)

The following approach is based on the STEER code (van Stiphout), which consists of three major modules: (1) calculate the probability of occurrence of triggered earthquakes, (2) determine the probabilistic seismic risk (PSRA) as a probabilistic loss curve (PLC) by combining the results of the first step with the loss estimation, and finally (3) use the PLCs in a cost-benefit analysis (CBA) to decide on mitigation actions.
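To illustrate how data might flow through these three modules, the sketch below is a minimal Python mock-up with hypothetical function names and placeholder numbers; it is not the STEER implementation itself (the codes this thesis builds on are written in Matlab), only a structural illustration of the module chain under those assumptions.

    # Structural sketch of a STEER-like processing chain (hypothetical names).

    def forecast_triggered_rates(catalog, horizon_days):
        """Module 1: expected number of triggered events per magnitude bin."""
        # Placeholder: a real implementation would apply a clustering model
        # (e.g., Reasenberg & Jones, section 2.3) to the recent catalog.
        return {5.0: 0.02, 6.0: 0.002, 7.0: 0.0002}

    def probabilistic_loss_curve(rates, exposure_factor):
        """Module 2: combine event rates with a loss estimate into a PLC."""
        assumed_loss_per_event = {5.0: 1e6, 6.0: 1e8, 7.0: 1e9}   # illustrative
        return [(assumed_loss_per_event[m] * exposure_factor, r)
                for m, r in sorted(rates.items())]

    def mitigation_is_worthwhile(plc, mitigation_cost, loss_reduction):
        """Module 3: simple cost-benefit test (CBA) of one mitigation action."""
        expected_loss = sum(loss * rate for loss, rate in plc)
        return expected_loss * loss_reduction > mitigation_cost

    rates = forecast_triggered_rates(catalog=[], horizon_days=1)
    plc = probabilistic_loss_curve(rates, exposure_factor=1.0)
    print(mitigation_is_worthwhile(plc, mitigation_cost=5e4, loss_reduction=0.3))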

As suggested by (), the next sequential improvement is replacing the simple Reasenberg & Jones model (RJ) by ...

This, of course, requires solving new problems, but will hopefully lead to more refined results in the end.

The chapters and sections are arranged according to the order in which the data is passed through.

Compared to (), I reconsidered the term "Loss Estimation Part", which is used when accessing the QLARM routine. As the output is not yet linked with the probabilistic hazard, the term "Vulnerability and Exposure Assessment" is more adequate and will be used as a replacement hereafter. Compared to (), I would also like to discriminate between (1) the preceding calculation of losses for specific intensities done by the QLARM routine and (2) the final combination of (1) with the probabilistic hazard, by introducing two separate terms: loss estimation and loss assessment. This should help to clarify the difference between them and will be used hereafter.

The codes that were made available by vanStiphout were implemented in Matlab⁹. Thus, it seemed natural to continue with this easy-to-learn programming language.

⁹ Matlab is a commercial software package primarily designed for solving numerical problems using matrices. It features integrated plotting and debugging functionality and has its own programming language.


2. Probabilistic Earthquake Forecasting

2.1. Foretelling Seismicity

The prediction of earthquakes is probably the most commonly expected advance that earth scientists could achieve. Without revealing too much for now, we can say that this is not yet possible. However, an alternative approach to describing future seismicity has emerged: earthquake forecasting. Both predictions and forecasts contain specific statements about the location, occurrence time and size of impending seismic events. In earthquake science, a prediction asserts that a target event will occur (or will not occur) within a specific region and time (); predictions are therefore defined as deterministic or categorical statements (; ). A forecast, on the other hand, involves a probabilistic statement that such an event will occur in a space-time domain (). In effect, a prediction carries a higher probability and is more sharply delineated than a forecast; the latter is characterized by relatively low probabilities and is often issued over longer time scales. This terminology differs from everyday English, in which the two terms are roughly interchangeable ().

Being able to reliably provide information about future earthquake activity is essential for reducing seismic risk () and for planning rational risk mitigation actions (). As this is of high interest, many scientists have dedicated their research to providing information about upcoming fault ruptures.

Earthquake Prediction

Deterministic earthquake prediction was a primary goal in seismology for several decades. Various prediction programs started in the 1960s and 1970s in the U.S., China, Japan and the USSR (), aiming to identify precursors to earthquakes or to understand the seismic cycle. A precursor is a signal supposed to be observable before an impending earthquake. In order to reliably predict an event's location, time and magnitude, i. e., to ensure high probability and low error rates, the precursor should be diagnostic (). The common strategy is to examine the correlation of a precursor with an ongoing, unknown process in advance of a fault rupture. Many precursors have been studied so far, whether with a physical background (e. g., strain-rate changes, seismic velocity changes, hydrological changes, electromagnetic signals), a chemical background (e. g., radon emission), a biological background (e. g., anomalous animal behavior) or related to seismicity itself (e. g., seismicity patterns, accelerating moment release). A description of these and others can be found in (; ).

In the 1970s, optimism was high that within a decade seismologists would be able to reliably predict future earthquakes (), especially after a successful prediction of the 1975 Haicheng (China) earthquake saved many lives (; ). Its prediction process remained mysterious for a long time: it was believed that anomalies such as changes in groundwater level or strange animal behavior (Fig. 2) aided the final evacuation decision ().

¹⁰ Moreover, the foreshock sequence within 24 hours before the main shock was indicative of the approximate location of the impending earthquake ().


However, () showed that foreshocks¹⁰ alone led to the issuance of a warning by the government on the day of the earthquake, as they had already frightened the public. Nevertheless, substantial programs were initiated in the aforementioned countries (). Researchers were understandably enthusiastic about predicting destructive earthquakes in order to prevent losses.

Fig. 2: Anomaly reports based on interviews of eyewitnesses right after the Haicheng earthquake, which occurred

on February 4, 1975. For a long time, it was believed that such precursors were used for a successful prediction

and evacuation. Taken from .

But since further success proved elusive and most claimed predictions turned out to be false alarms, belief in the ability to predict was widely lost. Although the search for precursors was far-reaching, none has proved reliable. They often produced controversial results and showed high error rates (). It was common to publish findings restricted to selected cases or to claim in retrospect that a clear signal responsible for a particular earthquake had been found (), thus emphasizing positive rather than negative results. Moreover, substantial variation in the signal was often accepted, and the correlation between proposed precursors and subsequent earthquakes was not adequately tested ().

The report of the International Commission on Earthquake Forecasting for Civil Protection () notes that such a "silver bullet" approach has not yet produced a successful earthquake prediction scheme with diagnostic capabilities. However, precursory research has greatly helped to gain new insights into the underlying earthquake process. Further research on this topic should therefore not be categorically excluded.

Earthquake Cycle

Recently, the term seismic cycle and its associated terms characteristic earthquake and seismic gap were critically examined (; ; ). Ideally, an isolated fault subjected to tectonic loading would build up shear stress until it suddenly ruptures, after which the shear stress begins to increase again (elastic rebound theory). Accordingly, repeating earthquakes on the same fault would appear to be characteristic (having similar properties and a periodic recurrence interval) and would dominate the displacement on the fault with the maximum magnitude. This view further implies the invalidity of the Gutenberg–Richter relationship (see section 2.3), as large events would occur at higher rates than expected from small ones (Fig. 3).

In reality, earthquakes rarely repeat at quasi-constant time intervals and are more likely to vary irregularly due to incomplete stress release (no complete relaxation), variation of the rupture area and interactions with other fault segments (complexity of fault segments) ().



Fig. 3: Sketch of magnitude–frequency distributions (FMD). A typical Gutenberg–Richter relationship following the linear power law is shown in black; it can be derived from instrumentally or historically recorded earthquakes. Characteristic earthquakes, by contrast, are assumed to occur at a higher rate, resulting in a bump at larger magnitudes (gray).

Three prospective prediction experiments¹¹ from the past, based on the characteristic earthquake hypothesis, are briefly reviewed in (): all of them failed, as no target event happened, or "repeated", in the predicted time window. Many statistical tests of such predictions revealed performance worse than random guessing (). This applies to the "seismic gap" hypothesis as well, which has not yet been successful in identifying future earthquake locations (; ).

The seismic gap hypothesis depends entirely on the previous remarks and assumes that faults with no recent earthquakes (relative to their "recurrence interval") are most likely to rupture in the future – the supposed seismic gap. "Gap models assume quasi-periodic behavior of something, and that something must be characteristic earthquakes [representing, again,] the largest possible on a segment" (). Because a region would afterwards be assumed to be free of future shocks, a mistaken all-clear signal would likely be issued. Moreover, recent verification suggests a generally lower seismic potential in the gaps, whereas active segments have a higher potential, implying that recent events do not make plate boundaries safer ().

Well-known counterexamples, aptly called "uncharacteristic" earthquakes, tell a different story: the disastrous 2004 Sumatra and 2011 Tōhoku mega-thrusts ruptured along several fault segments assumed to be separated from one another (). According to (), events with large magnitudes near Basel must also be called "uncharacteristic", as paleoseismic investigations showed a lower rate compared to historical records (Fig. 4 and ). Compared to the return periods mentioned in subsection 1.1, extrapolating the activity rate of smaller earthquakes would result in a fourfold increase of events of the same kind over the 10 000-year period. Although paleoseismic data may give further insight into large events over longer time intervals, it is difficult to treat them as established facts: () argues that subjective sample collection and possibly insignificant sedimentation caused by historic events leave the evidence insufficient. Keeping to the identifications found by () for Basel, the events date back to 1440 A.D., 750 A.D., 525 B.C. and 6345 B.C. (mean values), yielding mean recurrence intervals of 690, 1275 and 5820 years.

¹¹ This includes the most noted prediction experiment at Parkfield, California. The supposed recurrence intervals were much shorter (around 20 years) than for a 1356-type Basel earthquake, so apparently enough historical records were available on which the predictions could rely.


Fig. 4: Taken from . This shows the cumulative frequency-magnitude distribution of seismicity in the greater

Basel region of the past 1000 years. Historical (from the ECOS catalog) and paleoseismic data show similarities

at low magnitudes but diverge at the higher end. This becomes more evident for return periods greater than the

475-year interval.

Normalizing to a 10 000-year time interval would indeed yield about 5 events – but these inter-event durations surprisingly increase with event age, possibly indicating a decreasing detection quality for older events.

Based on the above considerations, we cannot be sure when the next earthquake will rupture the identified fault near Basel. This is a common problem with predictions: we may have a vague idea of how frequently events will occur given geologic or possibly instrumental data (), but not of their exact time. On the other hand, we have a better understanding of where large earthquakes are likely to happen, although there have been surprises as well, as discussed later.

Probabilistic approaches

Owing to the currently limited scientific understanding of the earthquake process, we are not able to reliably predict large earthquakes (). For this reason, seismologists have widely abandoned further attempts at deterministic prediction and have focused, over the last 15 years or so, on the development of probabilistic models for earthquake forecasting (). It is now a rapidly evolving field of earthquake science (). Probabilistic forecasting models yield prospective statements of earthquake occurrence across different time intervals, ranging from short term (months or less) to long term (several years or decades) (), where even low probabilities can be useful.

Long-term models are also referred to as time-independent models, because earthquakes are described as a random (Poisson) process in time and therefore as independent of past events. A long-term forecast is an empirical description of the observed seismicity () and thus depends solely on the long-term rates of the target events. Although the probabilities of large events are small over time windows that are only a fraction of their assumed recurrence intervals (), they can, to some extent, already be used as valuable input to protect against earthquake damage – such as guiding the safety provisions of building codes – because the future behavior of seismicity can be approximately determined.

Forecast probabilities can be combined with seismic wave propagation and site effects to calculate the expected occurrence of ground shaking in a specific time period, the so-called seismic hazard (see subsection 3.1). If the probabilities correspond to a long time period, they belong to the domain of seismic hazard analysis (). Seismic hazard maps are traditionally used to forecast the ground motion to be expected in a time interval of 30 years or more. They incorporate the knowledge we have about historical seismicity, restated as a time-independent seismic hazard. Thus, they tend to create the impression of an underlying earthquake occurrence process that is constant over time ().
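As a worked example of the standard assumption behind such maps – Poissonian occurrence of the target ground-motion level with return period T – the exceedance probability P over an exposure time t, and the return period implied by the common "10% exceedance in 50 years" convention (see also footnote 12 below), are related by

    P = 1 - e^{-t/T}
    \quad\Longrightarrow\quad
    T = \frac{-t}{\ln(1 - P)} = \frac{-50\ \mathrm{yr}}{\ln(1 - 0.1)} \approx 475\ \mathrm{yr}.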

Nevertheless, recent analysis has shown that the Global Seismic Hazard Assessment Program (GSHAP)¹² consistently underestimates the actual intensity of strong earthquakes (; ). For all 60 events with magnitudes M ≥ 7.5 that have happened since 2000, the maximum observed intensity was larger than expected, and these earthquakes thus appeared as "surprises"¹³ for the GSHAP map; this includes the 12 deadliest earthquakes of this period, for example the Mw 7.0 Haiti earthquake (Fig. 5).


Fig. 5: This map features GSHAP hazard data produced prior to the Mw 7.0 2010 Haiti earthquake. The hazard is depicted as the peak ground acceleration (PGA) with a return period of 475 years (see footnote 12). The GSHAP data were derived from www.seismo.ethz.ch/static/gshap and the plotted tectonic boundaries from www.ig.utexas.edu/research/projects/plates/data.htm. Subduction zones are shown in red, transform faults in green and ridges in brownish color. The websites were accessed on February 26, 2013.

The fundamental problem lies in the earthquake records, which are generally too short to form a true representation of the prevailing seismic activity in many regions, especially for large earthquakes (). For instance, a return period of 475 years implies available observation data spanning a few thousand years or more (), because regions never known to be seismically active can produce irregular sequences of earthquakes (). Such information is not available with the accuracy that instrumental seismicity catalogs provide today.

¹² GSHAP specifies the world seismic hazard as the peak ground acceleration (PGA) not to be exceeded with 90% probability in 50 years, corresponding to a return period of 475 years ().
¹³ In other words, "surprises" are inconsistencies between the GSHAP expectations and real events, provoking in some cases unexpected human losses and damage within the earthquake zone ().


Rare but devastating earthquakes often occur contrary to our expectations. When this happens, former maps have to be updated in the regions missing such observations. () points out that a bias is commonly introduced into new maps when doing so – an effect referred to as "Texas sharpshooting". Of course, the assessment of maps may take hundreds of years.

Earthquake occurrence is far more complex than described by seismic hazard maps and their underlying models. In fact, seismic hazard is known to change with time (), partly because earthquakes release energy and instantaneously alter the tectonic stress within fault systems, eventually causing further seismic events. Such fault interactions are responsible for an obvious but remarkable phenomenon of seismic activity: the clustering of earthquakes in both space and time. Typical examples of this behavior are foreshock–mainshock–aftershock sequences (Fig. 6) and seismic swarms (). For instance, spatiotemporal clustering is very dominant in the days and weeks after a large earthquake (), as it can trigger many additional events nearby, so-called aftershocks. Utilizing fault interaction would enable us to deduce variations of earthquake occurrence probabilities, either by more realistic numerical simulations of complex fault systems ("earthquake simulators", and references therein) or indirectly by statistical (probabilistic) modeling and forecasting.

() notes that "for most decision-making purposes, probabilistic forecasting provides a more complete description of prospective earthquake information than deterministic prediction". When properly applied, the corresponding models are vital for operational purposes (see subsection 1.3) with forecasting intervals from hours to years or more. This approach is relevant for this thesis, especially when applied to short-term earthquake forecasting. A comparison with meteorology illustrates its importance: long-term forecasting (e. g., seismic hazard maps) corresponds to climate as short-term forecasting (e. g., operational applications) corresponds to weather. The latter has recently attracted growing interest (). The next section explains this topic in more detail.

2.2. Short-term Earthquake Forecasting

Many features recorded in seismic catalogs, like aftershocks, can be explained by a statistical description of the earthquake occurrence process (). It is well known that the seismic rate shows more variability over short time scales than assumed by a time-independent (long-term) process (). This is especially true in the aftermath of a large earthquake and manifests itself in the clustering of earthquakes in time (Fig. 6) and space (e. g., along faults). The physical reason for this aftershock triggering is the stress change imparted instantaneously by the passage of seismic waves or permanently by fault slip of the crust and mantle (). Considering this spatiotemporal clustering and processing the information available at a specific time enables the creation of time-dependent forecasts of future seismicity in terms of probability changes. This can answer questions that no long-term, time-independent approach can address, namely regarding the evolving hazard in advance of and during an earthquake sequence. Hence, short-term forecasting provides operational utility () of "foremost importance for taking sudden risk mitigation actions" (); this is crucial, for example, when rebuilding a city.



Fig. 6: Schematic drawing of earthquake clustering in time. The mainshock triggers an aftershock sequence; prior events are potential foreshocks. Of course, a foreshock can only be identified as such once a bigger event (the mainshock) has occurred. Adapted from .

Moreover, such models in principle also allow foreshock probabilities to be modeled, as every earthquake is capable of triggering an event of larger magnitude ().

Forecast Models

Short-term statistical models, also known as "earthquake triggering models" (), intend to capture the excitation of aftershocks and the clustering of seismic sequences in order to calculate the future distribution of aftershocks within space-time domains. This is achieved by applying specific empirical statistical laws of earthquake occurrence (see subsection 2.3). Short-term models currently supply the highest validated information gain we have ().

Various formulations have been implemented for forecasting earthquakes over short time intervals. One of the most widely utilized concepts of aftershock triggering is the Epidemic Type Aftershock Sequence (ETAS) model, first introduced in (). It represents perhaps the most widespread class of daily and weekly forecast models (). ETAS is well established and based on simple physical assumptions such as a spatially variable seismic background (represented by a stationary Poisson distribution) and isotropic triggering. () describes this approach as a "multi-generation model", because all earthquakes can stochastically trigger other earthquakes according to empirical scaling relations – no distinction is made between a mainshock and an aftershock. Another example is the Short-Term Earthquake Probability (STEP) model (; ), in which – contrary to ETAS – all aftershocks are supposed to be triggered by one single mainshock, hence a "single-generation model"; subsection 2.3 is dedicated to this very model. Purely physics-based models, however, have not yet been shown to be applicable in near real-time () – in contrast to statistical forecasting models.
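To make the distinction concrete, the following minimal Python sketch (with purely illustrative, uncalibrated parameter values) evaluates an ETAS-type conditional intensity in which every past event contributes an Omori-type, magnitude-scaled aftershock rate on top of a constant background; a single-generation (RJ/STEP-type) view would instead keep only the contribution of one designated mainshock.

    import math

    # Illustrative ETAS-style parameters: background rate, productivity,
    # magnitude scaling, and Omori c and p values (not calibrated).
    MU, K, ALPHA, C, P, M_MIN = 0.1, 0.05, 1.0, 0.01, 1.1, 3.0

    def etas_rate(t, past_events):
        """Total event rate at time t (days); past_events = [(t_i, m_i), ...]."""
        triggered = sum(
            K * math.exp(ALPHA * (m_i - M_MIN)) / (t - t_i + C) ** P
            for t_i, m_i in past_events if t_i < t
        )
        return MU + triggered   # background plus the sum over all "parents"

    history = [(0.0, 6.5), (0.5, 4.8), (1.2, 5.3)]   # times (days) and magnitudes
    print(etas_rate(2.0, history))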

Even though research on short-term earthquake forecasting is being conducted to a great extent and some countries already perform appropriate modeling of aftershock triggering (), operational procedures are only now slowly becoming standardized. The task is complicated because potentially destructive earthquakes are difficult to forecast for specific regions within the next decades or less. As a result, the probabilities of such events may vary over orders of magnitude but typically remain low at all times (< 1% per day) (). Hence, their interpretation for prompting protective actions is rather difficult – whereas the potential value of long-term seismic hazard analysis for ensuring seismic safety is far clearer and can guide building codes and insurance companies.


Until now, no short-term forecast model has been able to demonstrate sufficient reliability in estimating the occurrence of a damaging event (). "This is just another way of stating that the extant forecasting models cannot provide high-probability earthquake predictions" (). They rely on empirical relations and model the seismicity in a stochastic manner – rather than by physical simulation of the faulting process – thus introducing temporal and spatial inconsistencies. A combination with the earlier-mentioned physics-based earthquake simulators may advance forecasting methods by properly accounting for the stress interactions in a fault network (). But earthquake clustering, let alone the earthquake rupture process, is not yet understood in detail.

Nevertheless, aftershock forecasting models are already powerful in tracking the space-time evolution of triggered events () and have already been used for automated, near-real-time applications (, see subsection 3.1.1). Furthermore, a degree of usefulness has been shown, for instance in (), with first attempts at operational use during a crisis. Clearly, effectively making use of forecast probabilities for decision-making purposes remains a difficult challenge. Since the models are predominantly aimed at aftershock hazard and not at large, independent events (), a benefit for civil protection has yet to be demonstrated. For this reason, governmental authorities bearing legal responsibility have been careful about the use of forecasts in an operational capacity (). However, the public increasingly expects substantial information about potential seismic threats to be made available and used effectively ().

Forecast Model Testing

"The basic problem in earthquake forecasting is to obtain the best model" (). As shown before,

several models exist. Before accepting and implementing any forecasting routine, an assessment of its (probabilistic) performance is necessary. Whether a method is worthwhile for real-time application with operational benefit is to be judged on the basis of its "operational fitness" (). The quality of the method, but also its consistency and its value to decision makers, determines the operational fitness. () points out that the quality of a forecasting method is defined through reliability and skill. The quality itself is a statistical measure of the agreement between the forecast and the observations collected over many trials. Reliability in particular quantifies how well the forecast probabilities match the observed earthquake occurrence; it is therefore an absolute measure. Skill, on the contrary, is a relative measure, assessing the performance of one method against another, and can therefore be used to verify a candidate model. A commonly used reference model is a time-independent forecast¹⁴ based on the previous earthquake distribution – e. g., a smoothed seismicity model¹⁵ (, p. 79, and Fig. 7, lower right).
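As an illustration of how skill might be quantified in practice – in the spirit of, though not identical to, the RELM/CSEP likelihood tests discussed in the following paragraphs – the short Python sketch below compares a candidate forecast with a time-independent reference via their joint Poisson log-likelihoods on a common grid of cells; all numbers are synthetic.

    import math

    def poisson_loglik(expected, observed):
        """Joint log-likelihood of observed counts given per-cell expected rates."""
        return sum(-lam + n * math.log(lam) - math.lgamma(n + 1)
                   for lam, n in zip(expected, observed))

    candidate = [0.5, 1.2, 0.1, 0.05]   # expected counts per cell (candidate model)
    reference = [0.4, 0.4, 0.4, 0.4]    # time-independent reference forecast
    observed  = [1,   2,   0,   0  ]    # observed target events per cell

    gain = poisson_loglik(candidate, observed) - poisson_loglik(reference, observed)
    print("log-likelihood gain:", gain)   # > 0 indicates skill over the reference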

A model must provide a certain degree of reliability and skill to be useful for operational purposes. As this is not only of public interest, the Southern California Earthquake Center (SCEC) and the U.S. Geological Survey (USGS) set up a working group on Regional Earthquake Likelihood Models (RELM) in 2006 (, and references therein).

¹⁴ The reference forecast treats large earthquakes as independent events occurring randomly in time at a long-term rate. According to the Poisson distribution, earthquakes are then independent of the forecast time ().
¹⁵ A smoothing approach is able to incorporate spatial clustering and can thus perform considerably better than a model based on a uniform distribution. Moreover, this reference forecast can be improved by optimizing the smoothing kernel ().


It was intended to compare the performance of time-independent earthquake forecasting models in California using the principles mentioned above.

Fig. 7: Six of eighteen different long-term forecasting models submitted for prospective testing to CSEP for the Italian testing region. The rate forecast for the next 5 years is color coded, each panel with its own scale. A smoothed seismicity model (TripleS, ), usually used as a reference forecast, is depicted in the lower right. The figure is a slightly modified cutout taken from .

Meanwhile, the five-year RELM project inspired the development of a more advanced infrastructure, the Collaboratory for the Study of Earthquake Predictability (CSEP; cseptesting.org; ). CSEP, which is active on a global scale, has been formed by a large international partnership of geoscientists, statisticians and computational scientists for developing, testing and comparing earthquake occurrence models. Testing centers maintain secure, controlled computational environments for conducting the experiments: they collect observations, generate the forecast for every model and perform the corresponding evaluation tests in order to finally assess the experiment results (). Four such centers (including ETH Zürich) are currently operating for a variety of testing regions, the target geographic areas of the experiments.

CSEP attempts to advance earthquake research in a way that the whole science community can benefit from, not just individual groups of researchers (). Its objective is to contribute to the understanding of the characteristics of the earthquake system by undertaking rigorous empirical testing (). For this purpose, the testing procedures have to fulfill certain criteria, such as exact reproducibility, independent validation and full transparency.

As one can now imagine, testing is the central element of CSEP research. It is performed by making use of the scientific hypotheses underlying the models. From the previous explanations, it becomes clear that the key requirement is establishing a reference forecasting model. This first-order approximation can represent a null hypothesis against which the models have to demonstrate skill. Because experiment evaluation is essential, a range of carefully engineered testing methods has been developed for CSEP (see ) in order to measure the absolute and relative performance with respect to the data and to competing models. Further tests can be added if there is a demand for them. All models are run prospectively¹⁶, but retrospective¹⁷ testing is also useful to verify that a model is properly implemented and installed ().

To date, not only short-term forecasting models but also a variety of long-term models are under test in California, New Zealand, Japan and Italy, and experiments are planned, for example, in China (). When new regions are added, the modular approach of CSEP becomes important for extensibility, as the same models can be implemented in a simple way ().

Regarding time-dependent forecasts, all of them are based on observed seismic activity, because no method based on non-seismic precursors has yet demonstrated sufficient reliability and skill.

¹⁶ In a prospective test, the forecast is generated using only data available at that time (); the outcome is later compared to the observations made. A controlled environment is necessary for truthful results.
¹⁷ For retrospective testing, all available data can be used for comparing the forecast with the observation.


2.3. The STEP Model

The Short-Term Earthquake Probability (STEP) model (; ) is an aftershock forecasting framework capable of producing not only a rate forecast but also a time-dependent seismic hazard. The reason for implementing an additional hazard modeling layer is obvious: an earthquake forecast expressed in terms of ground motion provides better "means for increased awareness of seismic hazard in the general public as well as assisting in improving our scientific understanding of short-term seismic hazard" (). This distinguishes STEP from other forecasting models. STEP was released to the public as a real-time tool in 2005 and ran for several years, regularly making automatic recalculations of the short-term seismic hazard based on revised aftershock statistics (). How the hazard in particular is calculated from the rates is the subject of a separate section (3.1.1). The following section focuses first on the description of rate forecasting, which is, unless otherwise stated, taken from () or the STEP code itself.

The Reasenberg & Jones Model

STEP is built upon the work of (): the Reasenberg & Jones ("RJ") forecasting model¹⁸ represents a first attempt at quantifying the probabilities of impending aftershocks in the near future. It is based on two of the most established basic laws in seismology: (1) the Gutenberg–Richter law, characterizing the earthquake size distribution (Fig. 8a), and (2) the Omori–Utsu law, describing the decay of aftershock activity (Fig. 8b).

¹⁸ The information gained by applying the RJ model to sequences in California, Turkey and Japan was well received by the scientific community.

Fig. 8: Exemplary seismicity analysis for the time period following the 1989 Mw 6.9 Loma Prieta (California) mainshock until 2001. Left: b-value estimate (b = 0.72 ± 0.01) using the Gutenberg–Richter law; depicted are the cumulative and non-cumulative frequency–magnitude distributions (FMD). Right: aftershock decay (number of earthquakes per day following the mainshock) and p-value estimate (p = 0.74 ± 0.02) using the Omori–Utsu law. Both figures are adapted from .

Gutenberg–Richter Law   This law relates the number of earthquakes to the earthquake magnitude – in the case of short-term forecasting it is applied to aftershocks in particular. According to (), a linear relation appears when plotting the logarithm of the earthquake frequency over the earthquake magnitude; the latter is in turn a logarithm of the energy released by earthquakes. Thus, the observed linear behavior in the log-log plot can be described by the power law

    \log_{10} N(\geq M) = a - b\,M ,     (2.1)

with N(\geq M) representing the number of expected earthquakes above a given magnitude M and the a-value as the relative seismicity rate referred to a certain reference magnitude, usually M = 0. The b-value parameterizes the slope of the distribution and is therefore related to the ratio of small to large events¹⁹. Accordingly, there are numerous small (high-probability) and few large (low-probability) events. The linear relationship holds especially in large-scale time and space domains with sufficient sample size.
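A standard way of estimating the b-value from a set of magnitudes is the maximum-likelihood (Aki-type) estimator; the short Python sketch below applies it to a synthetic magnitude list above an assumed completeness magnitude, including the usual correction for magnitude binning.

    import math

    def b_value_mle(magnitudes, m_c, dm=0.1):
        """Maximum-likelihood b-value for magnitudes >= m_c (binning width dm)."""
        m = [x for x in magnitudes if x >= m_c]
        mean_m = sum(m) / len(m)
        return math.log10(math.e) / (mean_m - (m_c - dm / 2.0))

    synthetic_mags = [3.1, 3.0, 3.4, 3.2, 4.0, 3.6, 3.1, 3.8, 3.3, 5.1, 3.0, 3.5]
    print(b_value_mle(synthetic_mags, m_c=3.0))   # roughly 0.8 for these values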

Omori–Utsu Law   The second assumption made in the RJ model is the temporal decay of aftershock activity following the modified Omori law (), also known as the Omori–Utsu law. The aftershock rate \lambda(t) of the sequence depends on the time t since the mainshock according to

    \lambda(t) = \frac{K}{(t + c)^{p}} ,     (2.2)

where K is a function of the number of events (Utsu scaling; exponentially increasing with mainshock magnitude; see New Generic), and c and p are empirical parameters. The c-value accounts for the fact that aftershock detection may be complicated early on due to the dominating seismic signal (coda) of the mainshock; hence, c is typically set to a fraction of a day. The p-value describes the decay rate of the overall aftershock process and is usually not much different from unity (). According to this inverse power law, the aftershock activity decreases rapidly with time.

By combining the scaling relations of Gutenberg–Richter and Omori–Utsu, a stochastic model that gives the rate \lambda(t, M) of aftershocks with magnitude M or larger at time t, following a mainshock of magnitude M_m, can be expressed as

    \lambda(t, M) = \frac{10^{\,a + b\,(M_m - M)}}{(t + c)^{p}} .     (2.3)

Hence, the frequency of daughter events falls off (1) as a power law in time and (2) exponentially with decreasing mainshock magnitude, because bigger earthquakes produce more and larger aftershocks. For practical purposes, we want to generate a rate forecast that best models the behavior of previous aftershocks. Therefore, the statistical parameters a, b and p of the empirical law have to be determined from past seismicity first.

¹⁹ The b-value is usually close to one (), which corresponds to a tenfold increase of event frequency for magnitudes smaller by one unit.

²⁰ Cluster recognition applied to a seismic catalog is connected to seismicity declustering (vanStiphout2012), where independent (background) earthquakes are separated from triggered earthquakes. Clustering and


By applying a cluster recognition algorithm²⁰ to an earthquake catalog to identify aftershock sequences in time and space (), the parameters are estimated using maximum likelihood techniques, i. e., by fitting them to the data.

Using the parameter estimates in Eqn. (2.3), the rate of aftershocks can finally be calculated for every magnitude bin at any future time. Assuming an inhomogeneous (i. e., time-dependent) Poisson process for aftershock occurrence, the probability of triggered events within a desired magnitude (M \geq M_1) and time (t_1 \leq t \leq t_2) range is determined by

    P = 1 - \exp\!\left( - \int_{t_1}^{t_2} \lambda(t, M_1)\,\mathrm{d}t \right) .     (2.4)
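To make Eqns. (2.3) and (2.4) concrete, the following Python sketch evaluates the aftershock rate and the probability of at least one M ≥ 5 aftershock during the first week after a hypothetical Mw 6.6 mainshock; the generic parameter values are purely illustrative, and the time integral is approximated numerically.

    import math

    # Purely illustrative generic parameter values (a, b, p and a fixed c).
    A, B, C, P = -1.67, 0.91, 0.05, 1.08

    def rj_rate(t, m, m_main):
        """Rate of aftershocks with magnitude >= m at time t (days), Eqn. (2.3)."""
        return 10 ** (A + B * (m_main - m)) / (t + C) ** P

    def rj_probability(m, m_main, t1, t2, steps=10000):
        """P(at least one aftershock >= m in [t1, t2]), Eqn. (2.4), midpoint rule."""
        dt = (t2 - t1) / steps
        expected = sum(rj_rate(t1 + (i + 0.5) * dt, m, m_main) * dt
                       for i in range(steps))
        return 1.0 - math.exp(-expected)

    print(rj_rate(1.0, 5.0, 6.6))              # events per day, one day after the mainshock
    print(rj_probability(5.0, 6.6, 0.1, 7.0))  # probability within the first week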

Because the rates of small events are high, such events are highly probable. But triggered events are not limited to being smaller than the mainshock: with Eqn. (2.4), one still obtains a certain probability for the occurrence of aftershocks that are larger than the prevailing mainshock; the latter would then be re-labeled a foreshock. This makes it possible to account for the increased potential of mainshocks following foreshocks. While this probability is usually quite low, it can contribute significantly to the hazard determined later (section 3.1.1), because such events have a relatively large impact on the hazard forecast. However, the effect of the forecast rates of large events is generally negligible – that is, during any 24-hour period in which only minor events happen – unless a similarly large earthquake occurs, in which case the probability of another large event is immediately increased. It should be noted here that the maximum magnitude is inherently limited by the size of the existing faults in the vicinity of a sequence. Such physical restrictions are difficult t


