Parkfield Earthquake Prediction Experiment

Earthquake monitoring continues at Parkfield: Image by Scott Haefner, courtesy USGS.

Accurate earthquake prediction, which could save many thousands of lives and minimize earthquake damage, has so far eluded seismologists. One notable attempt took place in the latter part of the twentieth century in the town of Parkfield, California. Based upon a known series of earthquakes, the experiment attempted to predict a ‘window’ during which the next tremor of significant size would occur.

The Parkfield Earthquake Sequence

Parkfield lies on the San Andreas Fault in California. The fault is part of an active transform fault zone which marks the boundary between the North American tectonic plate, which is moving roughly south-westwards at around 23 mm each year, and the Pacific plate, which is moving approximately north-westwards at around 79 mm each year (This Dynamic Planet map). The relative motion of these plates generates regular earthquakes, many of significant size.

The earthquake history of Parkfield is well documented, but the sequence of relevance to the experiment is best summarized in the article that originally proposed the prediction model. Looking at events in which the mainshock (the largest tremor) was M6.0 or greater, W.H. Bakun and A.G. Lindh described a sequence of seismic events.

Establishing the true nature of the sequence of earthquakes at Parkfield, especially in its early part, is problematic due to the unsophisticated methods used to measure and record historic earthquakes. Bakun and Lindh looked at shocks which occurred in 1857, 1881, 1901, 1922, 1934 and 1966, with magnitudes varying between approximately 5.0 and 6.4. These events were shown to have a mean interval of just under 22 years, and their occurrence was described as ‘remarkably uniform’.
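That mean interval is easy to verify. The short Python sketch below is purely illustrative, using only the mainshock years listed above; it computes the gaps between successive events and their average:

    # Characteristic Parkfield mainshock years from Bakun and Lindh (1985).
    years = [1857, 1881, 1901, 1922, 1934, 1966]

    # Intervals between successive mainshocks.
    intervals = [b - a for a, b in zip(years, years[1:])]
    mean_interval = sum(intervals) / len(intervals)

    print(intervals)      # [24, 20, 21, 12, 32]
    print(mean_interval)  # 21.8 -- the 'just under 22 years' mean interval

Note the scatter: individual intervals range from 12 to 32 years, so the ‘remarkably uniform’ label applies to the average rather than to each individual gap.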

Parkfield and the Seismic Gap Theory

The seismic gap theory is, in essence, based upon the idea that strain generated by earth movement accumulates along a given section of a fault at a steady rate and that, once a certain threshold is reached, the strain will be released and an earthquake will occur. The strain then re-accumulates until the threshold is exceeded again and another tremor takes place (Yufang Rong et al).

On this basis, the risk of an earthquake is lowest immediately after an event and increases with time. In theory (and many assumptions have to be made about the rate of strain accumulation, among other things), such a pattern would be expected to produce sequences of earthquakes along a fault segment which are broadly similar in magnitude and other characteristics and which occur with relatively even frequency, as at Parkfield.
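A toy simulation makes the mechanism concrete. In the Python sketch below, the loading rate and failure threshold are arbitrary values chosen purely for illustration (not measured Parkfield quantities): strain accumulates steadily and is fully released whenever it reaches the threshold.

    # Toy seismic-gap model: strain loads at a constant rate and is fully
    # released whenever it reaches a failure threshold. Both numbers are
    # arbitrary illustrations, not measured Parkfield values.
    LOADING_RATE = 1.0   # strain units accumulated per year (assumed)
    THRESHOLD = 22.0     # strain at which the fault segment fails (assumed)

    def simulate(n_years):
        """Return the model years in which earthquakes occur."""
        strain, quakes = 0.0, []
        for year in range(n_years):
            strain += LOADING_RATE
            if strain >= THRESHOLD:
                quakes.append(year)
                strain = 0.0  # complete release; real faults are messier
        return quakes

    print(simulate(110))  # [21, 43, 65, 87, 109]: perfectly periodic

With constant loading and complete release, the model produces perfectly periodic events; the spread of the real Parkfield intervals (12 to 32 years) shows how far nature departs from this idealization.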

The Parkfield Earthquake Prediction Experiment

The Parkfield earthquake prediction experiment was the outcome of much work on the seismological data. It was Bakun and Lindh’s 1985 paper which, on the basis that the last so-called characteristic Parkfield earthquake had occurred in 1966, predicted that the next one would occur some time before 1993. The US government became involved, and extensive research and monitoring were undertaken (USGS).

But despite the close attention of a small army of seismologists, the expected earthquake of around M6.0 did not occur within the predicted timescale. It was not until 2004, over ten years after the latest forecast date and almost 40 years after the previous event, that Parkfield was again shaken by a significant earthquake of M6.0.

There was much debate among scientists as to whether this did in fact constitute the expected characteristic earthquake or whether its occurrence was mere coincidence. In some respects the earthquake behaved as expected, but in others it did not. Overall, it has to be concluded that the Parkfield experiment was a failure; as one of its initiators (among others) concluded, it demonstrated that ‘reliable short-term earthquake prediction still is not achievable’ (Bakun et al, 2005).

Although the Parkfield experiment failed in its intention of accurately predicting the date and place of a significant earthquake, it was nevertheless not without benefits. The 2004 earthquake was probably the most intensively monitored earthquake up to that date and, in the words of the USGS, ‘our understanding of the earthquake process has already been advanced through research results from Parkfield’. The Parkfield section of the San Andreas Fault continues to be a location for earthquake monitoring and study.

Sources
W.H. Bakun and A.G. Lindh. “The Parkfield, California, Earthquake Prediction Experiment.” Science (1985).

W.H. Bakun et al. “Implications for prediction and hazard assessment from the 2004 Parkfield earthquake.” Nature (2005).

Y. Rong, D.D. Jackson and Y.Y. Kagan. “Seismic Gaps and Earthquakes.” Journal of Geophysical Research (2003).

US Department of the Interior and US Geological Survey. This Dynamic Planet map (1994).

US Geological Survey. “California-Nevada fault map for Parkfield” on the USGS website, accessed 23 May 2011.

US Geological Survey. “The Parkfield, California, Earthquake Prediction Experiment” on the USGS website, accessed 23 May 2011.

