
Simulating 800,000 years of California earthquake history to pinpoint risks

Supercomputer-powered framework developed by SCEC provides new view of seismic hazard

Date:
January 25, 2021
Source:
University of Texas at Austin, Texas Advanced Computing Center
Summary:
A new study presents a prototype Rate-State earthquake simulator that simulates hundreds of thousands of years of seismic history in California. Coupled with another code, the framework can calculate the amount of shaking that would occur for each quake. The new approach improves the ability to pinpoint how big an earthquake might occur in a given location, allowing building code developers and structural engineers to design more resilient buildings that can survive earthquakes.

Massive earthquakes are, fortunately, rare events. But that scarcity of information blinds us in some ways to their risks, especially when it comes to determining the risk for a specific location or structure.

"We haven't observed most of the possible events that could cause large damage," explained Kevin Milner, a computer scientist and seismology researcher at the Southern California Earthquake Center (SCEC) at the University of Southern California. "Using Southern California as an example, we haven't had a truly big earthquake since 1857 -- that was the last time the southern San Andreas broke into a massive magnitude 7.9 earthquake. A San Andreas earthquake could impact a much larger area than the 1994 Northridge earthquake, and other large earthquakes can occur too. That's what we're worried about."

Traditional ways of getting around this lack of data involve digging trenches to learn more about past ruptures, collating information from many earthquakes around the world to build a statistical model of hazard, or using supercomputers to simulate a specific earthquake in a specific place with a high degree of fidelity.

However, a new framework for predicting the likelihood and impact of earthquakes over an entire region, developed by a team of researchers associated with SCEC over the past decade, has found a middle ground and perhaps a better way to ascertain risk.

A new study led by Milner and Bruce Shaw of Columbia University, published in the Bulletin of the Seismological Society of America in January 2021, presents results from a prototype Rate-State earthquake simulator, or RSQSim, that simulates hundreds of thousands of years of seismic history in California. Coupled with another code, CyberShake, the framework can calculate the amount of shaking that would occur for each quake. Their results compare well with historical earthquakes and the results of other methods, and display a realistic distribution of earthquake probabilities.

According to the developers, the new approach improves the ability to pinpoint how big an earthquake might occur in a given location, allowing building code developers, architects, and structural engineers to design more resilient buildings that can survive earthquakes at a specific site.

"For the first time, we have a whole pipeline from start to finish where earthquake occurrence and ground-motion simulation are physics-based," Milner said. "It can simulate up to 100,000s of years on a really complicated fault system."

Applying massive computer power to big problems

RSQSim transforms mathematical representations of the geophysical forces at play in earthquakes -- the standard model of how ruptures nucleate and propagate -- into algorithms, and then solves them on some of the most powerful supercomputers on the planet. The computationally intensive research was enabled over several years by government-sponsored supercomputers at the Texas Advanced Computing Center, including Frontera -- the most powerful system at any university in the world -- Blue Waters at the National Center for Supercomputing Applications, and Summit at the Oak Ridge Leadership Computing Facility.
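
The "rate-state" in the simulator's name refers to rate-and-state friction, the laboratory-derived law describing how a fault's strength depends on how fast it is slipping and how long its surfaces have been in contact. The short Python sketch below is a minimal, generic illustration of the Dieterich-Ruina friction coefficient paired with the "aging" state-evolution law; the parameter values are invented for illustration, and this is not RSQSim's actual implementation.

import math

# Generic rate-and-state friction (Dieterich-Ruina form), illustrative values only
MU0 = 0.6                 # reference friction coefficient
A, B = 0.010, 0.015       # rate and state sensitivities (B > A: velocity-weakening)
V0 = 1e-6                 # reference slip rate, m/s
DC = 1e-4                 # critical slip distance, m

def friction(v, theta):
    """Friction coefficient at slip rate v (m/s) and state variable theta (s)."""
    return MU0 + A * math.log(v / V0) + B * math.log(V0 * theta / DC)

def theta_rate(v, theta):
    """'Aging' law for the state variable: d(theta)/dt = 1 - v * theta / Dc."""
    return 1.0 - v * theta / DC

# Impose a jump in slip rate and integrate the state variable forward in time
# with a crude forward-Euler scheme (illustration only)
dt, v = 1e-3, 1e-5            # time step (s) and new, faster slip rate (m/s)
theta = DC / V0               # start at steady state for the reference rate
for _ in range(100_000):      # about 100 seconds of simulated time
    theta += dt * theta_rate(v, theta)

print(f"friction well after the velocity step: {friction(v, theta):.4f}")

Because B exceeds A in this sketch, steady-state friction drops as slip accelerates (from 0.6 at the reference rate to about 0.59 after the velocity step), the velocity-weakening behavior that allows ruptures to nucleate and grow.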

"One way we might be able to do better in predicting risk is through physics-based modeling, by harnessing the power of systems like Frontera to run simulations," said Milner. "Instead of an empirical statistical distribution, we simulate the occurrence of earthquakes and the propagation of its waves."

"We've made a lot of progress on Frontera in determining what kind of earthquakes we can expect, on which fault, and how often," said Christine Goulet, Executive Director for Applied Science at SCEC, also involved in the work. "We don't prescribe or tell the code when the earthquakes are going to happen. We launch a simulation of hundreds of thousands of years, and just let the code transfer the stress from one fault to another."

The simulations began with the geological topography of California and simulated, over 800,000 virtual years, how stresses form and dissipate as tectonic forces act on the Earth. From these simulations, the framework generated a catalog -- a record that an earthquake occurred at a certain place with a certain magnitude and attributes at a given time. The catalog that the SCEC team produced on Frontera and Blue Waters was among the largest ever made, Goulet said. The outputs of RSQSim were then fed into CyberShake, which again used computer models of geophysics to predict how much shaking (in terms of ground acceleration, velocity, and duration) would occur as a result of each quake.
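
Conceptually, the hand-off works like a pipeline: the simulator emits a long catalog of synthetic events, and a ground-motion stage computes the shaking each event would produce at each site of interest. The Python sketch below is a hypothetical illustration of that flow; the event fields, the toy distance-based shaking function, and the exceedance threshold are all invented for illustration and do not reflect the actual RSQSim or CyberShake data formats or physics.

import math
from dataclasses import dataclass

@dataclass
class CatalogEvent:
    """One synthetic earthquake from the simulated catalog (all fields hypothetical)."""
    year: float       # virtual time of occurrence within the simulation
    magnitude: float
    lat: float
    lon: float
    fault: str

def shaking_at_site(event, site_lat, site_lon):
    """Toy stand-in for the ground-motion stage. A code like CyberShake instead runs
    full 3D wave-propagation physics; here intensity simply falls off with distance
    so the pipeline is runnable end to end."""
    dist_km = 111.0 * math.hypot(event.lat - site_lat, event.lon - site_lon)
    return event.magnitude / max(dist_km, 1.0)    # arbitrary, dimensionless "intensity"

# A two-event slice standing in for a catalog of hundreds of thousands of events
catalog = [
    CatalogEvent(12_345.6, 7.8, 34.6, -117.8, "San Andreas (Mojave)"),
    CatalogEvent(12_401.2, 6.7, 34.2, -118.5, "Northridge-area thrust"),
]

site = (34.05, -118.25)    # downtown Los Angeles, for illustration
intensities = [shaking_at_site(ev, *site) for ev in catalog]

# A hazard estimate then asks: over the simulated span, how often is a shaking level exceeded?
threshold = 0.05
exceedances = sum(1 for x in intensities if x >= threshold)
print(f"{exceedances} of {len(catalog)} events exceed the illustrative threshold at this site")

In the real framework, each event carries a full rupture description rather than a single point, and the shaking at each site is computed with physics-based wave-propagation simulations rather than a simple distance falloff.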

"The framework outputs a full slip-time history: where a rupture occurs and how it grew," Milner explained. "We found it produces realistic ground motions, which tells us that the physics implemented in the model is working as intended." They have more work planned for validation of the results, which is critical before acceptance for design applications.

The researchers found that the RSQSim framework produces rich, variable earthquakes overall -- a sign it is producing reasonable results -- while also generating repeatable source and path effects.

"For lots of sites, the shaking hazard goes down, relative to state-of-practice estimates" Milner said. "But for a couple of sites that have special configurations of nearby faults or local geological features, like near San Bernardino, the hazard went up. We are working to better understand these results and to define approaches to verify them."

The work is helping to determine the probability of an earthquake occurring along any of California's hundreds of earthquake-producing faults, the scale of earthquake that could be expected, and how it may trigger other quakes.

Support for the project comes from the U.S. Geological Survey (USGS), National Science Foundation (NSF), and the W.M. Keck Foundation. Frontera is NSF's leadership-class national resource. Compute time on Frontera was provided through a Large-Scale Community Partnership (LSCP) award to SCEC that allows hundreds of U.S. scholars access to the machine to study many aspects of earthquake science. LSCP awards provide extended allocations of up to three years to support long-lived research efforts. SCEC -- which was founded in 1991 and has computed on TACC systems for over a decade -- is a premier example of such an effort.

The creation of the catalog required eight days of continuous computing on Frontera and used more than 3,500 processors in parallel. Simulating the ground shaking at 10 sites across California required a comparable amount of computing on Summit, the second fastest supercomputer in the world.
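
As a rough sense of scale, assuming "processors" here means individual cores kept busy for the full run, the catalog computation alone comes to roughly 670,000 core-hours:

# Back-of-the-envelope scale of the catalog run on Frontera
# (assumption: 3,500 cores running continuously for the full eight days)
cores = 3_500
hours = 8 * 24                            # eight days of continuous computing
print(f"~{cores * hours:,} core-hours")   # ~672,000 core-hours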

"Adoption by the broader community will be understandably slow," said Milner. "Because such results will impact safety, it is part of our due diligence to make sure these results are technically defensible by the broader community," added Goulet. But research results such as these are important in order to move beyond generalized building codes that in some cases may be inadequately representing the risk a region face while in other cases being too conservative.

"The hope is that these types of models will help us better characterize seismic hazard so we're spending our resources to build strong, safe, resilient buildings where they are needed the most," Milner said.

Video: https://www.youtube.com/watch?v=AdGctQsjKpU&feature=emb_logo


Story Source:

Materials provided by University of Texas at Austin, Texas Advanced Computing Center. Original written by Aaron Dubrow. Note: Content may be edited for style and length.


Journal Reference:

  1. Kevin R. Milner, Bruce E. Shaw, Christine A. Goulet, Keith B. Richards-Dinger, Scott Callaghan, Thomas H. Jordan, James H. Dieterich, Edward H. Field. Toward Physics-Based Nonergodic PSHA: A Prototype Fully Deterministic Seismic Hazard Model for Southern California. Bulletin of the Seismological Society of America, 2021; DOI: 10.1785/0120200216

