
Envisioning safer cities with AI

Researchers use crowdsourced data, neural networks, and supercomputers to simulate risks to cities and regions

Date:
May 19, 2021
Source:
University of Texas at Austin, Texas Advanced Computing Center
Summary:
Researchers developed a suite of AI tools that can automatically identify characteristics of every building in a city and compute the risks they would face during a natural hazard event. The team tested the tools with simulated earthquakes in San Francisco and hurricanes in Lake Charles, Louisiana; on the Texas coast; and in Atlantic City, New Jersey. The simulations generated realistic spatial distributions of buildings and identified some building characteristics with close to 100% accuracy.

Artificial intelligence is providing new opportunities in a range of fields, from business to industrial design to entertainment. But what about civil engineering and city planning? How might machine learning and deep learning help us create safer, more sustainable, and more resilient built environments?

A team of researchers from the NSF NHERI SimCenter, a computational modeling and simulation center for the natural hazards engineering community based at the University of California, Berkeley, has developed a suite of tools called BRAILS -- Building Recognition using AI at Large-Scale -- that can automatically identify characteristics of buildings in a city and even detect the risks that a city's structures would face in an earthquake, hurricane, or tsunami.

Charles (Chaofeng) Wang, a postdoctoral researcher at the University of California, Berkeley, and the lead developer of BRAILS, says the project grew out of a need to quickly and reliably characterize the structures in a city.

"We want to simulate the impact of hazards on all of the buildings in a region, but we don't have a description of the building attributes," Wang said. "For example, in the San Francisco Bay area, there are millions of buildings. Using AI, we are able to get the needed information. We can train neural network models to infer building information from images and other sources of data."

BRAILS uses machine learning, deep learning, and computer vision to extract information about the built environment. It is envisioned as a tool for architects, engineers, and planning professionals to more efficiently plan, design, and manage buildings and infrastructure systems.
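
As a rough illustration of what that kind of inference looks like in practice (and not the BRAILS code itself), the Python sketch below classifies a single building attribute, roof type, from one image with a standard convolutional network. The backbone is untrained here and the image path is a placeholder.

    # Illustrative only: a generic image classifier for one building attribute
    # (roof type), not the BRAILS implementation. The image path is hypothetical.
    import torch
    import torch.nn as nn
    from torchvision import models, transforms
    from PIL import Image

    ROOF_CLASSES = ["flat", "gabled", "hipped"]  # categories named in the article

    # Standard CNN backbone with a small classification head.
    # Untrained weights here; a real system would load trained weights.
    model = models.resnet18()
    model.fc = nn.Linear(model.fc.in_features, len(ROOF_CLASSES))
    model.eval()

    preprocess = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    def predict_roof_type(image_path: str) -> str:
        """Return the most likely roof type for a single aerial/satellite image."""
        image = Image.open(image_path).convert("RGB")
        batch = preprocess(image).unsqueeze(0)   # add batch dimension
        with torch.no_grad():
            logits = model(batch)
        return ROOF_CLASSES[int(logits.argmax(dim=1))]

    # Example (hypothetical path):
    # print(predict_roof_type("images/building_0001.png"))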

The SimCenter recently released BRAILS version 2.0, which includes modules to predict a larger spectrum of building characteristics. These include occupancy class (commercial, single-family, or multi-family), roof type (flat, gabled, or hipped), foundation elevation, year built, number of floors, and whether a building has a "soft-story" -- a civil engineering term for a structure whose ground floor has large openings (like storefronts), which can make it more prone to collapse during an earthquake.
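
As a concrete picture of what such a characterization might look like, the sketch below defines a per-building record holding the attributes listed above. The field names and types are illustrative assumptions, not the actual BRAILS schema.

    # A minimal sketch of a per-building record such modules might produce.
    # Field names and types are illustrative, not the BRAILS schema.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class BuildingRecord:
        building_id: str
        occupancy: Optional[str] = None        # "commercial", "single-family", "multi-family"
        roof_type: Optional[str] = None        # "flat", "gabled", "hipped"
        foundation_elevation_m: Optional[float] = None
        year_built: Optional[int] = None
        num_floors: Optional[int] = None
        soft_story: Optional[bool] = None      # ground floor with large openings

    # Example:
    record = BuildingRecord("SF-000123", occupancy="commercial",
                            roof_type="flat", num_floors=3, soft_story=True)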

The basic BRAILS framework developed by Wang and his collaborators automatically extracts building information from satellite and ground-level images drawn from Google Maps and merges it with data from several sources, such as Microsoft Footprint Data and OpenStreetMap -- a collaborative project to create a free, editable map of the world. The framework also provides the option to fuse this data with tax records, city surveys, and other information to complement the computer vision component.
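
The fusion step can be pictured with a toy example like the one below, which reconciles partial attribute sets using a simple source-priority rule. That rule is an assumption made for illustration; the article does not specify how BRAILS resolves conflicting sources.

    # Toy illustration of fusing per-building attributes from several sources.
    # The source names mirror the article; the priority rule is an assumption.
    def fuse_attributes(building_id, sources):
        """sources: dict mapping source name -> dict of (possibly partial) attributes."""
        priority = ["tax_records", "city_survey", "openstreetmap", "vision_model"]
        fused = {"building_id": building_id}
        for field in ("occupancy", "roof_type", "year_built", "num_floors"):
            for src in priority:
                value = sources.get(src, {}).get(field)
                if value is not None:
                    fused[field] = value
                    break
        return fused

    fused = fuse_attributes("SF-000123", {
        "tax_records":  {"year_built": 1928, "occupancy": "commercial"},
        "vision_model": {"roof_type": "flat", "num_floors": 3},
    })
    print(fused)  # {'building_id': 'SF-000123', 'occupancy': 'commercial', ...}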

"Given the importance of regional simulations and the need for large inventory data to execute these, machine learning is really the only option for making progress," noted SimCenter Principal Investigator and co-Director Sanjay Govindjee. "It is exciting to see civil engineers learning these new technologies and applying them to real world problems."

LEVERAGING CROWDSOURCING POWER

Recently, the SimCenter launched a project on the citizen science web portal Zooniverse to collect additional labeled data. The project, called "Building Detective for Disaster Preparedness," enables the public to identify specific architectural features of structures, like roofs, windows, and chimneys. These labels will be used to train additional feature extraction modules.
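
The article does not describe how overlapping volunteer answers are reconciled; one common approach, sketched below purely as an assumption, is a simple majority vote, with the level of agreement kept as a rough confidence measure.

    # Hypothetical sketch: turning repeated volunteer annotations for one image
    # into a single training label by majority vote. This is one common
    # approach, not a description of the SimCenter's actual pipeline.
    from collections import Counter

    def majority_label(annotations):
        """annotations: list of labels given by different volunteers for one image."""
        counts = Counter(annotations)
        label, votes = counts.most_common(1)[0]
        agreement = votes / len(annotations)   # simple confidence proxy
        return label, agreement

    print(majority_label(["gabled", "gabled", "hipped", "gabled"]))
    # ('gabled', 0.75)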

"We launched the Zooniverse project in March and within a couple of weeks we had a thousand volunteers, and 20,000 images annotated," Wang said.

Since no data source is complete or fully accurate, BRAILS performs data enhancements using logical and statistical methods to fill in gaps. It also computes the uncertainty for its estimates.
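
A minimal sketch of what such gap filling with an uncertainty estimate can look like follows; the specific method here, sampling a missing year built from the empirical distribution of nearby buildings, is an assumption for illustration, not BRAILS's documented approach.

    # Illustrative statistical imputation with an uncertainty estimate:
    # fill a missing year-built value by sampling from the observed values
    # of neighboring buildings.
    import random
    import statistics

    def impute_year_built(neighbor_years, n_samples=1000, seed=0):
        """Return (estimate, std) drawn from the neighborhood's empirical distribution."""
        rng = random.Random(seed)
        samples = [rng.choice(neighbor_years) for _ in range(n_samples)]
        return round(statistics.mean(samples)), statistics.stdev(samples)

    estimate, uncertainty = impute_year_built([1925, 1931, 1948, 1927, 1952])
    print(estimate, round(uncertainty, 1))   # e.g. roughly 1937 with a spread of ~11 years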

After developing and testing the accuracy of these modules individually, the team combined them to create the CityBuilder tool inside BRAILS. Given a city or region as input, CityBuilder can automatically generate a characterization of every structure in that geographic area.
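
Conceptually, that pipeline can be sketched as follows, with placeholder predictor functions standing in for the trained neural-network modules; the function and dictionary names are hypothetical, not the CityBuilder API.

    # Schematic pipeline in the spirit of CityBuilder: run each attribute
    # module over a region's building footprints and assemble an inventory.
    # Names are placeholders, not the actual API.
    def build_inventory(footprints, modules):
        """footprints: list of building ids; modules: dict of attribute -> predictor."""
        inventory = []
        for bid in footprints:
            record = {"building_id": bid}
            for attribute, predict in modules.items():
                record[attribute] = predict(bid)
            inventory.append(record)
        return inventory

    # Stub predictors stand in for the trained neural-network modules.
    modules = {
        "occupancy": lambda bid: "single-family",
        "roof_type": lambda bid: "gabled",
    }
    print(build_inventory(["NJ-0001", "NJ-0002"], modules))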

Wang and his collaborators performed a series of validation demonstrations, or as they call them, testbeds, to determine the accuracy of the AI-derived models. Each testbed generates an inventory of structures and simulates the impact of a hazard based on historical or plausible events.
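
In these testbeds, accuracy is the usual fraction of predictions that match ground-truth labels, as in this small illustrative check (the data here is made up).

    # Per-module accuracy against ground-truth labels; the labels are invented
    # for illustration.
    def accuracy(predicted, actual):
        correct = sum(p == a for p, a in zip(predicted, actual))
        return correct / len(actual)

    predicted_roofs = ["flat", "gabled", "hipped", "gabled", "flat"]
    actual_roofs    = ["flat", "gabled", "gabled", "gabled", "flat"]
    print(f"roof-type accuracy: {accuracy(predicted_roofs, actual_roofs):.0%}")  # 80%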

The team has created testbeds for earthquakes in San Francisco and for hurricanes in Lake Charles, Louisiana; on the Texas coast; and in Atlantic City, New Jersey.

"Our objectives are two-fold," Wang said. "First, to mitigate the damage in the future by doing simulations and providing results to decision- and policy-makers. And second, to use this data to quickly simulate a real scenario -- instantly following a new event, before the reconnaissance team is deployed. We hope near-real-time simulation results can help guide emergency response with greater accuracy."

The team outlined their framework in the February 2021 issue of Automation in Construction. They showed that their neural network could generate realistic spatial distributions of buildings in a region and, using five coastal cities in New Jersey as a case study, described how the framework could be used for large-scale natural hazard risk management.

The team presented a testbed for Hurricane Laura (2020), the strongest hurricane to make landfall in Louisiana, at the 2021 Workshop on SHared Operational REsearch Logistics In the Nearshore Environment (SHORELINE21).

"For some models, like occupancy, we are seeing the accuracy is close to 100%," Wang said when asked about the performance of BRAILS. "For other modules, like roof type, we're seeing 90% accuracy."

COMPUTATIONAL RESOURCES

To train the BRAILS modules and run the simulations, the researchers used supercomputers at the Texas Advanced Computing Center (TACC) -- notably Frontera, the fastest academic supercomputer in the world, and Maverick 2, a GPU-based system designed for deep learning.

"For one model, the training could be finished in a few hours, but this depends on the number of images, the number of GPUs, the learning rate, etc.," Wang explained.

TACC, like the SimCenter, is a funded partner in the NSF NHERI program. TACC designed and maintains the DesignSafe-CI (Cyberinfrastructure) -- a platform for computation, data analysis, and tools used by natural hazard researchers.

"This project is a great example of how advanced computing through DesignSafe can enable new avenues of natural hazards research and new tools, with many components of NHERI working together," said Ellen Rathje, professor of civil engineering at The University of Texas at Austin and principal investigator of the DesignSafe project.

BRAILS/CityBuilder is designed to work seamlessly with the SimCenter Regional Resilience Determination (R2D) tool. R2D is a graphical user interface for the SimCenter application framework for quantifying the regional impact of natural hazards. Its outputs include the damage state and the loss ratio -- a building's repair cost expressed as a percentage of its replacement value -- for each building across an entire city or region, along with the degree of confidence in each prediction.
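
The loss ratio itself is straightforward once a repair cost and replacement value have been estimated, as in this toy example; the numbers and field names are invented, and R2D's real inputs and outputs are far richer.

    # The loss ratio described above, computed for a toy two-building inventory.
    def loss_ratio(repair_cost, replacement_value):
        """Repair cost as a percentage of replacement value."""
        return 100.0 * repair_cost / replacement_value

    buildings = [
        {"id": "LC-001", "repair_cost": 45_000,  "replacement_value": 300_000},
        {"id": "LC-002", "repair_cost": 210_000, "replacement_value": 280_000},
    ]
    for b in buildings:
        print(b["id"], f"{loss_ratio(b['repair_cost'], b['replacement_value']):.1f}%")
    # LC-001 15.0%
    # LC-002 75.0%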

"The hazard event simulations -- applying wind fields or ground shaking to thousands or millions of buildings to assess the impact of a hurricane or earthquake -- requires a lot of computing resources and time," Wang said. "For one city-wide simulation, depending on the size, it typically takes hours to run on TACC."

TACC is an ideal environment for this research, Wang says. It provides most of the computation his team needs. "Working on NSF projects related to DesignSafe, I can compute almost without limitations. It's awesome."
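
Part of what makes such regional runs feasible is that the damage calculation for each building is independent of every other building, so the work can be spread across many processors, or, at TACC's scale, many nodes. A rough sketch of that idea follows, with a stand-in damage function rather than a real structural model.

    # Rough sketch of parallelizing per-building hazard calculations.
    # The damage function is a toy stand-in, not an engineering model.
    from multiprocessing import Pool

    def simulate_damage(building):
        """Placeholder for a per-building response simulation under a hazard field."""
        bid, wind_speed = building
        return bid, min(1.0, (wind_speed / 70.0) ** 2)   # toy damage index in [0, 1]

    if __name__ == "__main__":
        buildings = [(f"B{i:06d}", 40 + (i % 30)) for i in range(100_000)]
        with Pool() as pool:
            results = pool.map(simulate_damage, buildings)
        print(results[:3])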

IMPACTS

To make our communities more resilient to natural hazards, we need to know what level of damage to expect from future events, so that residents and policymakers can decide whether to strengthen buildings or move people to other places.

"That's what the simulation and modeling can provide," Wang said. "All to create a more resilient built environment."


Story Source:

Materials provided by University of Texas at Austin, Texas Advanced Computing Center. Original written by Aaron Dubrow. Note: Content may be edited for style and length.


Journal Reference:

  1. Chaofeng Wang, Qian Yu, Kincho H. Law, Frank McKenna, Stella X. Yu, Ertugrul Taciroglu, Adam Zsarnóczay, Wael Elhaddad, Barbaros Cetiner. Machine learning-based regional scale intelligent modeling of building information for natural hazard risk management. Automation in Construction, 2021; 122: 103474 DOI: 10.1016/j.autcon.2020.103474

