
Police stop fewer black drivers at night when a 'veil of darkness' obscures their race

Study also finds that when drivers were pulled over, officers searched the cars of blacks and Hispanics more often than those of whites

Date:
May 7, 2020
Source:
Stanford School of Engineering
Summary:
After analyzing 95 million traffic stop records, filed by officers with 21 state patrol agencies and 35 municipal police forces from 2011 to 2018, researchers concluded that 'police stops and search decisions suffer from persistent racial bias.'

The largest-ever study of alleged racial profiling during traffic stops has found that blacks, who are pulled over more frequently than whites by day, are much less likely to be stopped after sunset, when "a veil of darkness" masks their race.

That is one of several examples of systematic bias to emerge from a five-year study that analyzed 95 million traffic stop records filed by officers with 21 state patrol agencies and 35 municipal police forces from 2011 to 2018.

The Stanford-led study also found that when drivers were pulled over, officers searched the cars of blacks and Hispanics more often than those of whites. The researchers also examined a subset of data from Washington and Colorado, two states that legalized marijuana, and found that while legalization led to fewer searches overall, and thus fewer searches of blacks and Hispanics, minorities were still more likely than whites to have their cars searched after being pulled over.

"Our results indicate that police stops and search decisions suffer from persistent racial bias, and point to the value of policy interventions to mitigate these disparities," the researchers write in the May 4th issue of Nature Human Behaviour.

The paper is the culmination of a five-year collaboration between Stanford's Cheryl Phillips, a journalism lecturer whose graduate students obtained the raw data through public records requests, and Sharad Goel, a professor of management science and engineering whose computer science team organized and analyzed the data.

Goel and his collaborators, who included Ravi Shroff, a professor of applied statistics at New York University, spent years combing through the data, eliminating records that were incomplete or from the wrong time periods, to create the 95 million-record database that was the basis for their analysis. "There is no way to overstate the difficulty of that task," Goel said.

Creating that database enabled the team to find statistical evidence that a "veil of darkness" partially immunized black drivers against traffic stops. The term and the idea have been around since 2006, when they were used in a study that compared the race of 8,000 drivers in Oakland, California, who were stopped at any time of day or night over a six-month period. But the findings from that study were inconclusive because the sample was too small to establish a link between the darkness of the sky and the race of the stopped drivers.

The Stanford team decided to repeat the analysis using the much larger dataset they had gathered. First, they narrowed the range of variables to analyze by choosing a specific time of day -- around 7 p.m. -- when the probable causes for a stop were more or less constant. Next, they took advantage of the fact that, in the months before and after daylight saving time begins and ends each year, the sky gets a little darker or lighter, day by day, at that same clock time. Because the database was so large, the researchers were able to find 113,000 traffic stops, across all of the locations in their data, that occurred on those days, before or after the clocks sprang forward or fell back, when the sky was growing darker or lighter at around 7 p.m. local time.
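
To make that filtering step concrete, here is a minimal sketch in Python with pandas of how stop records might be narrowed to a fixed evening window around the daylight saving transitions. It is not the study's code; the file name, column name and exact window are illustrative assumptions.

    # A minimal sketch, not the study's actual code. The file name and the
    # column name "stop_datetime" are illustrative assumptions.
    import pandas as pd

    stops = pd.read_csv("stops.csv", parse_dates=["stop_datetime"])  # hypothetical file

    # Keep stops in a narrow window around 7 p.m. local time, where the sky's
    # darkness shifts day by day in the weeks surrounding a clock change.
    hour = stops["stop_datetime"].dt.hour
    minute = stops["stop_datetime"].dt.minute
    evening = stops[(hour == 19) & (minute < 15)]

    # Keep only stops within about 30 days of a daylight saving transition
    # (two example transition dates shown; the study spans 2011-2018).
    dst = pd.to_datetime(["2017-03-12", "2017-11-05"])
    days_from_dst = evening["stop_datetime"].apply(
        lambda t: abs(t.normalize() - dst).min()
    )
    window = evening[days_from_dst <= pd.Timedelta(days=30)]
    print(len(window), "stops in the analysis window")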

This dataset provided a statistically valid sample with two important variables -- the race of the driver being stopped, and the darkness of the sky at around 7 p.m. The analysis left no doubt that the darker it got, the less likely it became that a black driver would be stopped. The reverse was true when the sky was lighter.
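
The statistical test behind that conclusion can be illustrated with a simple sketch: a logistic regression, run here on synthetic data, of whether a stopped driver is black on an after-dusk indicator, controlling for clock time. The variable names and the data are assumptions for illustration, not the paper's actual model or results.

    # A minimal sketch of the veil-of-darkness logic, not the paper's exact model.
    # Among stopped drivers in the evening window, it asks whether the share of
    # black drivers falls after dark, holding clock time fixed. Variable names
    # and the synthetic data are illustrative assumptions.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 10_000
    minutes_past_7pm = rng.integers(0, 15, n)   # clock-time control
    is_dark = rng.integers(0, 2, n)             # 1 if the stop occurred after dusk
    # Synthetic stops in which black drivers are a smaller share of dark-sky stops.
    p_black = 0.30 - 0.05 * is_dark
    stops = pd.DataFrame({
        "is_black": rng.binomial(1, p_black),
        "is_dark": is_dark,
        "minutes_past_7pm": minutes_past_7pm,
    })

    model = smf.logit("is_black ~ is_dark + minutes_past_7pm", data=stops).fit()
    print(model.summary())
    # A significantly negative coefficient on is_dark is the veil-of-darkness
    # signature: stopped drivers are less likely to be black when race is harder
    # for officers to observe before deciding to make the stop.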

More than any single finding, the collaboration's most lasting impact may be the Stanford Open Policing Project, which the researchers launched to make their data available to investigative and data-savvy reporters and to hold workshops that teach reporters how to use the data for local stories.

For example, the researchers helped reporters at Investigate West, a Seattle-based non-profit news organization, understand patterns in the data for stories showing bias in police searches of Native Americans. That reporting prompted the Washington State Patrol to review its practices and boost officer training. Similarly, the researchers helped reporters at the Los Angeles Times analyze data showing that police searched minority drivers far more often than whites. The resulting story was part of a larger investigative series that prompted changes in Los Angeles Police Department practices.

"All told we've trained about 200 journalists, which is one of the unique things about this project," Phillips said.

Goel and Phillips plan to continue collaborating through a project called Big Local News that will explore how data science can shed light on public issues, such as civil asset forfeitures -- instances in which law enforcement is authorized to seize and sell property associated with a crime. Gathering and analyzing records of when and where such seizures occur, to whom, and how the seized property is disposed of will help clarify how the practice is being used. Big Local News is also working on collaborative efforts to standardize information from police disciplinary cases.

"These projects demonstrate the power of combining data science with journalism to tell important stories," Goel said.

Other authors include current or former Stanford graduate students or research assistants Emma Pierson, Camelia Simoiu, Jan Overgoor, Sam Corbett-Davies, Daniel Jenson, Amy Shoemaker, Vignesh Ramachandran, and Phoebe Barghouty.

This work was supported in part by the John S. and James L. Knight Foundation and by the Hellman Foundation.


Story Source:

Materials provided by Stanford School of Engineering. Original written by Tom Abate. Note: Content may be edited for style and length.


Journal Reference:

  1. Emma Pierson, Camelia Simoiu, Jan Overgoor, Sam Corbett-Davies, Daniel Jenson, Amy Shoemaker, Vignesh Ramachandran, Phoebe Barghouty, Cheryl Phillips, Ravi Shroff, Sharad Goel. A large-scale analysis of racial disparities in police stops across the United States. Nature Human Behaviour, 2020; DOI: 10.1038/s41562-020-0858-1

