
A model to predict and quantify racism, sexism, and other unequal treatment

Researchers show direct connection between stereotypes and unequal treatment

Date:
September 12, 2018
Source:
University of California - Berkeley Haas School of Business
Summary:
A new paper cuts to the heart of messy social interactions with a set of computational models to quantify and predict unequal treatment.

When a Starbucks employee recently called the police on two black men who asked for a bathroom key but hadn't yet ordered anything, it seemed a clear-cut case of racial bias leading directly to unfair treatment. Many outraged white customers publicly contrasted it with their years of hassle-free, purchase-free Starbucks pit stops.

But from a scientific perspective, making a direct connection between people's biases and the degree to which they treat others differently is tricky. There are thousands of ways people stereotype different social groups -- whether it's assuming an Asian student is good at math or thinking an Irish colleague would make a good drinking buddy -- and with so many variables, it's incredibly challenging to trace how someone is treated to any one particular characteristic.

"There is a tendency for people to think of stereotypes, biases, and their effects as inherently subjective. Depending on where one is standing, the responses can range from 'this is obvious' to 'don't be a snowflake,'" said Berkeley Haas Assoc. Prof. Ming Hsu. "What we found is that these subjective beliefs can be quantified and studied in ways that we take for granted in other scientific disciplines."

A new paper published this week in the Proceedings of the National Academy of Sciences cuts to the heart of messy social interactions with a set of computational models to quantify and predict unequal treatment. Hsu and post-doctoral researcher Adrianna C. Jenkins -- now an assistant professor at the University of Pennsylvania -- drew on social psychology and behavioral economics in a series of lab experiments and analyses of field work. (The paper was co-written by Berkeley researcher Pierre Karashchuk and Lusha Zhu of Peking University.)

"There's been lots of work showing that people have stereotypes and that they treat members of different social groups differently," said Jenkins, the paper's lead author. "But there's quite a bit we still don't know about how stereotypes influence people's behavior."

It's more than an academic issue: University admission officers, for example, have long struggled with how to fairly consider an applicant's race, ethnicity, or other qualities that may have presented obstacles to success. How much weight should be given, for example, to the obstacles faced by African Americans compared with those faced by Central American immigrants or women?

While these are much larger questions, Hsu said the paper's contribution is to improve how researchers quantify and compare discrimination across different social groups -- a common challenge facing applied researchers.

"What was so eye-opening is that we found that variations in how people are perceived translated quantitatively into differences in how they are treated," said Hsu, who holds a dual appointment with UC Berkeley's Helen Wills Neuroscience Institute and the Neuroeconomics Lab. "This was as true in laboratory studies where subjects decided how to divide a few dollars as it was in the real-world where employers decided whom to interview for a job."

Rather than analyzing whether the stereotypes were justified, the researchers took stereotypes as a starting point and looked at how they translated into behavior with over 1,200 participants across five studies. The first study involved the classic "Dictator Game," in which a player is given $10 and asked to decide how much of it to give to a counterpart. The researchers found that people gave widely disparate amounts based on just one piece of information about the recipient (i.e., occupation, ethnicity, nationality). For example, people on average gave $5.10 to recipients described as "homeless," while those described as "lawyer" got a measly $1.70 -- even less than an "addict," who got $1.90.

To look at how stereotypes about the groups drove people's choices to pay out differing amounts, the researchers drew on an established social psychology framework that categorizes all stereotypes along two dimensions: those that relate to a person's warmth (or how nice they are seen to be), and those that relate to a person's competence (or how capable they are seen to be). These ratings, they found, could be used to accurately predict how much money people distributed to different groups. For example, "Irish" people were perceived as warmer but slightly less competent than "British," and received slightly more money on average.

"It turns out that, even though people are incredibly complex, these two factors were immensely predictive," Hsu said. "We found that people don't just see certain groups as warmer or nicer, but if you're warmer by X unit, you get Y dollars more." Specifically, the researchers found that disparate treatment results not just from how people perceive others, but from how they see others relative to themselves. In allocating money to a partner viewed as very warm, people were reluctant to offer them less than half of the pot. Yet with a partner viewed as more competent, they were less willing to end up with a smaller share of the money than the other person. For example, people were OK with having less than an "elderly" counterpart, but not less than a "lawyer."
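The "warmer by X unit, you get Y dollars more" relationship describes a linear mapping from the two stereotype dimensions to allocations. As a minimal sketch of that idea -- not the paper's actual model -- one can fit a least-squares regression of allocations on warmth and competence ratings. The group ratings and dollar amounts below are hypothetical, invented purely for illustration:

```python
# Minimal sketch of a linear warmth/competence model of allocations.
# All ratings and dollar amounts here are hypothetical, for illustration;
# the paper's actual models and data differ.
import numpy as np

# Hypothetical group ratings on a 1-5 scale: [warmth, competence]
ratings = np.array([
    [4.5, 2.0],   # e.g. seen as warm but less competent
    [2.0, 4.5],   # e.g. seen as cold but competent
    [3.5, 3.5],
    [4.0, 4.0],
])
# Hypothetical dollar allocations out of a $10 pot
given = np.array([5.5, 2.0, 4.0, 4.5])

# Design matrix with an intercept column; ordinary least squares
X = np.column_stack([np.ones(len(ratings)), ratings])
coef, *_ = np.linalg.lstsq(X, given, rcond=None)
b0, b_warmth, b_comp = coef

# Each unit of perceived warmth maps to b_warmth extra dollars
print(round(b_warmth, 2))   # positive: warmer groups are given more

# Predicted allocation for a new hypothetical group (warmth 4.2, competence 2.5)
new = np.array([1.0, 4.2, 2.5])
print(round(float(new @ coef), 2))
```

With these invented numbers the fitted warmth coefficient is positive, so the model predicts more money for groups rated warmer -- the qualitative pattern the article describes.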

It's one thing to predict how people behave in carefully controlled laboratory experiments, but what about in the messy real world? To test whether their findings generalized to the field, Hsu and colleagues examined whether their model could predict treatment disparities in two high-profile studies of discrimination. The first was a Canadian labor market study that found huge variation in job callbacks based on the perceived race, gender, and ethnicity of the names on resumes. Hsu and colleagues found that the perceived warmth and competence of the applicants -- stereotypes based solely on their names -- could predict the likelihood that an applicant received a callback.

They tried it again with data from a U.S. study on how professors responded to mentorship requests from students with different ethnic names and found the same results.

"The way the human mind structures social information has specific, systemic, and powerful effects on how people value what happens to others," the researchers wrote. "Social stereotypes are so powerful that it's possible to predict treatment disparities based on just these two dimensions (warmth and competence)."

Hsu says the model's predictive power could be useful in a wide range of applications, such as identifying patterns of discrimination across large populations or building an algorithm that can detect and rate racism or sexism across the internet -- work the authors are now pursuing.

"Our hope is that this scientific approach can provide a more rational, factual basis for discussions and policies on some of the most emotionally-fraught topics in today's society," Hsu said.


Story Source:

Materials provided by University of California - Berkeley Haas School of Business. Original written by Laura Counts. Note: Content may be edited for style and length.


Journal Reference:

  1. Adrianna C. Jenkins, Pierre Karashchuk, Lusha Zhu, Ming Hsu. Predicting human behavior toward members of different social groups. Proceedings of the National Academy of Sciences, 2018; 201719452 DOI: 10.1073/pnas.1719452115

Cite This Page:

University of California - Berkeley Haas School of Business. "A model to predict and quantify racism, sexism, and other unequal treatment." ScienceDaily. ScienceDaily, 12 September 2018. <www.sciencedaily.com/releases/2018/09/180912081224.htm>.
