
The problem with solving problems

'Prevalence-induced concept change' causes people to redefine problems as those problems are reduced

Date:
June 28, 2018
Source:
Harvard University
Summary:
In a series of new studies, researchers show that as the prevalence of a problem is reduced, humans are naturally inclined to redefine the problem itself. The result is that as a problem becomes smaller, people's conceptualizations of that problem become larger, which can lead them to miss the fact that they've solved it.

Although it's far from perfect by virtually any measure -- whether poverty rates, violence, access to education, racism and prejudice or any number of others -- the world continues to improve. Why, then, do polls consistently show that people believe otherwise?

The answer, Daniel Gilbert says, may lie in a phenomenon called "prevalence induced concept change."

In a series of new studies, Gilbert, the Edgar Pierce Professor of Psychology, his post-doctoral student David Levari, and several other colleagues show that as the prevalence of a problem is reduced, humans are naturally inclined to redefine the problem itself. The result is that as a problem becomes smaller, people's conceptualizations of that problem become larger, which can lead them to miss the fact that they've solved it. The studies are described in a paper in the June 29 issue of Science.

"Our studies show that people judge each new instance of a concept in the context of the previous instances," Gilbert said. "So as we reduce the prevalence of a problem, such as discrimination for example, we judge each new behavior in the improved context that we have created."

"Another way to say this is that solving problems causes us to expand our definitions of them," he said. "When problems become rare, we count more things as problems. Our studies suggest that when the world gets better, we become harsher critics of it, and this can cause us to mistakenly conclude that it hasn't actually gotten better at all. Progress, it seems, tends to mask itself."

The phenomenon isn't limited to large, seemingly intractable social issues, Gilbert said. In several experiments described in the paper, it emerged even when participants were asked to look for blue dots.

"We had volunteers look at thousands of dots on a computer screen one at a time and decide if each was or was not blue," Gilbert said. "When we lowered the prevalence of blue dots, and what we found was that our participants began to classify as blue dots they had previously classified as purple."

Even when participants were warned to be on the lookout for the phenomenon, and even when they were offered money not to let it happen, the results showed they continued to alter their definitions of blue.
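
To make that mechanism concrete, here is a minimal simulation sketch in Python. It is not the authors' stimuli or model; the hue scale, cutoffs, and window size below are illustrative assumptions. It simply contrasts a judge with a fixed definition of "blue" against one who, as Gilbert describes, judges each new dot relative to the dots seen recently.

import random

# Illustrative sketch (assumptions, not the study's actual model): each dot has
# a hue from 0 (very blue) to 100 (very purple). A "fixed" judge uses a constant
# cutoff; an "adaptive" judge calls a dot blue whenever it is bluer than the
# average of recently seen dots, i.e., it judges each new instance in the
# context of the previous instances.

def make_dots(n, blue_prevalence):
    """Sample n dot hues; blue-ish dots cluster near 25, purple-ish near 75."""
    dots = []
    for _ in range(n):
        if random.random() < blue_prevalence:
            dots.append(random.gauss(25, 10))   # blue-ish dot
        else:
            dots.append(random.gauss(75, 10))   # purple-ish dot
    return dots

def classify(dots, adaptive, window=50, fixed_cutoff=50.0):
    """Return the cutoff in use at the end of the run and the fraction called blue."""
    recent = []
    called_blue = 0
    cutoff = fixed_cutoff
    for hue in dots:
        if adaptive and recent:
            cutoff = sum(recent) / len(recent)  # judge relative to recent context
        if hue < cutoff:
            called_blue += 1
        recent.append(hue)
        recent = recent[-window:]               # keep only the recent context
    return cutoff, called_blue / len(dots)

random.seed(1)
for prevalence in (0.5, 0.1):                   # then lower the prevalence of blue
    dots = make_dots(2000, prevalence)
    for adaptive in (False, True):
        cutoff, frac = classify(dots, adaptive)
        judge = "adaptive" if adaptive else "fixed"
        print(f"blue prevalence {prevalence:.1f} | {judge:8s} judge | "
              f"final cutoff {cutoff:5.1f} | fraction called blue {frac:.2f}")

With the fixed cutoff, the fraction of dots called blue tracks the true prevalence. With the context-relative rule, lowering the prevalence of blue drags the cutoff toward the purple end of the scale, so dots that would earlier have been called purple start being counted as blue, which is the expansion of the concept the studies describe.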

Another experiment showed similar results using faces. When the prevalence of threatening faces was reduced, people began to identify neutral faces as threatening.

Perhaps the most socially relevant of the studies described in the paper, Gilbert said, involved participants acting as members of an institutional review board, the committee that reviews research methodology to ensure that scientific studies are ethical.

"We asked participants to review proposals for studies that varied from highly ethical to highly unethical," he said. "Over time, we lowered the prevalence of unethical studies, and sure enough, when we did that, our participants started to identify innocuous studies as unethical."

In some cases, Gilbert said, prevalence-induced concept change makes perfect sense, as in the case of an emergency room doctor trying to triage patients.

"If the ER is full of gunshot victims and someone comes in with a broken arm, the doctor will tell that person to wait," he said. "But imagine one Sunday where there are no gunshot victims. Should that doctor hold her definition of "needing immediate attention" constant and tell the guy with the broken arm to wait anyway? Of course not! She should change her definition based on this new context."

In other cases, however, prevalence-induced concept change can be a problem.

"Nobody thinks a radiologist should change his definition of what constitutes a tumor and continue to find them even when they're gone," Gilbert said. "That's a case in which you really must be able to know when your work is done. You should be able to see that the prevalence of tumors has gone to zero and call it a day. Our studies simply suggest that this isn't an easy thing to do. Our definitions of concepts seem to expand whether we want them to or not."

Aside from the obvious questions they raise about how we might go about fixing problems both large and small, the studies also point to issues of how we talk about addressing those problems.

"Expanding one's definition of a problem may be seen by some as evidence of political correctness run amuck," Gilbert said. "They will argue that reducing the prevalence of discrimination, for example, will simply cause us to start calling more behaviors discriminatory. Others will see the expansion of concepts as an increase in social sensitivity, as we become aware of problems that we previously failed to recognize."

"Our studies take no position on this," he added. "There are clearly times in life when our definitions should be held constant, and there are clearly times when they should be expanded. Our experiments simply show that when we are in the former circumstance, we often act as though we are in the latter."

Ultimately, Gilbert said, these studies suggest that there may be a need for institutional mechanisms to guard against prevalence-induced concept change.

"Anyone whose job involves reducing the prevalence of something should know that it isn't always easy to tell when their work is done," he said. "On the other hand, our studies suggest that simply being aware of this problem is not sufficient to prevent it. What can prevent it? No one yet knows. That's what the phrase 'more research is needed' was invented for."


Story Source:

Materials provided by Harvard University. Note: Content may be edited for style and length.


Journal Reference:

  1. David E. Levari, Daniel T. Gilbert, Timothy D. Wilson, Beau Sievers, David M. Amodio, Thalia Wheatley. Prevalence-induced concept change in human judgment. Science, 2018; 360 (6396): 1465 DOI: 10.1126/science.aap8731

