
Missing the radiological forest for the trees

Clear abnormalities missed even by experienced radiologists

Date:
November 18, 2020
Source:
University of Utah
Summary:
Even experienced radiologists, when looking for one abnormality, can completely miss another. The results show that inattentional blindness can befall even experts.

There's a classic video demonstrating how our brains process information and allocate attention in which people bounce and pass basketballs and the viewer is asked to count the passes.

If you haven't seen it, go watch it here and then come back. Go ahead. I'll wait.

The experiment highlights a phenomenon called inattentional blindness. We can't pay attention to everything at once, so our brains have to filter information. In the situation in the video, the stakes were low. But what if inattentional blindness causes a radiologist, for example, to miss something obvious and serious?

A study from University of Utah researchers Lauren Williams, Trafton Drew and colleagues finds that even experienced radiologists, when looking for one abnormality, can completely miss another. The results, published in Psychonomic Bulletin & Review, show that inattentional blindness can befall even experts.

"Inattentional blindness reveals the limits of human cognition," says Williams, a recent U graduate and now a postdoctoral scholar at the University of California, San Diego, "and this research demonstrates that even highly trained experts are bound by the same machinery as everyone else."

"If even these experts miss these seemingly obvious findings," adds Drew, associate professor of psychology, "it suggests that this is something really critical we need to understand about how all of us perceive the world."

Missing the gorilla

By some estimates, medical errors, including missed radiological abnormalities, are the third leading cause of death in the United States. "We've known for a long time that many errors in radiology are retrospectively visible," Drew says. "This means if something goes wrong with a patient, you can often go back to the imaging for that patient and see that there were visible signs -- say, a lung nodule -- on something like a chest CT."

So, in 2013, Drew and colleagues conducted an experiment to understand how trained experts could miss those clear signs. In that study, the authors presented radiologists with chest computed tomography (CT) scans and asked them to look for lung cancer nodules. But the authors had also placed an image of a gorilla into the scan -- something that obviously doesn't belong in a lung. Drew and his colleagues found that 83% of radiologists did not notice the gorilla.

But that's a gorilla. How would the results be different, they wondered, if instead of a gorilla it was an abnormality that could plausibly come up on a CT scan?

So Williams, Drew and their colleagues from UCLA and Macquarie University set up another experiment. They asked 50 radiologists to evaluate seven chest CT scans for lung cancer, but this time the final scan included two clear abnormalities: a significant breast mass and a lymphadenopathy (an abnormal lymph node). Two-thirds of the radiologists did not notice the potentially cancerous mass. A third did not notice the lymphadenopathy.

"Like anyone that experiences inattentional blindness, I think many radiologists were simply surprised to learn they had missed something," says Williams, who administered the experiment. "Our intuition tells us that if something is fully visible, we'll detect it, but we've all experienced the feeling of missing important information that is retrospectively obvious when our attention is focused elsewhere."

Experience wasn't a factor in whether the radiologists noticed the abnormalities, the researchers found, suggesting that years of experience don't override these universal limits of cognition, and that missing the abnormalities isn't a reflection on a radiologist's competence or skill.

"It suggests that understanding the situation that led to the missed abnormality may be far more important than focusing on the experience of the individuals that missed it," Williams says.

Seeing the gorilla

In a subsequent experiment, however, the researchers asked radiologists to look at the same scan but to report on a broader range of abnormalities, rather than searching only for lung cancer nodules. This time, only 3% missed the mass and 10% missed the lymphadenopathy.

"There a huge amount of information in the ever-growing amount of data we gather on each patient, and what we actually notice depends very strongly on what you are looking for," Drew says. "Cataloging how often radiologists miss something in plain sight misses a really important piece of the puzzle: What were they looking for when they missed the thing in plain sight?"

"Our research demonstrates that focusing narrowly on one task may cause radiologists to miss unexpected abnormalities, even if those abnormalities are critical for patient outcomes," Williams adds. "However, focused attention is probably beneficial when the abnormalities match the radiologist's expectations." Any changes to clinical process would need to find the balance between the two, she says. Some possibilities might be a general assessment of a scan before looking for specific abnormalities, or using checklists to scan for commonly missed findings.

Would the use of artificial intelligence, which doesn't have the same cognitive limitations that humans do, resolve the problem of inattentional blindness? Not necessarily, Drew says. AI is only as good as its training and programming. Algorithms are good at finding narrowly defined abnormalities, he says, but not as good at detecting all possible findings on a scan.

"Radiologists might benefit from being thoughtful about what they are looking for rather assuming that if they see it they will perceive it," Drew says. "AI has in some ways, the same limitation: it's only going to be good at detecting what it has been taught to detect."

Williams says that advancements in radiological technology have produced increasingly clear medical imaging. "However, if radiologists frequently miss a large, clearly visible abnormality when their attention is focused on another task, it suggests that having a clear image is not enough."

Drew says the study can help us understand how we often find only what we're looking for.

"Everyone, even experts, can miss things that seem really obvious if we are not looking for them," he says. "If you've searched through your whole apartment for your phone, you might assume you would have noticed your keys during that search. Our research suggests a reason why you will probably have to search again specifically for the keys."


Story Source:

Materials provided by University of Utah. Original written by Paul Gabrielsen. Note: Content may be edited for style and length.


Journal Reference:

  1. Lauren Williams, Ann Carrigan, William Auffermann, Megan Mills, Anina Rich, Joann Elmore, Trafton Drew. The invisible breast cancer: Experience does not protect against inattentional blindness to clinically relevant findings in radiology. Psychonomic Bulletin & Review, 2020; DOI: 10.3758/s13423-020-01826-4

