
In visual searches, computer is no match for the human brain

Date:
July 17, 2012
Source:
University of California - Santa Barbara

You're headed out the door and you realize you don't have your car keys. After a few minutes of rifling through pockets, checking the seat cushions and scanning the coffee table, you find the familiar key ring and off you go. Easy enough, right? What you might not know is that this everyday task is one that computers -- despite decades of advancement and intricate calculations -- still can't perform as efficiently as humans: the visual search.

"Our daily lives are composed of little searches that are constantly changing, depending on what we need to do," said Miguel Eckstein, UC Santa Barbara professor of psychological and brain sciences and co-author of the recently released paper "Feature-Independent Neural Coding of Target Detection during Search of Natural Scenes," published in the Journal of Neuroscience. "So the idea is, where does that take place in the brain?"

A large part of the human brain is dedicated to vision, with different parts involved in processing the many visual properties of the world. Some parts are stimulated by color, others by motion, yet others by shape.

However, those regions tell only part of the story. What Eckstein and his co-authors wanted to determine was how we decide whether the target object we are looking for is actually in the scene, how difficult the search is, and how we know we've found what we wanted.

They found their answers in the dorsal frontoparietal network, a region of the brain that roughly corresponds to the top of one's head and is also associated with attention and eye movements. Earlier in the visual processing stream, regions stimulated by specific features like color, motion, and direction do much of the work of the search. In the dorsal frontoparietal network, however, activity is not confined to any specific features of the object.

"It's flexible," said Eckstein. Using 18 observers, an MRI machine, and hundreds of photos of scenes flashed before the observers with instructions to look for certain items, the scientists monitored their subjects' brain activity. By watching the intraparietal sulcus (IPS), located within the dorsal frontoparietal network, the researchers were able to note not only whether their subjects found the objects, but also how confident they were in their finds.

The IPS region would be stimulated even if the object was not there, said Eckstein, but the pattern of activity would not be the same as it would have been had the object actually been in the scene. This distinction held consistently, even though the 368 different objects the subjects searched for were defined by very different visual features. This, Eckstein said, indicates that the IPS did not rely on the presence of any fixed feature to determine the presence or absence of various objects. Other visual regions did not show this consistent pattern of activity across objects.
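The article does not detail the team's analysis pipeline, but the logic it describes resembles multivariate pattern classification: train a classifier on multivoxel activity patterns, then test whether it can decode target presence on held-out trials. Purely as an illustration, here is a minimal sketch in Python with scikit-learn, using simulated data in place of real fMRI recordings; every variable name and dimension below is hypothetical.

    # Hypothetical sketch of pattern classification on fMRI-like data.
    # Simulated voxel responses stand in for real IPS recordings; the
    # study's actual methods are in the Journal of Neuroscience paper.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_trials, n_voxels = 200, 50

    # Labels: 1 = target present in the scene, 0 = target absent.
    present = rng.integers(0, 2, size=n_trials)

    # Simulated voxel patterns: present trials add a shared multivoxel
    # signature, mimicking a consistent present-vs-absent difference.
    signature = rng.normal(size=n_voxels)
    X = rng.normal(size=(n_trials, n_voxels)) + np.outer(present, signature)

    # Cross-validated decoding: above-chance accuracy indicates the
    # region's activity pattern distinguishes present from absent.
    clf = LogisticRegression(max_iter=1000)
    scores = cross_val_score(clf, X, present, cv=5)
    print(f"Decoding accuracy: {scores.mean():.2f}")

In a toy like this, above-chance accuracy across many different target objects would be the analogue of the feature-independent signal the researchers report.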

"As you go further up in processing, the neurons are less interested in a specific feature, but they're more interested in whatever is behaviorally relevant to you at the moment," said Eckstein. Thus, a search for an apple, for instance, would make red, green, and rounded shapes relevant. If the search was for your car keys, the interparietal sulcus would now be interested in gold, silver, and key-type shapes and not interested in green, red, and rounded shapes.

"For visual search to be efficient, we want those visual features related to what we are looking for to elicit strong responses in our brain and not others that are not related to our search, and are distracting," Eckstein added. "Our results suggest that this is what is achieved in the intraparietal sulcus, and allows for efficient visual search."

For Eckstein and colleagues, these findings are just the tip of the iceberg. Future research will dig more deeply into the seemingly simple yet essential human ability to perform a visual search, and into how people use the layout of a scene to guide their searches.

"What we're trying to really understand is what other mechanisms or strategies the brain has to make searches efficient and easy," said Eckstein. "What part of the brain is doing that?"

The study's other authors are Tim Preston, Koel Das, Barry Giesbrecht, and first author Fei Guo, all of UC Santa Barbara.


Story Source:

Materials provided by University of California - Santa Barbara. Note: Content may be edited for style and length.


Journal Reference:

  1. Fei Guo, Tim J. Preston, Koel Das, Barry Giesbrecht, and Miguel P. Eckstein. Feature-Independent Neural Coding of Target Detection during Search of Natural Scenes. Journal of Neuroscience, 2012; 32 (28): 9499-9510. DOI: 10.1523/JNEUROSCI.5876-11.2012

