
Psychologists increase understanding of how the brain perceives shades of gray

Date:
November 10, 2011
Source:
University of Pennsylvania
Summary:
People's eyes, nerves and brains translate light into electrochemical signals and then into an experience of the world around them. A close look shows that even seemingly simple tasks, like keeping a stable perception of an object's color in different lighting conditions or distinguishing black and white objects, are, in fact, very challenging. By way of a novel experiment, psychologists have now provided new insight into how the brain tackles this problem.

Vision is amazing because it seems so mundane. People's eyes, nerves and brains translate light into electrochemical signals and then into an experience of the world around them. A close look at the physics of just the first part of this process shows that even seemingly simple tasks, like keeping a stable perception of an object's color in different lighting conditions or distinguishing black and white objects, are, in fact, very challenging.

University of Pennsylvania psychologists, by way of a novel experiment, have now provided new insight into how the brain tackles this problem.

The research was conducted by professor David H. Brainard and post-doctoral fellow Ana Radonjić, both of the Department of Psychology in Penn's School of Arts and Sciences. They collaborated with Sarah R. Allred and Alan L. Gilchrist of Rutgers University's Department of Psychology.

Their research will be published in the journal Current Biology.

The process of seeing an object begins when light reflected off that object hits the light-sensitive structures in the eye. The perception of an object's shade of gray, or lightness, depends on the object's reflectance. Objects that appear lighter reflect a larger percentage of the light that hits them than those that appear darker; a white sheet of paper might reflect 90 percent of the incident light, while a black sheet of paper might reflect only 3 percent.

Interestingly, because illumination varies across a scene, the intensity of the light that a surface sends to an observer's eye does not, by itself, reveal the surface's lightness. Although it might seem counterintuitive, a black sheet of paper in direct sunlight might reflect thousands of times more light into a person's eyes than a white object in the shade. To determine the shade of gray of a paper, the brain must therefore do more than measure the absolute intensity of light entering the eye.
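To make the arithmetic concrete, here is a minimal Python sketch. The illumination values are illustrative assumptions, not figures from the study; only the 90 percent and 3 percent reflectances come from the example above.

    # Luminance reaching the eye is (roughly) illumination times reflectance.
    def luminance(illumination, reflectance):
        """Light intensity reflected toward the observer, arbitrary units."""
        return illumination * reflectance

    SUNLIGHT = 100_000.0  # direct sun; illustrative value, not from the study
    SHADE = 100.0         # deep shade; illustrative value, not from the study

    white_in_shade = luminance(SHADE, 0.90)   # white paper, ~90% reflectance
    black_in_sun = luminance(SUNLIGHT, 0.03)  # black paper, ~3% reflectance

    print(white_in_shade)  # 90.0
    print(black_in_sun)    # 3000.0 -- the black paper sends ~33x more light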

"The amazing fact about our brains is that they deliver a perception of objects that is stable over the huge range of light that gets to our eyes. We want to know how the brain takes the amount of light that gets to the eye and turns it into a perception that depends on the object rather than on that total amount of light," Brainard said. "If the brain couldn't do that, objects wouldn't have a stable appearance, and it would be a disaster."

One of the puzzling aspects of this capability is that the range of the reflectance of objects is relatively small, especially compared to the range of light intensities in images coming from the world. In the earlier example, the white paper is only 30 times as reflective as the black paper, but the absolute amount of light that they actually reflect can vary by a much greater degree.

"Within one snapshot, the intensity of light coming of the brightest portion of the image could be a million times greater than that coming from the darkest portion," Radonjić said. "The question is how does the visual system map the huge range of intensities within a single image onto the much smaller but meaningful range of surface lightnesses."

Indeed, it is this mismatch of ranges that presents one of the fundamental perceptual challenges for the brain. If it picked the lightest part of an image as "white" and a shade 30 times dimmer as "black," preserving the reflectance range, then the tremendous number of shades darker still would be indistinguishable from one another.
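A small sketch makes the failure mode visible. The anchoring rule below, which pins "white" to the brightest image intensity and clips everything more than 30 times dimmer to "black," is a hypothetical illustration, not a model proposed by the authors:

    # Hypothetical anchoring rule: "white" at the image maximum, with a
    # floor at 1/30 of it. Everything dimmer than the floor clips to black.
    def lightness_ratio_rule(intensity, image_max, reflectance_range=30.0):
        ratio = intensity / image_max    # 1.0 at the brightest point
        floor = 1.0 / reflectance_range  # below this, everything looks black
        return max(ratio, floor)

    image_max = 10_000.0
    for intensity in (10_000.0, 1_000.0, 333.0, 10.0, 1.0):
        print(intensity, round(lightness_ratio_rule(intensity, image_max), 4))
    # 10.0 and 1.0 both land on the 1/30 floor: a tenfold difference in
    # intensity becomes invisible under this rule.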

One hypothesis is that the brain works around this problem by segmenting the image into separate regions of illumination, thereby reducing the range of luminance it must compare.

"If you can get all of the surfaces to be in the same region of illumination, then the reflectance range and luminance range will match, allowing the visual system to use within-region ratios to estimate surface lightness," Radonjić said.

To test whether this is indeed the mechanism at work, the researchers decided to push the limits of the visual system. They conducted an experiment in which participants viewed images that, like real-world images, had a very large range of light intensities -- as large as 10,000 to 1. Unlike natural images, however, these images did not contain any cues that would allow the visual system to segment them into separate regions of illumination.

To perform the experiment, the research team built a custom high-dynamic-range display. Participants were asked to look at a 5x5 checkerboard composed of grayscale squares with random intensities spanning the 10,000-to-1 range and to report what shade of gray a target square looked like by selecting a match from a standardized gray scale.
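A sketch of how such a stimulus might be generated follows. Sampling intensities uniformly in log space is an assumption for illustration; the release does not describe the study's exact sampling procedure:

    import math
    import random

    # A 5x5 checkerboard whose square intensities span a 10,000:1 range.
    LOW, HIGH = 1.0, 10_000.0

    def random_intensity():
        # Sample uniformly in log space so every decade is represented.
        return math.exp(random.uniform(math.log(LOW), math.log(HIGH)))

    checkerboard = [[random_intensity() for _ in range(5)] for _ in range(5)]
    target = checkerboard[2][2]  # e.g., take the center square as the target
    print(f"target intensity: {target:.1f} (range {LOW:.0f}:{HIGH:.0f})")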

If the visual system relied only on ratios to determine surface lightness, then the range of checkerboard intensities that participants matched to the ends of the gray scale should have had the same ratio as the black and white samples on that scale, about 100 to 1. Instead, the researchers found that this ratio could be as much as 50 times higher, more than 5,000 to 1.
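In numbers, using only the figures quoted above:

    # Ratio-hypothesis prediction vs. the observed matching range.
    scale_ratio = 100.0       # white:black ratio of the matching gray scale
    observed_ratio = 5_000.0  # intensity range participants still resolved
    print(observed_ratio / scale_ratio)  # 50.0 -- the factor reported above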

"We're pushing this visual system beyond the limit we think it normally has to deal with, and because people can still make discernments in this situation it means that the ratio hypothesis is not the only one that's at work. Our experiment blows that out of the water," Brainard said. "What seems to happen instead is that the visual system takes that huge intensity range and maps it gracefully onto grayscale values in a way that preserves one's ability to discern between shades across high ranges of light intensities."

While the experiment doesn't reveal the actual mechanism behind the brain's ability to reconcile the mismatch in ranges, it does suggest new avenues of vision research in both psychology and biology. Further experiments may show how these discernments are made, why the eyes and brain are able to keep making them even in situations beyond what can be encountered in the real world, and how the phenomena demonstrated in this experiment operate along with other visual mechanisms for images that incorporate more of the richness of the real world.

The research was supported by the National Institutes of Health and the National Science Foundation.


Story Source:

Materials provided by University of Pennsylvania. Note: Content may be edited for style and length.


Journal Reference:

  1. Ana Radonjić, Sarah R. Allred, Alan L. Gilchrist, David H. Brainard. The Dynamic Range of Human Lightness Perception. Current Biology, 2011; DOI: 10.1016/j.cub.2011.10.013

