How the brain perceives shades of gray
- Date: November 15, 2011
- Source: Rutgers University
- Summary: Groundbreaking research provides new insight into how the brain perceives color.
How the brain perceives color is one of its more impressive tricks: it keeps the perceived color of an object stable even as lighting conditions change.
Sarah Allred, an assistant professor of psychology at Rutgers-Camden, has teamed up with psychologists from the University of Pennsylvania on groundbreaking research that provides new insight into how this works.
Allred conducted the research with Alan L. Gilchrist, a professor of psychology at Rutgers-Newark, and professor David H. Brainard and post-doctoral fellow Ana Radonjic, both of the University of Pennsylvania. Their research will be published in the journal Current Biology.
"Although we recognize easily the colors of objects in many different environments, this is a difficult problem for the brain," Allred says. "For example, consider just the gray scale that goes from black to white. A white piece of paper in bright sunlight reflects thousands of times more light to the eye than a white piece of paper indoors, but both pieces of paper look white. How does the brain do this?"
The process of seeing an object begins when light reflected off that object hits the light-sensitive structures in the eye. The perception of an object's lightness (its shade on the black-to-white scale) depends on the object's reflectance: objects that appear lighter reflect a larger percentage of the incident light than those that appear darker.
Allred says the brain preserves the perceptual difference between black and white objects even when the illumination falling on them changes. If it did not, it would fail to distinguish shades of gray under different lighting.
In general, white objects reflect about 90 percent of the light that hits them, and black objects reflect about three percent, a ratio of 30-to-1, she explains.
"However, if you look at the intensities of light that enter the eye from a typical scene, like a field of lilies, that ratio is much higher, usually somewhere between 10,000-to-1 and a million-to-1," Allred says.
This happens because in addition to having objects with different reflectance, real "scenes" also have different levels of illumination. One example might be a shadowed area under a tree. Allred and her research colleagues wanted to determine how the brain maps a large range of light intensity onto a much smaller reflectance range.
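To see why the numbers diverge so sharply, note that the light reaching the eye from a matte surface is roughly the product of its reflectance and the illumination falling on it. The short sketch below works through that arithmetic with the reflectance figures quoted above; the specific illumination values for sunlight and shade are nominal, assumed numbers chosen only for illustration, not measurements from the study.

```python
# Rough numerical sketch (not from the paper): the luminance reaching the eye
# is approximately reflectance x illumination, so the range of intensities in
# a real scene can far exceed the 30-to-1 range of surface reflectances alone.

WHITE_REFLECTANCE = 0.90   # white surfaces reflect about 90% of incident light
BLACK_REFLECTANCE = 0.03   # black surfaces reflect about 3%

SUN_ILLUMINATION = 100_000.0   # arbitrary units; bright sunlight (assumed value)
SHADE_ILLUMINATION = 100.0     # deep shade, ~1,000x dimmer (assumed value)

def luminance(reflectance: float, illumination: float) -> float:
    """Light intensity reaching the eye from a matte surface (simplified model)."""
    return reflectance * illumination

reflectance_ratio = WHITE_REFLECTANCE / BLACK_REFLECTANCE
scene_ratio = (luminance(WHITE_REFLECTANCE, SUN_ILLUMINATION)
               / luminance(BLACK_REFLECTANCE, SHADE_ILLUMINATION))

print(f"Reflectance range alone: {reflectance_ratio:.0f}-to-1")       # 30-to-1
print(f"White in sun vs. black in shade: {scene_ratio:,.0f}-to-1")    # 30,000-to-1
```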
One long-time hypothesis is that the brain segments scenes into different regions of illumination and then uses ratio coding to decide what looks white.
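A minimal sketch of that idea is shown below. It assumes the segmentation into illumination regions has already been done, and it adopts the common convention that the highest luminance within a region is anchored to white; the luminance values and the mapping onto a reflectance-like scale are illustrative assumptions, not the authors' actual model.

```python
# Simplified sketch of ratio coding (illustrative only): within each region of
# common illumination, a surface's lightness is set by the ratio of its
# luminance to the highest luminance in that region, which is treated as white.

WHITE_REFLECTANCE = 0.90  # perceived white is mapped to ~90% reflectance

def ratio_coded_lightness(luminances_in_region, target_luminance):
    """Estimate a target's apparent reflectance from within-region luminance ratios."""
    anchor = max(luminances_in_region)     # the brightest patch is taken as white
    ratio = target_luminance / anchor      # luminance ratio of target to anchor
    return WHITE_REFLECTANCE * ratio       # map the ratio onto a reflectance scale

# Same ratios under very different illumination: ratio coding predicts
# identical lightness, which is how it explains stable perception.
sunlit_region = [9000.0, 4000.0, 300.0]   # arbitrary luminance units (assumed)
shaded_region = [9.0, 4.0, 0.3]

print(ratio_coded_lightness(sunlit_region, 300.0))  # 0.03 -> looks black
print(ratio_coded_lightness(shaded_region, 0.3))    # 0.03 -> also looks black
```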
To test this hypothesis, the researchers conducted an experiment in which participants viewed images with a very large range of light intensities: a 5x5 checkerboard of grayscale squares whose intensities were chosen at random to span the 10,000-to-1 range. Participants reported what shade of gray a target square looked like by selecting a match from a standardized gray scale.
If the visual system relied only on luminance ratios to determine surface lightness, then the ratio of checkerboard intensities that participants matched to the white and black ends of the gray scale should have equaled the reflectance ratio of those endpoints, about 100-to-1.
Instead, the researchers found that this ratio could be as much as 50 times higher, more than 5,000-to-1.
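Put as back-of-the-envelope arithmetic, using only the figures quoted above, the comparison looks like this:

```python
# Back-of-the-envelope version of the test (figures from the article):
# under pure ratio coding, the luminance ratio between checks matched to the
# white and black ends of the gray scale should equal the scale's own
# reflectance ratio.

GRAY_SCALE_RATIO = 100.0   # white-to-black reflectance ratio of the match scale
OBSERVED_RATIO = 5000.0    # luminance ratio participants' matches actually spanned

print(f"Predicted by ratio coding: {GRAY_SCALE_RATIO:.0f}-to-1")
print(f"Observed: {OBSERVED_RATIO:.0f}-to-1 "
      f"({OBSERVED_RATIO / GRAY_SCALE_RATIO:.0f}x larger)")
```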
"This research is important because we have falsified the ratio hypothesis, which is currently the most widely invoked explanation of how we perceive lightness," Allred says. "We also were able to reject several similar models of lightness. We were able to do this because we measured lightness in such high-range and relatively complex images."
She continues, "In addition, even though we used behavioral rather than physiological measures, our results provide insight into the neural mechanisms that must underlie the behavioral results."
A Philadelphia resident, Allred received her undergraduate degree from Brigham Young University and her graduate degree from the University of Washington. She is also conducting research on color memory and perception through a five-year grant from the National Science Foundation.
Story Source:
Materials provided by Rutgers University. Note: Content may be edited for style and length.
Journal Reference:
- Ana Radonjić, Sarah R. Allred, Alan L. Gilchrist, David H. Brainard. The Dynamic Range of Human Lightness Perception. Current Biology, 2011; DOI: 10.1016/j.cub.2011.10.013