Brain Imaging Study Sheds Light On Moral Decision-Making

Date:
September 14, 2001
Source:
Princeton University
Summary:
In a study that combines philosophy and neuroscience, researchers have begun to explain how emotional reactions and logical thinking interact in moral decision-making. Princeton University researchers reported in the Sept. 14 issue of Science that they used functional magnetic resonance imaging (fMRI) to analyze brain activity in people who were asked to ponder a range of moral dilemmas.
The results suggest that, while people regularly reach the same conclusions when faced with uncomfortable moral choices, their answers often do not grow out of the reasoned application of general moral principles. Instead, they draw on emotional reactions, particularly for certain kinds of moral dilemmas.

The results also show how tools of neuroscience are beginning to reveal the biological underpinnings of the subtlest elements of human behavior, said Joshua Greene, a graduate student in philosophy who conducted the study in collaboration with scientists in the psychology department and the Center for the Study of Brain, Mind and Behavior.

"We think of moral judgments as so ethereal," said Greene. "Now we're in a position to start looking at brain anatomy and understanding how neural mechanisms produce patterns in our behavior."

The study focused on a classic set of problems that have fascinated moral philosophers for years because of the difficulty in identifying moral principles that agree with the way people react.

One dilemma, known as the trolley problem, involves a runaway trolley that is about to kill five people. The question is whether it is appropriate for a bystander to throw a switch and divert the trolley onto a spur on which it will kill one person and allow the five to survive.

Philosophers compare this problem to a second scenario, sometimes called the footbridge problem, in which a trolley is again heading toward five people, but there is no spur. Two bystanders are on a bridge above the tracks, and the only way to save the five people is for one bystander to push the other in front of the trolley, killing the fallen bystander.

Both cases involve killing one person to save five, but they evoke very different responses. People tend to agree that it is permissible to flip the switch, but not to push a person off the bridge. People in the study also followed this pattern. This distinction has puzzled philosophers who have not been able to find a hard and fast rule to explain why one is right and the other wrong. For each potential principle, there seems to be another scenario that undermines it.

One reason for the difficulty, said Greene, appears to be that the two problems engage different psychological processes -- some more emotional, some less so -- that rely on different areas of the brain.

"They're very similar problems -- they seem like they are off the same page -- but we appear to approach them in very different ways," said Greene.

Greene emphasized that the researchers were not trying to answer questions about what is right or wrong. Instead, given that people follow a pattern of behavior, the study seeks to describe how that behavior arises. In turn, a better understanding of how moral judgments are made may change our attitudes toward those judgments, Greene said.

The researchers conducted the study with two groups of nine people, who each answered a battery of 60 questions while undergoing fMRI scanning. The researchers divided the questions into personal and impersonal categories based on the general notion that the difference between the trolley and footbridge problems may have to do with the degree of personal involvement, and ultimately the level of emotional response.

Examples of impersonal moral dilemmas included a case of keeping money from a lost wallet and a case of voting for a policy expected to cause more deaths than its alternatives. The researchers also included non-moral questions, such as the best way to arrange a travel schedule given certain constraints and which of two coupons to use at a store.

The scanning consistently showed a greater level of activation in emotion-related brain areas during the personal moral questions than during the impersonal moral or non-moral questions. At the same time, areas associated with working memory, which has been linked to ordinary manipulation of information, were considerably less active during the personal moral questions than during the others.

The researchers also measured how long it took subjects to respond to the questions. In the few cases in which people said it is appropriate to take action in the personal moral questions -- like pushing a person off the footbridge -- they tended to take longer to make their decisions. These delays suggest that this subgroup of people was working to overcome a primary emotional response, the researchers said.

Taken together, the imaging and response time results strongly suggest that emotional responses influenced moral decision-making and were not just a coincidental effect, the researchers concluded.

Professor of psychology John Darley, a coauthor of the paper, said the result fits into a growing area of moral psychology that contends moral decision-making is not the strictly reasoned process it has long been believed to be. "Moral issues do not come to you with a sign saying 'I'm a moral issue; treat me in a special way,'" Darley said. Instead, they engage a range of mental processes.

Other coauthors on the paper are Brian Sommerville, a former research assistant now at Columbia University Medical School; Leigh Nystrom, a research scientist in psychology; and Jonathan Cohen, a professor of psychology at Princeton.

Cohen also is director of the University's newly established Center for the Study of Brain, Mind and Behavior, which houses the fMRI scanner used in the study, and which seeks to combine the methods of cognitive psychology with neuroscience.

"Measuring people's behavior has served psychology well for many years and will continue to do so, but now that approach is augmented by a whole new set of tools," said Cohen.

Brain imaging allows scientists to build a catalog of brain areas and their functions, which can then be cross-referenced with behaviors that employ the same processes, Cohen said. Eventually, this combination of behavioral analysis and biological neuroscience could inform questions in fields from philosophy to economics, he said.

The current study, he said, "is a really nice example of how cognitive neuroscience -- and neuroimaging in particular -- provide an interface between the sciences and the humanities."


Story Source:

Materials provided by Princeton University. Note: Content may be edited for style and length.


Cite This Page:

Princeton University. "Brain Imaging Study Sheds Light On Moral Decision-Making." ScienceDaily. ScienceDaily, 14 September 2001. <www.sciencedaily.com/releases/2001/09/010914074303.htm>.
