
A Good Lie Detector Is Hard To Find: 'Spin' And Fact Omission Leave No Neuro-trace

Date:
February 19, 2007
Source:
Massachusetts Institute of Technology
Summary:
In the not-too-distant future, police may request a warrant to search your brain. This was said only partly in jest by one of the panelists at a Feb. 2 symposium, "Is There Science Underlying Truth Detection?" sponsored by the American Academy of Arts and Sciences and the McGovern Institute for Brain Research at MIT.

In the not-too-distant future, police may request a warrant to search your brain.

This was said only partly in jest by one of the panelists at a Feb. 2 symposium, "Is There Science Underlying Truth Detection?" sponsored by the American Academy of Arts and Sciences and the McGovern Institute for Brain Research at MIT.

The symposium explored whether functional magnetic resonance imaging (fMRI), which images brain regions at work, can detect lying. "There are some bold claims regarding the potential to use functional MRI to detect deception, so it's important to learn what is known about the science," said Emilio Bizzi, president of the American Academy of Arts and Sciences, an investigator at MIT's McGovern Institute for Brain Research and one of the organizers of the event.

The criminal justice system, counter-terrorists and even parents of adolescents would potentially benefit from accurate lie detection, but the risks and costs could be enormous, said panelist Henry Greely, professor of law and genetics at Stanford University. "Lie detection is being sold today and may well be being used today by our government here and overseas," he said. "There is little to no regulation of this technology."

At least two start-up companies claim to be able to use fMRI to detect deception. The companies plan to market their services to law enforcement and immigration agencies, the military, counterintelligence groups, foreign governments and corporations that want to screen applicants.

Not so fast, said panelist Nancy Kanwisher, the Ellen Swallow Richards Professor of Brain and Cognitive Sciences at MIT and an investigator at the McGovern Institute.

While her own research shows that you can accurately predict from fMRI data whether a person is thinking about a place or a face, Kanwisher thinks that neither the technology nor our understanding of the brain is sufficiently advanced to tell when a person is lying.

In 2005, two separate teams of researchers announced that their algorithms had been able to reliably identify "neural signatures" that indicated when a subject was lying. But the research, conducted on only a handful of subjects, was flawed, Kanwisher said.
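For readers who want a concrete sense of what such a "neural signature" analysis involves, the sketch below is a purely illustrative Python example of the general approach: train a pattern classifier on brain-activity data and estimate its accuracy by cross-validation. The subject count, voxel counts and data are invented for illustration and are not taken from the studies discussed here.

# Purely illustrative sketch of an fMRI-style pattern classification analysis.
# All numbers and data here are synthetic; this is not the method from the
# 2005 studies, only the general kind of approach they represent.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)

n_subjects = 20   # a "handful" of subjects, as in the studies Kanwisher criticized
n_voxels = 500    # simulated voxel activations per averaged trial

# Simulate activation patterns for "truth" and "lie" conditions,
# with only a weak difference between the two.
signal = 0.3 * rng.standard_normal(n_voxels)
truth = rng.standard_normal((n_subjects, n_voxels))
lie = rng.standard_normal((n_subjects, n_voxels)) + signal

X = np.vstack([truth, lie])
y = np.array([0] * n_subjects + [1] * n_subjects)

# Leave-one-out cross-validation: with so few samples, accuracy estimates
# are noisy, which is one reason small studies can look better than they are.
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"Cross-validated accuracy: {scores.mean():.2f}")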

Subjects were asked to lie about whether they were holding a certain card or whether they had "stolen" certain items. These are not actually lies, she pointed out, because subjects were asked to make such statements. "What does this have to do with real-world lie detection? Making a false response when instructed isn't a lie.

"In the real world, the stakes are higher. If a subject believes the fMRI could be used against him or send him to prison, this would cause extreme anxiety whether the individual is guilty or not guilty," potentially influencing their patterns of brain activity, she said.

In addition, the subject may not want to cooperate. "FMRI results are garbage if the subject is moving even a little bit. A subject can completely mess up the data by moving his tongue in his mouth or performing mental arithmetic," she said. Testing also poses problems. To ensure accurate results, fMRI lie detection would have to be tested on a wide variety of people, some guilty and some innocent, who believe the data would have real consequences for their lives. The work would also need to be published in peer-reviewed journals and replicated by researchers without conflicts of interest.

In short, Kanwisher said, "There's no compelling evidence fMRIs will work for lie detection in the real world."

On one of the two panels, Marcus E. Raichle, a professor at Washington University School of Medicine, said that fMRI has been put forward as a more accurate version of the polygraph test. But Raichle, who contributed to a 2003 National Academy of Sciences report on the polygraph, said he hopes proponents of fMRI deception testing won't "make the same mistakes and fall in the same traps that the polygraph has fallen in."

The polygraph, which measures lie-associated stress through accelerated heart rate, rapid breathing, rising blood pressure and increased sweating, is considered unreliable in scientific circles. Sociopaths who don't feel guilt and people who learn to inhibit their reactions to stress can outwit a polygraph, and negative results can often be attributed to coercion by the administrator.

Nevertheless, polygraphs are given to more than 40,000 people a year by the federal government and employers. The polygraph has been evaluated in "poorly controlled studies with little testing of its validity. It is unacceptable for this kind of screening," Raichle said. "We do need better ways to screen for deception and truth telling."

Panelist Elizabeth Phelps, professor of psychology and neural science at New York University, said fMRI is not likely to provide a more accurate screen. Regions of the brain dedicated to forming and retrieving memories can't tell imagined memories from real occurrences. "We may not be able to differentiate those things that are imagined or rehearsed from what actually happened," she said.

The United States should ban any use of a lie detection method until it is deemed safe and effective to the satisfaction of a watchdog government agency such as the Food and Drug Administration, Greely said.

"We need a consciously well thought-out federal statute saying no lie detection technique may be used unless proven safe and effective in a variety of ways--with trials of thousands of people, not 35 to 65 people. This includes tests on mentally ill people, nondepressives, people who have had a drink, people who have not had a drink. We need broad tests on broad numbers of people and broad sorts of lies. We need to act today to begin to ban" anything short of such a product, Greely said.

Panelist Jed Rakoff, a United States district judge in New York, said that neuroscience may eventually have a significant impact on the legal system, but in the immediate future it is "much more likely to cause mischief than be of real help." While as many as 90 percent of witnesses lie, the current method of cross-examination is considered an effective lie-detection tool. And fMRI could do little to ferret out what Rakoff called the "biggest form of lying--omission. The practiced liar doesn't tell falsehoods, just omits key facts that would give a different spin."


Story Source:

Materials provided by Massachusetts Institute of Technology. Original written by Deborah Halber, News Office Correspondent. Note: Content may be edited for style and length.


