
Tech fixes can't protect us from disinformation campaigns

US should focus on the psychology of false beliefs

Date:
April 25, 2019
Source:
Ohio State University
Summary:
More than technological fixes are needed to stop countries from spreading disinformation on social media platforms like Facebook and Twitter, according to two experts. Policymakers and diplomats need to focus more on the psychology behind why citizens are so vulnerable to disinformation campaigns.

More than technological fixes are needed to stop countries from spreading disinformation on social media platforms like Facebook and Twitter, according to two experts.

Policymakers and diplomats need to focus more on the psychology behind why citizens are so vulnerable to disinformation campaigns, said Erik Nisbet and Olga Kamenchuk of The Ohio State University.

"There is so much attention on how social media companies can adjust their algorithms and ban bots to stop the flood of false information," said Nisbet, an associate professor of communication.

"But the human dimension is being left out. Why do people believe these inaccurate stories?"

Russia targeted American citizens during the 2016 election with posts on every major social media platform, according to reports produced for U.S. Senate investigators.

This is just one example of how some countries have distributed "fake news" to influence the citizens of rival nations, according to the researchers.

In an invited paper just released in The Hague Journal of Diplomacy, Nisbet and Kamenchuk, a research associate at Ohio State's Mershon Center for International Security Studies, discussed how to use psychology to battle these disinformation campaigns.

"Technology is only the tool to spread the disinformation," Kamenchuk said.

"It is important to understand how Facebook and Twitter can improve what they do, but it may be even more important to understand how consumers react to disinformation and what we can do to protect them."

The researchers, who are co-directors of the Mershon Center's Eurasian Security and Governance Program, discussed three types of disinformation campaigns: identity-grievance, information gaslighting and incidental exposure.

Identity-grievance campaigns focus on exploiting real or perceived divisions within a country.

"The Russian Facebook advertisements during the 2016 election in the United States are a perfect example," Nisbet said. "Many of these ads tried to inflame racial resentment in the country."

Another disinformation strategy is information gaslighting, in which a country is flooded with false or misleading information through social media, blogs, fake news, online comments and advertising.

A recent Ohio State study showed that social media has only a small influence on how much people believe fake news. But the goal of information gaslighting is not so much to persuade the audience as it is to distract and sow uncertainty, Nisbet said.

A third kind of disinformation campaign simply aims to increase a foreign audience's everyday, incidental exposure to "fake news."

State-controlled news portals, like Russia's Sputnik, may spread false information that sometimes is even picked up by legitimate news outlets.

"The more people are exposed to some piece of false information, the more familiar it becomes, and the more willing they are to accept it," Kamenchuk said. "If citizens can't tell fact from fiction, at some point they give up trying."

These three types of disinformation campaigns can be difficult to combat, Nisbet said.

"It sometimes seems easier to point to the technology and criticize Facebook or Twitter or Instagram, rather than take on the larger issues, like our psychological vulnerabilities or societal polarization," he said.

But there are ways to use psychology to battle disinformation campaigns, Kamenchuk and Nisbet said.

One way is to turn the tables and use technology for good. Online or social-media games such as Post-Facto, Bad News and The News Hero teach online fact-checking skills or the basic design principles of disinformation campaigns.

Because campaigns to spread false information often depend on stoking negative emotions, one tactic is to deploy "emotional dampening" tools. Such tools could include apps and online platforms that push for constructive and civil conversations about controversial topics.

More generally, diplomats and policymakers must work to address the political and social conditions that allow disinformation to succeed, such as the loss of confidence in democratic institutions.

"We can't let the public believe that things are so bad that nothing can be done," Kamenchuk said.

"We have to give citizens faith that what they think matters and that they can help change the system for the better."


Story Source:

Materials provided by Ohio State University. Original written by Jeff Grabmeier. Note: Content may be edited for style and length.


Journal Reference:

  1. Erik C. Nisbet, Olga Kamenchuk. The Psychology of State-Sponsored Disinformation Campaigns and Implications for Public Diplomacy. The Hague Journal of Diplomacy, 2019; 14 (1-2): 65 DOI: 10.1163/1871191X-11411019

