
Behavioral Science Can Be Used to Win War with Fake News

May 18, 2018

Fake news on screen (Image via www.vpnsrus.com)

Facebook CEO Mark Zuckerberg, after earlier denials, recently acknowledged his company’s responsibility in helping spread the enormous amount of fake news that plagued the 2016 election. Yet he offered no concrete details on what Facebook could do about it.

Fortunately, there’s a way to fight fake news that already exists and has behavioral science on its side: the Pro-Truth Pledge project.

I was part of a team of behavioral scientists that came up with the idea of a pledge as a way to limit the spread of misinformation online. Two studies that tried to evaluate its effectiveness suggest it actually works.

Fighting fake news

A growing number of American lawmakers and ordinary citizens believe social media companies like Facebook and Twitter need to do more to fight the spread of fake news – even if it results in censorship.

A recent survey, for example, showed that 56 percent of respondents say tech companies “should take steps to restrict false info online even if it limits freedom of information.”

This article by Gleb Tsipursky originally appeared at The Conversation, a Social Science Space partner site, under the title “War on fake news could be won with the help of behavioral science.”

But what steps they could take – short of censorship and government control – is a big question.

Before answering that, let’s consider how fake news spreads. In the 2016 election, for example, we’ve learned that a lot of misinformation was a result of Russian bots that used falsehoods to try to exacerbate American religious and political divides.

Yet the posts made by bots wouldn’t mean much unless millions of regular social media users chose to share the information. And it turns out ordinary people spread misinformation on social media much faster and further than true stories.

In part, this problem results from people sharing stories without reading them; they don’t realize they are spreading falsehoods.

However, 14 percent of Americans surveyed in a 2016 poll reported knowingly sharing fake news. This may be because research shows people are more likely to deceive others when it benefits their political party or other group to which they belong, especially when they see others from that group sharing misinformation.

Fortunately, people also have a behavioral tic that can combat this: We want to be perceived as honest. Research has shown that people’s incentive to lie decreases when they believe there is a higher risk of negative consequences, are reminded about ethics, or commit to behaving honestly.

That’s why honor codes reduce cheating and virginity pledges delay sexual onset.

Taking the pledge

That’s where the “pro-truth pledge” comes in.

Appalled by the misinformation that characterized both the U.S. elections and U.K. Brexit campaign, a group of behavioral scientists at The Ohio State University and the University of Pennsylvania, including me, wanted to create a tool to fight misinformation. The pledge, launched in December 2016, is a project of a nonprofit I co-founded called Intentional Insights.

The pledge aims to promote honesty by asking people to commit to 12 behaviors that research shows correlate with an orientation toward truthfulness. For example, the pledge asks takers to fact-check information before sharing it, cite sources, ask friends and foes alike to retract info shown to be false, and discourage others from using unreliable news sources.

So far, about 6,700 people and organizations have taken the pledge, including American social psychologist Jonathan Haidt, Australian moral philosopher Peter Singer, Media Bias/Fact Check and U.S. lawmakers Beto O’Rourke, Matt Cartwright and Marcia Fudge.

About 10 months after launching the pledge, my colleagues and I wanted to evaluate whether in fact it has been effective at changing behavior and reducing the spread of unverified news. So we conducted two studies comparing pledge-takers’ sharing on Facebook. To add a little outside perspective, we included a researcher from the University of Stuttgart who did not take part in creating the pledge.

In one study, we asked participants to fill out a survey evaluating how well their sharing of information on their own and others’ profile pages aligned with the 12 behaviors outlined in the pledge, both a month before and a month after they signed it. The survey revealed large and statistically significant changes in behavior, including more thorough fact-checking, a growing reluctance to share emotionally charged posts, and a new tendency to push back against friends who shared misinformation.

While self-reporting is a well-accepted methodology that emulates the approach of studies on honor codes and virginity pledges, it’s subject to the potential bias of subjects reporting desirable changes – such as more truthful behaviors – regardless of whether these changes are present.

So in a second study we got permission from participants to observe their actual Facebook sharing. We examined the first 10 news-relevant posts one month after they took the pledge and graded the quality of the information shared, including the links, to determine how closely their posts matched the behaviors of the pledge. We then looked at the first 10 news-relevant posts 11 months before they took the pledge and rated those. We again found large, statistically significant changes in pledge-takers’ adherence to the 12 behaviors, such as fewer posts containing misinformation and more posts citing sources.

Clarifying ‘truth’

The reason the pledge works, I believe, is because it replaces the fuzzy concept of “truth,” which people can interpret differently, with clearly observable behaviors, such as fact-checking before sharing, differentiating one’s opinions from the facts and citing sources.

The pledge we developed is only one part of a larger effort to fight misinformation. Ultimately, this shows that simple tools exist and can be used by Facebook and other social media companies to battle the onslaught of misinformation people face online, without resorting to censorship.

