Explainers

How To Be A Fact Checking Warrior On WhatsApp? New Study Sheds Light

A study has revealed that user-driven corrections of misinformation on WhatsApp can go a long way in slowing its spread.

By - Archis Chowdhury | 29 Jan 2020 5:41 PM IST

Ever wondered how you can contribute to combating misinformation on platforms like WhatsApp? A study found that, in the context of Indian WhatsApp users, user-driven corrections were effective in lowering people's belief in misinformation.

Sumitra Badrinathan from the University of Pennsylvania, Simon Chauchard from Leiden University and D. J. Flynn from IE University recently conducted a study titled "'I Don't Think That's True, Bro!' An Experiment on Fact-checking WhatsApp Rumors in India", which investigated the role of users in fact checking mis/disinformation on WhatsApp.

The study revealed that corrections to potentially misleading information on WhatsApp threads can reduce belief in the content of such messages, even when the corrections are unsophisticated (citing no source) and the identity of the correcting user is unknown.

The researchers recommended that WhatsApp create a "button" that lets users easily express doubt over claims made in the app. This would minimise the effort required to flag a message and thus help slow the dissemination of such messages.

"Our findings suggest that though user-driven corrections work, merely signaling a doubt about a claim (regardless of how detailed this signal is) may go a long way in reducing misinformation," Badrinathan said in a tweet.


The Study

The study was conducted by recruiting over 5,000 Hindi speakers through Facebook, who were exposed to nine different WhatsApp threads. Each thread (a screenshot of a WhatsApp conversation) included a claim made by an unknown user, citing pro-ruling party or anti-ruling party sources, followed by a response from another user. The response varied from a simple "thank you" (the control condition, with no correction) to an expression of simple disbelief to a fact check of the claim citing a source.

The subjects of the claims in the nine threads ranged from politics and health to sports, while the fact checks by the responding user cited one of five sources: AltNews, VishwasNews, Times of India, Facebook and WhatsApp.

(Image: Sample claims and corrective responses)

Existing literature on people's responses to fact checking initiatives in countries like the United States has found that motivated reasoning and partisanship are highly influential factors in whether a fact check is accepted.

However, in the context of WhatsApp users in India, the study found that motivated reasoning and partisanship played a smaller role in how users interacted with the claim and the fact check.

It also found that the sophistication of the correction (citing fact checks by media organisations) had little bearing on whether people believed the responding user. Rather, an expression of doubt or a counter-argument by a peer in a WhatsApp group was enough to lower belief in the initial claim.

The "Beacon" Of Doubt

The study argues that expecting users in real life to consistently counter claims made by their peers with sophisticated, detailed rebuttals would be unrealistic. The researchers suggested that a button-like feature be added to messengers like WhatsApp, allowing users to express doubt over a claim with a single click.

Also Read: Prejudice Leads To More Misinformation Than Low Tech Literacy: LSE Study

Last year, a similar suggestion was made by researchers at the London School of Economics, who conducted a WhatsApp-funded study in India to investigate the messenger's role in orchestrating and influencing mob violence around the country. They had suggested the addition of a "beacon"-like feature for users to flag potentially dangerous misinformation that may lead to violence.

