Health-related misinformation garnered an estimated 3.8 billion views on Facebook between May 2019 and May 2020, peaking in April 2020, at the height of the COVID-19 pandemic, according to a new report by advocacy group Avaaz.
The report, titled "Facebook's Algorithm: A Major Threat to Public Health", also found that the top 10 health misinformation-spreading websites drew four times as many views as equivalent content on the websites of the top 10 leading health institutions, such as the World Health Organisation (WHO) and the Centers for Disease Control and Prevention (CDC). Despite such content being fact-checked, only 16% of the misinformation analysed carried a warning label, while the remaining 84% went without any warning, the report said.
According to Avaaz, the findings reveal that Facebook's efforts to minimise the spread of health-related misinformation were outpaced by the amplification of such content by Facebook's own algorithm.
BOOM has reached out to Facebook for comment, and the article will be updated if a response is received.
The Problematic Algorithm
Earlier this year, amid accusations that the platform was being used to spread dangerous health misinformation in the middle of a global health crisis, Facebook partnered with several governmental and non-governmental health organisations to weed out misinformation from its platforms, and promised to keep people safe and informed.
However, Fadi Quran, campaign director for Avaaz, told news agency Reuters that Facebook's own algorithm is undoing these efforts and posing a major threat to public health. "Mark Zuckerberg promised to provide reliable information during the pandemic, but his algorithm is sabotaging those efforts by driving many of Facebook's 2.7 billion users to health misinformation-spreading networks," Quran told the news agency.
A 2018 study by Oxford researchers identified the issue with Facebook's algorithm: it is designed to maximise user engagement and the time users spend on the platform. A Wall Street Journal article from May 2020 reported that a team of Facebook employees had warned executives that the algorithm was driving people apart and promoting divisiveness to keep users on the platform longer.
Avaaz believes that this gives an advantage to "the emotive, divisive content that characterises health misinformation".
Public Pages - A Fake News Menace?
The advocacy group analysed 82 websites spreading health-related misinformation (flagged as untrustworthy by NewsGuard), along with 42 'superspreader' Facebook pages, spanning at least five countries: the United States, the UK, France, Germany, and Italy. Between May 28, 2019 and May 27, 2020, this content generated an estimated 130 million interactions, equivalent to an estimated 3.8 billion views. The report also noted that the sample analysed does not reveal the full scale of health-related misinformation on the platform.
The report found that public Facebook pages accounted for 43% of all views of the top health misinformation-spreading websites identified by Avaaz, and that the top 42 such pages alone generated an estimated 800 million views.
Avaaz also looked for political affiliations using NewsGuard's ratings, and found that while 61% of the websites showed no clear ideological affiliation, 25.6% had far-right affiliations, making the far right the largest political orientation represented among the websites analysed.
Solutions
The report gave a two-step method of fixing the situation: provide independently fact-checked corrections to all users who have come across such misinformation, and detect and downgrade, in the news feed, both the offending posts and the accounts that systematically post health-related misinformation.
It states that these steps would reduce belief in misinformation by almost 50%, and cut its reach by up to 80%.
Read the full report here.