

Facebook is failing to curb the rampant spread of medical disinformation on its platform, with misleading content generating an estimated 3.8bn views over the past year, according to a damning new report into the social media company.

The report from the US-based non-profit activism group Avaaz found that the spread of medical disinformation on Facebook far outstrips that of information from trustworthy sources, with the most popular “super spreader” sites receiving four times as many clicks as bodies such as the US Centers for Disease Control and Prevention and the World Health Organization.

The consumption of health misinformation on Facebook peaked in April, when the severity of the coronavirus pandemic was becoming clear to populations in the US and Europe, Avaaz said.

In that month alone, disinformation sites attracted an estimated 420m clicks to pages peddling hoaxes that included harmful information — such as supposed cures for Covid-19 — and meritless conspiracy theories targeting Microsoft co-founder and philanthropist Bill Gates. 

Just 16 per cent of the misleading or false articles analysed by Avaaz displayed a warning label from Facebook’s third-party fact-checkers. The report’s authors said 40 per cent of clicks were generated by just 10 leading misinformation websites.

Avaaz called for Facebook to notify users who have accessed material that fact-checkers have later disputed, and to do more to reduce the virality of information from sources known to be misleading.

“This infodemic will make the pandemic worse unless Facebook detoxifies its algorithm and provides corrections to everyone exposed to these viral lies,” said Avaaz campaign director Fadi Quran.

Facebook said it had not seen Avaaz’s findings, but argued that the conclusions did not reflect the steps it had taken to limit misinformation.

“Thanks to our global network of fact-checkers, from April to June, we applied warning labels to 98 million pieces of Covid-19 misinformation and removed seven million pieces of content that could lead to imminent harm,” Facebook told the FT.

“We’ve directed over two billion people to resources from health authorities and when someone tries to share a link about Covid-19, we show them a pop-up to connect them with credible health information.”

Nevertheless, the Avaaz report makes grim reading for Facebook chief executive Mark Zuckerberg, who pledged in July to throw Facebook’s moderating resources into tackling the so-called “infodemic”. Efforts include launching an information hub directing users to trusted information, while Mr Zuckerberg conducted a live interview with infectious disease expert Dr Anthony Fauci.

Facebook’s pledge to stamp out coronavirus-related misinformation that could lead to real world harm has put the company at odds with the White House. Earlier this month, Facebook took down a post from President Donald Trump’s campaign team that suggested children were “almost immune” to the coronavirus.

The White House called the removal “another display of Silicon Valley’s flagrant bias against this president”.


Facebook did receive some praise on Tuesday, however, for reacting quickly to a new video positioning itself as the sequel to “Plandemic”, a baseless conspiracy theory-laden film that attracted millions of views over more than a week before being removed by Facebook and other platforms.

By contrast, several websites hosting Plandemic 2 on Tuesday were blocked from being shared on Facebook not long after the new “documentary” was made available.

“This is a critical step in stopping the spread of deadly disinformation and I commend the platforms for taking a proactive role,” said Brandie Nonnecke, director of the CITRIS Policy Lab at UC Berkeley, which studies the effects of misinformation.

Health-related misinformation is not only a Facebook problem. In May, research from Newsguard, which monitors and rates news websites on trustworthiness, found dozens of posts on Twitter shared by high-profile accounts that flouted the social media group’s policies banning the promotion of questionable virus therapies and cures.



Copyright The Financial Times Limited 2020. All rights reserved.