Facebook removed 20 million Covid-19 misinformation posts. Is it enough?



On Wednesday, Facebook announced that between April and June it had removed 20 million posts containing Covid-19 misinformation. The platform also said warning labels had been added to more than 190 million Covid-19-related posts. The data was released as part of the platform’s Community Standards Enforcement Report, which, as of this past Wednesday, will be accompanied each quarter by a Widely Viewed Content Report.

Also on Wednesday, Facebook outlined in a blog post the steps it is taking against vaccine misinformation “superspreaders.” These included removing over three dozen Pages, Groups and Facebook or Instagram accounts linked to the “Disinformation Dozen,” even as the company pushed back against the Center for Countering Digital Hate report claiming that 12 individuals were responsible for 65 per cent of vaccine misinformation on Facebook. (First Draft has also pointed out some limitations of the CCDH report, which had been cited by the White House and the US Surgeon General amid the government’s push against Covid-19-related misinformation.)

While Facebook’s efforts appear to mark big steps and significant numbers, it’s hard to put them into context. This week, a feature in Vox’s Recode explored just how bad Facebook’s vaccine misinformation problem is, but like many pieces before it, couldn’t pin down an answer. “That’s in large part because Facebook isn’t giving researchers enough of the real-time data they need to figure out exactly how much Covid-19 misinformation is on the platform, who’s seeing it, and how it’s impacting their willingness to get vaccinated,” wrote Shirin Ghaffary.

Ghaffary is not the first to raise concerns about Facebook’s transparency. First Draft recently wrote about Facebook’s decision to suspend the accounts of researchers from New York University’s Ad Observatory, and last month it was reported that in April, Facebook gutted CrowdTangle, a tool widely used by misinformation experts.

Yet even if experts are unable to quantify the full extent of Covid-19 misinformation on the platform, there are significant signs of its real-world impact. A July report from the Covid States Project — a collaborative effort by researchers from several universities — found that Facebook news consumers were less likely to get vaccinated than Fox News audiences. With cases, hospitalizations and deaths surging in many parts of the US, the benefits of vaccination — and the harm from anti-vaccine content — are hard to overstate. — First Draft staff

This article is from our daily briefing email newsletter. Subscribe for the key stories caught by our monitoring team each day, and be sure to check out our weekly briefing for the best misinformation reads.
