What Facebook gutting CrowdTangle means for misinformation


This week, news broke that in April Facebook reassigned dozens of employees at CrowdTangle, a social media monitoring tool it bought in 2016, in what may be an attempt to suppress insights about misinformation on social media.

The news is a worrying development for the field. CrowdTangle has enabled and shaped years’ worth of coverage of misinformation, especially since Facebook acquired it. The tool offers unique access to trending topics, public accounts and communities, and viral posts on Facebook, Instagram and Reddit that would otherwise be largely inaccessible. It’s a tool that we at First Draft regularly train journalists and researchers to use.

But CrowdTangle has always come with several caveats. The first is that it only shows publicly available information on Facebook. Most activity on the platform is private, so we only see a partial view through CrowdTangle.

The other is that CrowdTangle isn’t independent. As First Draft has written about before, platforms’ interests are baked into their analytics tools. Facebook has a history of fiddling with its data, which has always raised serious questions about our dependence on it as misinformation experts. Experts are also constrained by what features CrowdTangle does and does not offer, in ways that can steer attention and analysis. For example, users can’t filter for fact checked or labeled posts, which would help identify misinformation. If CrowdTangle had been built by and for misinformation researchers, it would look very different.

The news is also fueling speculation that CrowdTangle may be headed for closure. This wouldn’t be the first time Facebook has abruptly shut down a critical research tool: In June 2019, it suddenly removed Graph Search, which allowed researchers and activists to locate and verify eyewitness footage on the platform, and to document human rights abuses. When Facebook shut down the feature, many techniques and tools built on it shut down as well. With CrowdTangle, researchers once again depend on an analytics tool that Facebook has the power to eliminate on a whim.

Journalists and researchers should consider ways to move toward purpose-built research tools over which platforms have minimal influence, and policymakers should find ways to fund them. This might sound ambitious, but the model has already been demonstrated by The Markup’s Citizen Browser and the Open Intelligence Lab’s 4CAT, a powerful analytics tool funded by the European Research Council.

In the end, though, misinformation experts are in a bind. CrowdTangle remains an immensely useful tool. That leaves us with difficult questions. How can we account for what it prevents us from seeing? How do we reduce our dependence on it? And when should we stop using it altogether? — Tommy Shane

This article is from our daily briefing email newsletter. Subscribe for the key stories caught by our monitoring team each day, and be sure to check out our weekly briefing of the best misinformation reads.
