Facebook deals another blow to transparency over disinformation

On August 3, Facebook suspended the accounts of researchers from New York University’s Ad Observatory, an initiative that tracks political ads on the social media platform. In a statement, the company said the researchers had violated the platform’s terms of service by “using unauthorized means to access and collect data from Facebook.” The company — which also blocked the researchers’ API access used to retrieve data from the platform — said the researchers had created a browser extension that evaded platform privacy features and scraped data including usernames and links to profiles.

The website for NYU’s Ad Observer plugin says the tool does not collect personal information; in a response to the announcement, NYU researcher Laura Edelson said the plugin only collects data from those who consent to use it, and the data is anonymized. Edelson added that the researchers’ accounts were shut down only after the team members informed Facebook they were investigating links between disinformation on the platform and the January 6 US Capitol insurrection.

Regardless of the reasons behind Facebook’s move, which is being widely criticized — including by Samuel Levine, acting director of the Bureau of Consumer Protection — it represents the latest sign that the platform might be uncomfortable with the transparency demanded by researchers and journalists, especially around the real-world impacts of mis- and disinformation. Last month, Kevin Roose of The New York Times published a bombshell article outlining Facebook’s efforts to limit the scope of CrowdTangle, a tool relied on by many researchers and journalists.

Cutting off NYU’s researchers and hampering CrowdTangle are elements of a broader trend, suggests NBC News’ Olivia Solon, who highlighted other examples of Facebook leadership’s reluctance to heed warnings about harmful issues on the platform, including racial bias, repeated violations of misinformation policies, and the polarization of users.

But despite the growing number of hurdles to research on Facebook, third-party work is still being done to gauge the impact of mis- and disinformation. A team of researchers from several universities published a report July 27 suggesting that those who get their news from Facebook are more likely to resist Covid-19 vaccines. The report used the comparatively old-school method of surveying users of the platform. — First Draft Staff

This article is from our daily briefing email newsletter. Subscribe for the key stories caught by our monitoring team each day, and be sure to check out our weekly briefing of the best misinformation reads.
