On August 3, Facebook suspended the accounts of researchers from New York University’s Ad Observatory, an initiative that tracks political ads on the social media platform. In a statement, the company said the researchers had violated the platform’s terms of service by “using unauthorized means to access and collect data from Facebook.” The company, which also blocked the API access the researchers used to retrieve data from the platform, said the researchers had created a browser extension that evaded platform privacy features and scraped data including usernames and links to profiles.
The website for NYU’s Ad Observer plugin says the tool does not collect personal information; in response to the announcement, NYU researcher Laura Edelson said the plugin collects data only from those who consent to use it, and that the data is anonymized. Edelson added that the researchers’ accounts were shut down only after the team informed Facebook it was investigating links between disinformation on the platform and the January 6 US Capitol insurrection.
Regardless of the reasons behind Facebook’s move, which is being widely criticized — including by Samuel Levine, acting director of the Bureau of Consumer Protection — it represents the latest sign that the platform might be uncomfortable with the transparency demanded by researchers and journalists, especially around the real-world impacts of mis- and disinformation. Last month, Kevin Roose of The New York Times published a bombshell article outlining Facebook’s efforts to limit the scope of CrowdTangle, a tool relied on by many researchers and journalists.
Cutting off NYU’s researchers and hampering CrowdTangle are elements of a broader trend, suggests NBC News’ Olivia Solon, who highlighted other examples of Facebook leadership’s reluctance to heed warnings about harmful issues on the platform, including racial bias, repeated violations of misinformation policies, and the polarization of users.
But despite the growing number of hurdles to research on Facebook, third-party work is still being done to gauge the impact of mis- and disinformation. A team of researchers from several universities published a report on July 27 suggesting that people who get their news from Facebook are more likely to resist Covid-19 vaccines. The report used the comparatively old-school method of surveying users of the platform. — First Draft Staff