
The daily briefing

The best reads from around the web on disinformation and information disorder.

Why Instagram Is Worse for Body Image Than Facebook (Psychology Today)


The coronavirus pandemic has taught us many things, and high on the list must be people’s need for social contact. That is the reason social media has taken off so spectacularly: It provides a facsimile of real interaction, distorted through the lens of whatever interactions a platform prioritizes. But, like anything, there are downsides. Mental health and body image often top that list. A new study compared the impact of scrolling through Instagram to that of Facebook among more than 300 female college students, and found the selfie-filled feeds of Instagram had a greater negative effect. A BBC article today raises similar concerns over TikTok, which often features fitness addicts or slim, scantily clad teenagers among promoted posts. The recommendations for both are similar: greater moderation on the part of the platforms, and a more rounded selection of accounts to follow on the part of users.

Social Media Giants Support Racial Justice. Their Products Undermine It. (The New York Times)


Social media platforms have come under fire in the last few months, first for not stemming the tide of misinformation during the coronavirus pandemic, and now because their platforms can amplify racist commentators. Big Tech’s executives have proclaimed that “Black Lives Matter” and donated millions to charities fighting for racial equality, but have done little to ensure their platforms promote the same, says The New York Times’ Kevin Roose. Twitter, YouTube, and Facebook are used by far-right individuals to aggressively further their agenda, to the point where the feeds no longer reflect the majority’s perspectives. Most Americans support the protests, according to a recent poll, although this isn’t always obvious when scrolling social media. Rashad Robinson, the president of civil rights organization Color of Change, said the companies need to apply “anti-racist principles” to their products to change the way they rank information and amplify anti-racist voices. “I don’t think they can truly mean ‘Black Lives Matter’ when they have systems that put black people at risk,” he said.

Meeting the challenge of vaccine hesitancy (The Aspen Institute)


Amid the current global health crisis, vaccine hesitancy is one of the most pressing issues facing policymakers today. This report from the Sabin Vaccine Institute, produced in collaboration with the Aspen Institute, offers a holistic understanding of this growing phenomenon and sets out ambitious but practical action plans to counter it. In addition to a range of sociological and behavioral science insights, the report includes a section dedicated to understanding the role online misinformation plays in fueling vaccine hesitancy. This chapter, written by Renée DiResta, technical research manager at the Stanford Internet Observatory, and First Draft’s own Claire Wardle, details the dominant anti-vaccination narratives, the most effective types of content that help promote them, and the key technical features of social media networks that shape the dynamics of online vaccination information. The full report, “Meeting the Challenge of Vaccine Hesitancy,” is free to download.

Coronavirus: How a false rumour led to hate online (BBC News)


What do QAnon, Bill Gates, and a farm in Surrey, England, have in common? All three are linked to coronavirus conspiracy theories that take a tiny kernel of truth and spin it into a grand web of Machiavellian intrigue. The truth is much more boring, as this BBC investigation finds. The Pirbright Institute in Surrey does have a patent linked to coronavirus — which is a family of viruses, remember — but it is connected to research into a vaccine for chickens. It has nothing to do with SARS-CoV-2, the name of the new type of coronavirus that has swept the world. Nonetheless, staffers have faced months of abuse from online sources that have capitalized on the confusion. “When conspiracy theories start, people don’t realize there are humans on the end of social media accounts,” said the Institute’s head of communications. Telling the human stories of disinformation, like this one, can help convey its real-world effects.

Time to aim high: tackling disinformation during the pandemic (Euractiv)


Despite actions taken by platforms and governments alike to fight disinformation, the onslaught of hoaxes brought on by the pandemic has confirmed how much still needs to be done. This is why Juliane von Reppert-Bismarck of Lie Detectors, a European journalist-led news literacy initiative, is calling for a course correction. In a blog post for Euractiv, she identifies the problems with current policies targeting misinformation on social media. “To demand that Facebook increases its army of fact-checkers and YouTube stops recommending conspiracy theories, and to promise to pursue Chinese and Russian trolls, makes good headlines,” she said. But these framings tackle only part of the problem. Von Reppert-Bismarck advocates for a two-pronged approach: that actions taken to regulate the platforms be accompanied by media literacy programs. Crucially, there is still time for entities like the EU to increase their ambitions — “but it needs to start now,” von Reppert-Bismarck said.

Facebook Groups Are Destroying America (WIRED)


Facebook groups are among the most important spaces on the social web. Whether set as public or private, they can host dozens, if not thousands, of individual users and may range from international action collectives to hyper-local, community-based environments. And as any disinformation researcher will tell you, they are also among the most crucial battlegrounds of online disinformation. In this piece for WIRED, experts Nina Jankowicz and Cindy Otis point to recent studies of health and political disinformation to suggest that Facebook groups, as opposed to bots or Russian-bought digital ads, should be one of the key areas of concern ahead of this year’s US presidential election. Greater transparency regarding the ownership details and membership of these groups, as well as an end to Facebook’s “suggestions” algorithm, they argue, could help prevent an otherwise inevitable “earthquake” in late 2020.

Coronavirus misinformation, and how scientists can help to fight it (Nature)


Could scientists be the answer to fighting misinformation about the coronavirus? False medical claims have spread like wildfire on social media platforms, with everything from plant oils to hot water supposed to cure the virus. Scientists and medical professionals can help erase the misconceptions, and some organizations encourage them to do so. Fiona Fox, chief executive of the UK-based Science Media Centre, encourages scientists to get online and share their expertise: “Especially during a pandemic, when there’s a sea of misinformation, uncertainty and rumors circulating, the public needs to hear from scientists with deep expertise who really know what they’re talking about.” It’s a hard decision for health workers, already overburdened with the pressure to treat patients and find a cure, to get online to counter misinformation, but filling in some of the information voids and avoiding direct confrontation with hostile users or trolls could help stem the tide.

Facebook, YouTube usage linked to belief in coronavirus conspiracy theories, study finds (CNBC)


A new study has revealed that users of social media giants Facebook and YouTube are more likely to believe conspiracy theories about the coronavirus. The study claimed that 60 per cent of people who believe Covid-19 is linked to 5G got their information from YouTube, while more than half of those who believe “there’s no hard evidence Covid-19 exists” get their information from Facebook. Misinformation about the virus has had dangerous consequences, from mobile signal towers being set alight to individuals taking harmful, unscientific treatments. It is not surprising that social media platforms are linked to belief in conspiracy theories, considering how widely they are used to spread misinformation. Both YouTube and Facebook have said they are fighting the spread of false information on their platforms and removing misleading material.

You’re Living in the Golden Age of Conspiracy Theories (POLITICO)


2020 is the year conspiracy theories exploded across mainstream media, fueled by the panic and instability brought by the coronavirus. “This pandemic is ripe ground for conspiracy theories, precisely because a lot of the psychological elements that give rise to conspiracy theories are heightened: powerlessness and anxiety and uncertainty,” says conspiracy theory researcher Adam Enders. Now conspiracies and misinformation are increasingly being used as weapons for political gain in the United States. One study from 2018 concluded that Democrats were more likely to believe in conspiracies if they were attributed to Republicans, and vice versa. Because conspiracy theories are so “ill-defined” and “nebulous,” correcting them is especially tricky in a world where anyone can say anything on social media and have it amplified by other users.

A fake-news purveyor apparently invented a Seattle counterprotest — that could become real (The Washington Post)


The last US presidential election brought to light a host of new players in the disinformation arena. Since then, platforms, politicians, and members of the public have become more aware of the issue — but stories like this one show there’s still a long way to go. The Washington Post’s Philip Bump writes of recently rediscovering a purveyor of “fake news” that he exposed in 2016. The site, Prntly, went dark for years, and its owner claimed to be the victim of platform censorship. “Four years later, with another presidential election looming, Prntly is back and propagating false information once again,” writes Bump. Recently, it created a counterprotest event purportedly involving groups who later publicly denied involvement. It’s not clear how Prntly is now getting around the platforms’ misinformation rules, and Facebook declined to comment for this story. Regardless, Prntly’s resurrection could make an interesting case study for journalists and researchers covering misinformation around the election.