
3 lessons on the coronavirus ‘infodemic’ from experts and tech companies

The UK Parliament’s select committee for digital, culture, media and sport, where elected representatives grill the rich and powerful over matters of public interest, has played host to numerous dramatic scenes in recent years. This is where the details of the Cambridge Analytica scandal played out as part of the committee’s inquiry into disinformation and ‘fake news’, and where some of the myriad ways political actors manipulate social media to influence voters were brought to light for a mainstream audience.

As part of its ongoing inquiry into online harms and disinformation, the committee summoned assorted experts and technology company representatives on Thursday April 30 to answer questions on the subject of misinformation and the coronavirus pandemic.

First Draft’s co-founder and US director Dr Claire Wardle, as well as Professor Philip Howard, the director of the Oxford Internet Institute, and Stacie Hoffmann, a digital policy consultant at Oxford Information Labs, were asked for their opinions before representatives from Google, Facebook and Twitter faced questions.

We’ve picked out some of the key lessons and talking points from the session.

Understanding the psychology behind coronavirus-related misinformation is crucial

To reduce the spread of misinformation relating to the coronavirus, it’s important to understand what’s driving it. And the answer lies somewhere in human psychology. 

“We have to recognise that there’s a lot of misinformation, and people are becoming nodes for this because they’re scared,” Dr Wardle told the politicians. 

When there is a constant flow of new coronavirus reports, she said, people often share rumours and unverified information with family and friends “just in case”.

“That dynamic is critical to how we’re seeing people respond to the pandemic,” she added.

The outbreak has prompted a slew of viral messages and hoaxes on WhatsApp and other messaging services claiming that various authorities are on the verge of announcing a complete lockdown, or that dubious practices like inhaling hot air can cure the coronavirus.

Dr Wardle also noted that we’re not just seeing an increase in people sharing unverified information but also an uptick in conspiracy theories. 

Whether it’s claims that the virus is a man-made bioweapon, doubts around a potential vaccine, or 5G theories that have led to the burning of mobile phone masts, these fringe theories can have serious consequences.

“It’s easy to dismiss conspiracies, but we have to understand why they’re taking hold,” according to Dr Wardle. “There isn’t a good origin story for the virus, and so this information vacuum is allowing misinformation to circulate.

“The reason people attach themselves to conspiracies is because they are simple, powerful narratives. Right now, people are desperate for an explanation of what they’re going through.

“They feel out of control, and conspiracies give them control because it gives an explanation that they’re lacking.” 

Influencers can be a ‘gateway drug’

The subject of celebrities and influencers spreading misinformation online was raised repeatedly. While the 5G conspiracies had been gaining traction in online spaces since January, they reached new heights when a number of high-profile celebrities shared the theory in early April.

Professor Philip Howard of the Oxford Internet Institute said that, in some ways, public figures could be considered a “gateway drug” to misinformation.

“If a prominent Hollywood star or a prominent political figure says something that is not consistent with the science or the public health advice, some people will go looking for it and spread it.” 

Both Professor Howard and Stacie Hoffmann, from Oxford Information Labs, were asked about questions of responsibility when it comes to public figures sharing misinformation. Hoffmann suggested that guidelines may be needed to outline what standards are expected of high-profile accounts.

This topic came up again in the questions put to Katy Minshall, Twitter’s UK Head of Government, Public Policy and Philanthropy, particularly on the issue of users with blue ticks — the platform’s system for verifying the identity of a well-known figure.

“I can assure the committee that if any account — verified, or not — breaks any rules, they will be subject to enforcement action,” she said.

At the end of March, both Twitter and Facebook took action to remove misleading posts from Brazil’s president, Jair Bolsonaro. In a video, the president endorsed the antiviral drug hydroxychloroquine and encouraged an end to social distancing measures.

When asked about recent comments made by US President Donald Trump regarding the use of disinfectant as treatment, Minshall told the committee that Twitter had blocked the hashtag #InjectDisinfectant from trending. Videos of Trump making the statement, however, are allowed on the platform.

The platforms’ responses to the ‘infodemic’ raise questions about the future

Social media platforms have been stepping up measures to meet the unprecedented public health crisis, from changing their policies to tweaking their algorithms, and these changes have not gone unnoticed.

“Covid-19 is a different type of stress test for these platforms because we’re really starting to see what they can do,” said Hoffmann.

Since the outbreak, measures have included notifying users who have been exposed to debunked misinformation and collaborating with governments on chatbots that can respond to people’s questions.

Minshall told the politicians that Twitter had challenged 3.4 million accounts that had appeared to engage suspiciously in coronavirus-related conversations.

Google, Facebook and Twitter all mentioned their efforts to direct users off their platforms towards authoritative official sources on health matters.

“The number one ask we have heard from government and health organisations is that people need clear, concise information from one or two single sources about what to do,” said Alina Dimofte, Public Policy and Government Relations Manager at Google.

“What we want to do is to empower people with this information.”

While the expert witnesses also mentioned the importance of building trust in official sources, they proposed additional ways the platforms could help. 

Dr Wardle and Professor Howard pointed to the need for the platforms to be more transparent about the information they share with researchers.

“We’re right now in the middle of a natural experiment,” said Dr Wardle, “so what I would like to see is the platforms do more but then allow academics to test alongside them to see what the effects are.

“All of them are doing different things but what we’re lacking is transparency and oversight.”

It also remains to be seen whether the measures implemented in response to coronavirus misinformation will be permanent.

“It’ll be really interesting to see if they use these going forward — if they stay in place post-pandemic,” said Hoffmann.

Watch the full session here.

Stay up to date with First Draft’s work by becoming a subscriber and following us on Facebook and Twitter.