Scientists can’t predict the future, but they can help us prevent misinformation

Image: First Draft Montage / Photography: Pexels (Anna Shvets, Artem Podrez)

Exploring collaborative solutions between science communicators and journalists to help combat misinformation. 

First Draft’s APAC Director, Anne Kruger, moderated a panel in Australia exploring collaborative solutions from different stakeholders to combat misinformation. 

Promoting a more nuanced and sophisticated dialogue about misinformation has never been more important when it comes to science and public health communication. Case studies from not-too-distant history illustrate how vaccination rollouts can go horribly wrong at the intersection of government policy, media coverage and online misinformation. The example of Dengvaxia in the Philippines is a cautionary tale as governments, including Australia, make adjustments following advice on possible rare side effects of the AstraZeneca Covid-19 vaccine.

A virtual panel in Australia on March 31 brought together experts from the science communications, medical academia and fact-checking sectors. The opportunities from better collaboration among journalism, science, academia, platforms and government were quickly apparent. The panel considered the pressure on journalists to produce quick headlines, which is at odds with the scientific world’s more measured approach. But it is crucial for stakeholders to understand their differences and, most importantly, the needs of the public, and to work together to slow the spread of misinformation and increase the reach of facts.

Predicting the public’s questions before misinformation gives the answers

When there is a void of quality information to answer the public’s questions, it opens the door to misinformation. During the pandemic, people want answers so they can feel in control of their lives. Agents of disinformation, or those with an agenda, use these opportunities to swoop in and take advantage. Gail MacCallum, editor of science magazine Cosmos, noted that people seek certainty in uncertain times, and that this is fundamentally not what science can provide.

“Scientists don’t speak in certainty because they can’t predict the future,” MacCallum said.

“What [scientists] can do is an immense amount of research, test that research, re-apply that research and see if they get the same result.”

But MacCallum added that science reporting needs an overhaul and must be done in an exciting way. “Then we have a chance at getting people to understand the whole method and process,” she said.

Holly Nott, managing editor at Australian Associated Press and founder of AAP Fact Check, noted that society needs to understand what journalism actually is. Nott echoed the words of former ABC Editorial Director Alan Sunderland, known for his guidance that objectivity and impartiality in journalism are disciplines that must be worked at.

“If you want to fight for a cause, something you believe in — that’s activism, not journalism. But if you want to correct the record and provide factual information to the masses, then you can do that through objective journalism,” Nott said. “And what we do as fact checkers is just good old-fashioned journalism when you drill right down to the basics.”

First Draft conducted large-scale research late last year amid growing concern that dangerous narratives and conspiracy theories might increase vaccine skepticism.

The analysis of posts from Twitter, Instagram, Facebook Groups and Facebook Pages found two dominant themes across the languages studied. The first sowed mistrust by focusing on political and economic motives — for example, questioning the financial or power interests of governments. The second focused on the safety and efficacy of vaccines.

Understanding that these themes can be used for misinformation is important in preparing journalists, platforms and governments to be pre-emptive in their approaches.

But fact checking and science reporting are resource-heavy tasks, especially in terms of staff time. MacCallum noted the importance of putting numbers into perspective in discussions about rare blood clots and the AstraZeneca vaccine.

“You know, I can roll off these numbers, but they’re very hard-won … to find those numbers it takes time, it takes effort, it takes people hours and phone calls, it’s not easy to type into a search engine and get that answer out.

“How do we get our information out of the science and into the stories that are day to day?”

Our new daily grind for the attention economy

Associate Professor Adam Dunn, Director of Biomedical Informatics and Digital Health, University of Sydney, said journalism and science can adopt skills from each other:

“We might be able to do a better job of where we collect up information more quickly, and then make sense of it, and put it together on a page.”

And the collaboration depends on stakeholders and society knowing how to deal with change. As MacCallum explained, more data or new information creates new questions.

“As the virus changes, as the information changes, we will learn something different. And I think we should be clear with people about what the science is, what they’ve discovered, why it matters and the questions that need to be answered next.”

Dunn suggested that stakeholders develop a more agile approach, adapted from what academia calls systematic reviews.

“The job is to make a robust version of all of the available evidence and we go out and find everything we possibly can and stick it together and synthesize it in ways that answer a question.

“And the best versions of systematic review that exist now are ones that are kind of living, that are updated over time as things change and as the evidence changes that might change the conclusions of the systematic review as well. But the best versions of these things require lots of collaboration from lots of people, lots of extra work from machine learning methods and AI.

“I can imagine that a living document that describes a particular interesting question would be one that also gets a lot of page views.”
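As a rough illustration only, and not something described on the panel, a living review can be thought of as an evidence store whose conclusion is recomputed every time new studies are added, rather than fixed at publication. The sketch below uses invented names and a deliberately crude synthesis step (a sample-size-weighted share of studies finding an effect) purely to show the shape of the idea; a real living review would rest on proper meta-analysis and the collaboration and tooling Dunn describes.

```typescript
// Minimal sketch of a "living" evidence synthesis: the conclusion is
// recomputed whenever new studies are added, rather than fixed once.
// All names, fields and the synthesis step are illustrative assumptions.

interface Study {
  id: string;
  sampleSize: number;
  effectFound: boolean; // did this study find the effect in question?
  retrieved: Date;
}

class LivingReview {
  private studies: Study[] = [];

  // Add newly retrieved evidence; in practice this is where search,
  // screening and (possibly ML-assisted) extraction would happen.
  addEvidence(newStudies: Study[]): void {
    this.studies.push(...newStudies);
  }

  // Re-synthesize: here just a sample-size-weighted share of studies
  // finding the effect. A real review would use formal meta-analysis.
  synthesize(): { asOf: Date; support: number; nStudies: number } {
    const total = this.studies.reduce((sum, s) => sum + s.sampleSize, 0);
    const supporting = this.studies
      .filter((s) => s.effectFound)
      .reduce((sum, s) => sum + s.sampleSize, 0);
    return {
      asOf: new Date(),
      support: total === 0 ? 0 : supporting / total,
      nStudies: this.studies.length,
    };
  }
}

// Usage: the same review object is updated as the evidence changes,
// so its conclusions shift over time instead of going stale.
const review = new LivingReview();
review.addEvidence([
  { id: "trial-1", sampleSize: 1200, effectFound: true, retrieved: new Date() },
]);
console.log(review.synthesize());
```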

Dunn also called for more data and research on “the attention economy,” and for work to understand the difference between simply counting misinformation and estimating information diets (all the places people access or receive information each day) and their impacts on people’s behaviors.

“I’ve been kind of annoyed at this for a really long time that so many of the data-driven studies in my area in social media research just count up how much misinformation is out there and who posts it, and then they speculate on its impact and importance,” Dunn said.

He added, “We’re still missing this bit to work out where, if and when social media platforms or the media should be taking action on misinformation — to actively go out there and correct it. We need lots of examples of the trajectory of misinformation and we need to know if and how they led to changes in behavior — what the impact was.”

Josh Machin, head of policy for Facebook Australia, said he was looking for a sense of where more collaboration between the different stakeholders could be built. He acknowledged there was a need to measure the efficacy of different interventions to combat misinformation. Australia’s new Code of Practice on Disinformation and Misinformation is where further collaboration between platforms, government, industry and academia may move the needle against online misinformation. Specifically for researchers, Objective 6 of the code includes:

Signatories commit to support and encourage good faith independent efforts to research Disinformation and Misinformation both online and offline. Good faith research includes research that is conducted in accordance with the ethics policies of an accredited Australian University.

Lastly, panelists agreed that more focus must be given to collaborations with media literacy efforts. Dunn noted it is well within the realm of technological possibility for researchers to build a browser plug-in that highlights words or phrases in text that could signal problematic claims or inferences and help spark critical thinking.

“We can literally build these tools right now, but what we don’t have are the data and/or the platforms to test interventions in real-world scenarios to really measure impact on attitudes and behaviors in ways that are ethical, independent and within informed consent,” he said.
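To make the idea concrete, here is a minimal sketch of what such a plug-in’s content script might look like, assuming a hypothetical phrase list and simplistic matching. None of this reflects an actual First Draft or University of Sydney tool; a real version would need a research-backed lexicon, far more careful matching, and the kind of ethical, consented real-world testing Dunn calls for.

```typescript
// content-script.ts: sketch of a browser-extension content script that
// highlights phrases which may signal problematic claims.
// The phrase list is a placeholder assumption, not a vetted lexicon.

const SIGNAL_PHRASES = ["miracle cure", "they don't want you to know", "100% safe"];

function highlightSignals(root: Node): void {
  // Collect text nodes first so we can safely replace them afterwards.
  const walker = document.createTreeWalker(root, NodeFilter.SHOW_TEXT);
  const textNodes: Text[] = [];
  while (walker.nextNode()) {
    textNodes.push(walker.currentNode as Text);
  }

  for (const node of textNodes) {
    const text = node.textContent ?? "";
    const hit = SIGNAL_PHRASES.find((p) => text.toLowerCase().includes(p));
    if (!hit || !node.parentElement) continue;

    // Wrap the whole text node in a <mark>; a real extension would wrap
    // only the matched phrase and attach a fuller explanatory tooltip.
    const span = document.createElement("mark");
    span.title = `The phrase "${hit}" often appears in misleading claims; read critically.`;
    span.textContent = text;
    node.replaceWith(span);
  }
}

// Run once on page load; a real extension would also watch DOM mutations.
highlightSignals(document.body);
```

The point of the sketch is Dunn’s: building the highlighting mechanics is straightforward, while the hard, unsolved part is testing whether such nudges actually change attitudes and behaviors.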

Panelists:

Anne Kruger, First Draft APAC Director (MC and panelist)

Holly Nott, managing editor at Australian Associated Press who founded AAP Fact Check

Gail MacCallum, editor of science magazine Cosmos

Associate Professor Adam Dunn, Director of Biomedical Informatics and Digital Health, University of Sydney

The virtual panel was hosted with support from Facebook Australia on March 31.