The psychology of misinformation — the mental shortcuts, confusions, and illusions that encourage us to believe things that aren’t true — can tell us a lot about how to prevent its harmful effects. It’s what affects whether corrections work, what we should teach in media literacy courses, and why we’re vulnerable to misinformation in the first place. It’s also a fascinating insight into the human brain.
In the third part of this series on the psychology of misinformation, we cover the psychological concepts that are relevant to the prevention of misinformation. As you’ll have seen from the psychology of correcting misinformation, prevention is preferable to cure.
Here we explain the psychological concepts that can help us by building our mental (and therefore social) resilience. What you’ll find is that many of the resources we need to slow down misinformation are right there in our brains, waiting to be used.
This is the third in our series on the psychology of misinformation. Read the first, “The psychology of misinformation: Why we’re vulnerable”, and the second, “The psychology of misinformation: Why it’s so hard to correct”.
Skepticism
Skepticism is an awareness of the potential for manipulation and a desire to accurately understand the truth. It is different from cynicism, which is a generalized distrust.
Skepticism involves devoting more cognitive resources to the evaluation of information, and as a result can lower susceptibility to misinformation. It can be contrasted with ‘bullshit receptivity’ and supports Gordon Pennycook and David Rand’s thesis that susceptibility to misinformation derives not from motivated reasoning (persuading yourself something is true because you want it to be), but from a lack of analytic thinking.
What to read next: “Misinformation and its Correction: Cognitive Mechanisms and Recommendations for Mass Communication” by Briony Swire and Ullrich K.H. Ecker, published in Misinformation and Mass Audiences in 2018.
Emotional skepticism
Emotional skepticism is an awareness of potential manipulation through your emotions. It might involve taking a moment to calm down before sharing a shocking but false post.
Despite emotion being a strong driver of shares on social media, and therefore a powerful tool in disinformation campaigns, it is often overlooked in media literacy campaigns. More research is needed to understand which techniques can cultivate emotional skepticism, and how it can slow down the sharing of misinformation.
What to read next: “Reliance on emotion promotes belief in fake news” by Cameron Martel, Gordon Pennycook, and David G. Rand (preprint) in 2019.
Alertness
Alertness is a heightened awareness of the effects of misinformation.
In 2010, misinformation researcher Ullrich Ecker and colleagues found that warning people about the effects of misinformation, such as the continued influence effect, can make them more alert. When people are alert to these effects, misinformation loses some of its influence.
What to read next: “Explicit warnings reduce but do not eliminate the continued influence of misinformation” by Ullrich K.H. Ecker, Stephan Lewandowsky, and David T.W. Tang, published in Memory and Cognition 38, 1087–1100 in 2010.
Analytic thinking
Analytic thinking, also known as deliberation, is a cognitive process that involves thoughtful evaluation rather than quick, intuitive judgements.
Taking even a few more seconds to think can help you spot misinformation. Misinformation researchers found that “analytic thinking helps to accurately discern the truth in the context of news headlines.”
What to read next: “Fake news, fast and slow: Deliberation reduces belief in false (but not true) news headlines” by Bence Bago, David G. Rand, and Gordon Pennycook (preprint) in 2019.
Friction
Friction is anything that makes information harder to process or an action harder to perform, such as a technical obstacle like a confirmation button. It is the opposite of fluency.
Introducing friction can reduce belief in misinformation. Lisa Fazio, a researcher based at Vanderbilt University, has found that if you create friction in the act of sharing, such as by asking people to explain why they think a headline is true before they share it, they’re less likely to spread misinformation.
What to read next: “Pausing to consider why a headline is true or false can help reduce the sharing of false news” by Lisa Fazio, published in the Harvard Kennedy School Misinformation Review in 2020.
Inoculation
Inoculation, also known as ‘prebunking’, refers to techniques that build pre-emptive resistance to misinformation. Like a vaccine, it works by exposing people to examples of misinformation, or misinformation techniques, to help them recognize and reject them in the future.
Inoculation has been found to be effective in reducing belief in conspiracy theories and increasing belief in scientific consensus on climate change.
What to read next: “Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence” by John Cook, Stephan Lewandowsky, and Ullrich K.H. Ecker, published in PLOS ONE 12 (5) in 2017.
Nudges
Nudges are small prompts that subtly suggest behaviors. The concept emerged from behavioral science, and in particular from Richard Thaler and Cass Sunstein’s 2008 book “Nudge: Improving Decisions About Health, Wealth, and Happiness.”
When it comes to building resilience to misinformation, nudges generally try to prompt analytic thinking. A recent study found that nudging people to think about accuracy before they share significantly improves their ability to discern true headlines from false ones.
What to read next: “Fighting COVID-19 misinformation on social media: Experimental evidence for a scalable accuracy nudge intervention” by Gordon Pennycook, Jonathan McPhetres, Yunhao Zhang, Jackson G. Lu, and David G. Rand (preprint) in 2020.