The psychology of misinformation: Why we’re vulnerable

How does our psychology make us more vulnerable to misinformation? We explain the key concepts in the first of a three-part series.

The psychology of misinformation — the mental shortcuts, confusions, and illusions that encourage us to believe things that aren’t true — can tell us a lot about how to prevent its harmful effects. Our psychology is what affects whether corrections work, what we should teach in media literacy courses, and why we’re vulnerable to misinformation in the first place. It’s also a fascinating insight into the human brain.

Though psychological concepts originate in academia, many have found their way into everyday language. Cognitive dissonance, first described in 1957, is one; confirmation bias is another. And this is part of the problem. Just as we have armchair epidemiologists, we can easily become armchair cognitive scientists, and mischaracterization of these concepts can create new forms of misinformation.

If reporters, fact checkers, researchers, technologists, and influencers working with misinformation (which, let’s face it, is almost all of them) don’t understand these distinctions, it isn’t simply a case of misusing an obscure academic term; it risks becoming part of the problem.

We list the major psychological concepts that relate to misinformation, its correction, and prevention. They’re intended as a starting point rather than the last word — use the suggested further reading to dive deeper.

This is the first in our three-part series on the psychology of misinformation. The second part looks at why misinformation is so hard to correct, and the third at how to prevent it: “The psychology of misinformation: How to prevent it”.

Cognitive miserliness

The psychological feature that makes us most vulnerable to misinformation is that we are ‘cognitive misers’: we prefer simpler, easier ways of solving problems over those that require more thought and effort. We’ve evolved to use as little mental effort as possible.

This is part of what makes our brains so efficient: You don’t want to be thinking really hard about every single thing. But it also means we don’t put enough thought into things when we need to — for example, when thinking about whether something we see online is true.

What to read next: “How the Web Is Changing the Way We Trust” by Dario Taraborelli of the University of London, published in Current Issues in Computing and Philosophy in 2008.

Dual process theory

Dual process theory is the idea that we have two basic ways of thinking: System 1, an automatic process that requires little effort; and System 2, an analytical process that requires more effort. Because we are cognitive misers, we generally will use System 1 thinking (the easy one) when we think we can get away with it.

Automatic processing creates the risk of misinformation for two reasons. First, the easier something is to process, the more likely we are to think it’s true, so quick, easy judgments often feel right even when they aren’t. Second, its efficiency can miss details — sometimes crucial ones. For example, you might recall something you read on the internet, but forget that it was debunked.

What to read next: “A Perspective on the Theoretical Foundation of Dual Process Models” by Gordon Pennycook, published in Dual Process Theory 2.0 in 2017.

Heuristics

Heuristics are indicators we use to make quick judgments. We use heuristics because it’s easier than conducting complex analysis, especially on the internet where there’s a lot of information.

The problem with heuristics is that they often lead to incorrect conclusions. For example, you might rely on a ‘social endorsement heuristic’ — that someone you trust has endorsed (e.g., retweeted) a post on social media — to judge how trustworthy it is. But however much you trust that person, it’s not a completely reliable indicator and could lead you to believe something that isn’t true.

As our co-founder and US director Claire Wardle explains in our Essential Guide to Understanding Information Disorder, “On social media, the heuristics (the mental shortcuts we use to make sense of the world) are missing. Unlike in a newspaper where you understand what section of the paper you are looking at and see visual cues which show you’re in the opinion section or the cartoon section, this isn’t the case online.”

What to read next: “Credibility and trust of information in online environments: The use of cognitive heuristics” by Miriam J. Metzger and Andrew J. Flanagin, published in Journal of Pragmatics, Volume 59 (B) in 2013.

Cognitive dissonance

Cognitive dissonance is the negative experience that follows an encounter with information that contradicts your beliefs. This can lead people to reject credible information to alleviate the dissonance.

What to read next: “‘Fake News’ in Science Communication: Emotions and Strategies of Coping with Dissonance Online” by Monika Taddicken and Laura Wolff, published in Media and Communication, Volume 8 (1), 206–217 in 2020.

Confirmation bias

Confirmation bias is the tendency to believe information that confirms your existing beliefs, and to reject information that contradicts them. Disinformation actors can exploit this tendency to amplify existing beliefs.

Confirmation bias is just one of a long list of cognitive biases.

What to read next: “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises” by Raymond Nickerson, published in Review of General Psychology, 2(2), 175–220 in 1998.

Motivated reasoning

Motivated reasoning is when people use their reasoning skills to believe what they want to believe, rather than to determine the truth. The crucial point here is that people’s rational faculties, rather than lazy or irrational thinking, can cause misinformed belief.

Motivated reasoning is a key point of current debate in misinformation psychology. In a 2019 piece for The New York Times, Gordon Pennycook and David Rand, two cognitive scientists based at the University of Regina and MIT, respectively, argued strongly against it. Their claim is that people simply aren’t being analytical enough when they encounter information. As they put it:

“One group claims that our ability to reason is hijacked by our partisan convictions: that is, we’re prone to rationalization. The other group — to which the two of us belong — claims that the problem is that we often fail to exercise our critical faculties: that is, we’re mentally lazy.”

Rand and Pennycook are continuing to build a strong body of evidence that lazy thinking, not motivated reasoning, is the key factor in our psychological vulnerability to misinformation.

What to read next: “Why do people fall for fake news?” by Gordon Pennycook and David Rand, published in The New York Times in 2019.

Pluralistic ignorance

Pluralistic ignorance is a lack of understanding about what others in society think and believe. This can make people incorrectly think others are in a majority when it comes to a political view, when it is in fact a view held by very few people. This can be made worse by rebuttals of misinformation (e.g., conspiracy theories), as they can make those views seem more popular than they really are.

A variant of this is the false consensus effect: when people overestimate how many other people share their views.

What to read next: “The Loud Fringe: Pluralistic Ignorance and Democracy” by Stephan Lewandowsky, published in Shaping Tomorrow’s World in 2011.

Third-person effect

The third-person effect describes the way people tend to assume misinformation affects other people more than themselves.

Nicoleta Corbu, professor of communications at the National University of Political Studies and Public Administration in Romania, recently found that there is a significant third-person effect in people’s perceived ability to spot misinformation: People rate themselves as better at identifying misinformation than others. This means people can underestimate their own vulnerability and may not take appropriate action.

What to read next: “Fake News and the Third-Person Effect: They are More Influenced than Me and You” by Oana Ștefanita, Nicoleta Corbu, and Raluca Buturoiu, published in the Journal of Media Research, Volume 11, 3 (32), 5–23 in 2018.

Fluency

Fluency refers to how easily people process information. People are more likely to believe something to be true if they can process it fluently — it feels right, and so seems true.

This is why repetition is so powerful: if you’ve heard it before, you process it more easily, and therefore are more likely to believe it. Repeat it multiple times, and you increase the effect. So even if you’ve heard something as a debunk, the sheer repetition of the original claim can make it more familiar, fluent, and believable.

It also means that easy-to-understand information is more believable, because it’s processed more fluently. As Stephan Lewandowsky and his colleagues explain:

“For example, the same statement is more likely to be judged as true when it is printed in high- rather than low-color contrast … presented in a rhyming rather than non-rhyming form … or delivered in a familiar rather than unfamiliar accent … Moreover, misleading questions are less likely to be recognized as such when printed in an easy-to-read font.”

What to read next: “The Epistemic Status of Processing Fluency as Source for Judgments of Truth” by Rolf Reber and Christian Unkelbach, published in Review of Philosophy and Psychology, Volume 1 (4), 563–581 in 2010.

Bullshit receptivity

Bullshit receptivity is how receptive you are to statements made with little interest in the truth, such as a meaningless cliché. Bullshit is different from a lie, which intentionally contradicts the truth.

Pennycook and Rand use the concept of bullshit receptivity to examine susceptibility to false news headlines. They found that the more likely we are to accept a pseudo-profound sentence (i.e., bullshit) such as, “Hidden meaning transforms unparalleled abstract beauty,” the more susceptible we are to false news headlines.

This provides evidence for Pennycook and Rand’s broader theory that susceptibility to false news comes from insufficient analytical thinking, rather than motivated reasoning. In other words, we’re too stuck in automatic System 1 thinking, and not enough in analytic System 2 thinking.

What to read next: “Who falls for fake news? The roles of bullshit receptivity, overclaiming, familiarity, and analytic thinking” by Gordon Pennycook and David Rand, published in Journal of Personality in 2019.
