As the UK enters the final week of its most tightly fought general election in years, experience tells us the amount of spin, propaganda and outright disinformation is likely to increase.
Dodgy claims about polling station changes or voter ID requirements; viral screenshots of fabricated tweets; fanciful campaign pledges far removed from fact; and dirty tricks to turn marginal seats are all common themes from previous elections around the world. The last five-and-a-half weeks have shown the UK to be no different.
A lot of this has to do with how social media works and the way some people take advantage of it. It is far, far easier to share something on Facebook or Twitter than it is to double check whether it is true.
Disinformation is like a virus and social media platforms ensure all their users are crammed together, living on top of each other, cheek-to-cheek and synapse-to-synapse. Just like a winter cold makes us sneeze and spread the cold virus further, disinformation is designed to trigger a strong emotional response so people share it — whether out of outrage, fear, humour, disgust, love or anything else from the whole range of human emotions.
What’s more, our brains are biased towards accepting something if it confirms our existing views. Something doesn’t have to be true to feel like it is, or like it should be. And those trying to push an agenda or twist events take advantage of this fact, layering lies upon lies or taking a kernel of truth and wrapping it up in misleading claims and false narratives.
So if we see something online — something that both seems like it should be true and touches an emotional hotspot — it triggers an almost involuntary reaction to share it and tell people about it.
We can have all the checks, balances, tools, lists, websites and advice in the world, but it is the right mindset that is crucial to avoid being tricked.
So recognise that emotional response. Question whether you might be getting played. Then run through some quick checks to make sure: source, history, evidence, emotion and pictures. Yes, that spells “SHEEP”.
And it’s easy to be part of the herd. It’s our natural state, after all. But it’s never been easier for those with power and money and a little know-how to take advantage of that herd mentality to manipulate the masses to their will.
So, this election, don’t follow the herd. Think SHEEP before you share.
Source
Who is the source? Check the about page on a website or Facebook Page, look at any account information and search for any names or usernames. Are there any clear connections to political parties, candidates or campaign groups? Does the account or website say that its output is parody or satire?
A lot of dodgy accounts, pages and sites get away with seeming legitimate. But scratching the surface and looking at what lies beneath can often reveal the truth.
History
What kind of stories or material does the source regularly promote? Is it always attacking one side of the debate or promoting another? Does it consistently focus on just one issue?
These can all be signs that the source is trying to push a certain agenda. It is very easy to create a warped and inaccurate view of the world by cherry-picking a few news stories which point in one direction.
Evidence
Is there any evidence for what a source is saying? This may seem obvious, but emotion and confirmation bias tend to short-circuit our brains and critical thinking.
So can we find evidence elsewhere which supports or knocks down a claim? Remember to conduct the same checks on any other sources as well. The most determined hoaxers or propagandists can use a network of sites or accounts to make a claim seem more widespread or well-supported than it really is.
Emotion
Does the source use overly emotive posts? Sensational or inflammatory language has always been crucial in grabbing people’s attention, but on social media those tactics are supercharged.
So-called “moral-emotional language” attached to political messages has been found to act like a rocket booster on platforms like Facebook and Twitter when it comes to how and why posts are shared.
Anyone who has been in a passionate argument knows that when our emotional buttons are pushed, the sensible, analytical parts of our brains are overwhelmed. People trying to trick us on social media know this.
So if we have a strong emotional reaction to something, even when it was shared by a friend or family member, we also need to think: Is this true? Am I about to be tricked?
Pictures
Are there more pictures than text? A picture paints a thousand words, of course, and visuals often bypass our critical thinking and transmit a message much more quickly than text can.
Again, people who are trying to push a message use this to their advantage. Memes often use an emotionally resonant image with a small amount of text to make sure the message hits its target. Often that target is just to make people laugh. But we’re increasingly seeing political campaigners use memes, pictures and video to push their agenda, and sometimes the messages they’re sharing aren’t true.
Lies are as old as humankind, but the way they are delivered and injected straight into our brains has taken on a new, dystopian turn. If we are going to resist such manipulation we need to be more mindful of what we share, and more conscious of the ways technology can be used to manipulate us.
This election, and every election after, let’s change our mindset for the information we consume. Think SHEEP before you share.
Stay up to date with First Draft’s work by becoming a subscriber and follow us on Facebook and Twitter.