A psychological vaccine against misinformation
On how to tackle misinformation and disinformation in the climate crisis.
Welcome back to Climate Psyched, the newsletter where we explore all things psychological, behavioral and emotional related to the climate and ecosystem crises.
Some time ago I had a discussion with my brother, who’s a journalist, about AI-generated content on the internet and how to identify it. He posed the question of whether it’s easier to identify AI-generated content when it makes up 30% of all internet content, or when it makes up 96%.
I’ve been thinking a lot about that, and about the question of how best to identify and tackle misinformation and disinformation. Is it easier to navigate a world where 30% of everything we see and read online is made up of deep fakes, bot comments, and texts and pictures from generative AI - or one where nearly everything is? How can we discern what to believe when there’s always a chance that someone is using falsehoods, technology and manipulative techniques to amplify their message?
I’ve recently read the excellent book Foolproof by social psychologist Sander van der Linden, which has heavily inspired this month’s post, where we’ll look closer at misinformation and how to tackle it.
Situation
That climate obstructors and organized climate sceptics use disinformation as a tactic to sow doubt about the climate crisis and climate science is well known. But the problem is bigger than that.
A recent study that examined 32 million tweets from parliamentarians in 26 countries, spanning six years and several election periods, found that radical-right populism is the strongest determinant of the tendency to spread misinformation. Radical-right politicians are more prone to spreading misinformation than other politicians. The study also found that populism in general, left-wing populism and right-wing politics on their own are, in fact, not linked to the spread of misinformation. The results indicate that the increase in misinformation is connected to the current wave of far-right extremism, and that this political wave uses misinformation tactics to undermine liberal democratic institutions. Other studies show that a small number of accounts - so-called superspreaders - are responsible for spreading the majority of misinformation on social media.
So if the spread of misinformation isn’t evenly distributed in the population, does that also mean that only a small group is susceptible to it, and that we shouldn’t worry too much about it? Unfortunately, no one is immune to the influence of misinformation.
Explanation
In his book Foolproof, van der Linden comprehensively explains why the human mind is susceptible to misinformation, what tactics misinformation spreaders use, and how to build up immunity against it.
(Note: Misinformation and disinformation differ mainly in intent: while misinformation is false or incorrect information regardless of intent, disinformation is false information coupled with an intention to deceive or harm others. Disinformation that’s used in the service of a political agenda is called propaganda. Debunking false information is one thing; proving the intent behind spreading it is harder.)
Rather worryingly, experiments show that people are quite bad at identifying fake information and fake ads, and that a lot of misinformation spread is unintentional, but nonetheless harmful. Add to outright lies the spread of half-truths, relativization of facts, misleading headlines and clickbait, and we have a bit of a shit storm in the midst of the other ongoing shit storm of massive parallel global crises. How come people are so susceptible to misinformation?
Can we really trust our senses?
The way we perceive the world is affected by prior knowledge, memories and experiences. That is wyh yuo can raed tihs - even though it’s misspelled.
Top-down cognition helps the brain fill in missing bits and gaps based on your experiences and expectations of what you should be seeing. It means that sometimes we see things that aren’t actually there, because we expect to see them - and because our brains are predictive. There are numerous optical illusions that exemplify this.
This is part of the explanation for why it matters not just what we’re exposed to, but also how many times a message, post or piece of news is repeated. The illusory truth effect means that people are prone to perceive frequently repeated statements as more likely to be true than less repeated ones. Repetition makes the brain acquainted with that which is being repeated, and when the brain becomes acquainted with something it tends to believe it more. Unfortunately, prior knowledge about the illusory truth effect is no cure for it! Neither is education. However, people with lower digital literacy, older people and people with extreme and right-wing political orientations generally tend to be more susceptible to misinformation.
The perceived truthfulness of repeated messages is a challenge, since misinformation often offers enraging messages and simple answers to complex issues. Misinformation and disinformation are clickbait- and rage-bait-friendly, while the truth and science are often more nuanced and complex. This means that misinformation has an advantage when it comes to being spread quickly and repeatedly, increasing the risk that it sticks in our minds and affects our perception and judgement. In fact, misinformation has the potential to alter our memory, something that’s called the misinformation effect.
Debunking misinformation runs the risk of making it worse
Once exposed to misinformation, people seem to continue to be influenced by it, even after they’ve been exposed to corrections or retractions. Corrections of misinformation appear to get less reach, as well as less attention from us. This partly has to do with misinformation often playing on our emotions with the purpose of getting us enraged or scared, whereas corrections tend to be more factual and less emotionally triggering. Events that trigger emotions tend to stick better in our memory. If the misinformation also fits with our established worldview, then we’re more likely to believe it, due to confirmation bias: the tendency to accept information that confirms our previous beliefs and reject that which doesn’t. This is why it’s so important to stop the spread of misinformation as early as possible.
Due to all this, it unfortunately seems that debunking on its own is an insufficient tool to counter the influence of misinformation. Some debunking and fact-checking tools actually strengthen people’s memory of the misinformation by repeating key aspects of it and triggering the illusory truth effect. For example: “Climate change is not a hoax” can unintentionally strengthen the association between the words climate change and hoax, making the initial misinformation stick even more. To get around this risk, van der Linden suggests always emphasizing consensus when correcting or debunking. Instead of putting the words climate change and hoax in the same sentence, push the message “97 percent of climate scientists agree that climate change is real”, which highlights scientific consensus. Put another way: it’s not enough to rip out the weeds in your garden, you need to plant something else (i.e. credible alternative explanations) there instead, or the weeds will come back. Don’t repeat the misinformation, but do offer the truth. But is it really enough to just repeat the facts?
Just giving people more facts doesn’t build resistance against misinformation
A common tactic amongst climate sceptics and the climate obstruction movement is false balance, which unfortunately has been quite common in the media over the years. In attempts to be perceived as objective and neutral, media outlets have oftentimes let opposing perspectives be heard in their reporting on climate. Whilst bringing in different voices is generally helpful for gaining a nuanced perspective on an issue, there’s a real problem when the world’s collective climate research is balanced against climate sceptics expressing their own (and sometimes fossil-industry-funded) opinions – and the two are framed as equally sound and valid perspectives. It sows doubt, and doubt is an excellent tool for competing with facts and stirring controversy over well-established topics.
Portraying issues like the climate crisis as matters of opinion with two equally valid sides is potentially harmful. But merely providing people with more facts about the climate crisis isn’t enough to resist the harmful effects of false balance.
In experiments that van der Linden and colleagues have carried out, they’ve seen that when people are exposed to facts and misinformation side by side, the presence of misinformation cancels out the positive effect of the fact that there is scientific consensus that the climate crisis is happening. Misinformation, by its mere presence, wipes out the effect of facts.
Inoculation against misinformation can boost psychological resistance
Since misinformation has the advantage of spreading quickly and sticking once someone’s been exposed to it, those of us who wish to counter fake news, lies and half-truths need to find ways of getting ahead. But how do we do that if offering more facts is insufficient? Van der Linden suggests that another possibility is to psychologically vaccinate people by forewarning them about the risks of being exposed to misinformation, paired with a weakened dose of the misinformation or common misinformation techniques, followed by facts that counter the misinformation. This is called psychological inoculation. Just like a biological vaccine, psychological inoculation appears to trigger ‘psychological antibodies’ and generate resistance against misinformation.
The theory of psychological inoculation has been tested in numerous studies over the past years and has been shown to work quite well. Instead of debunking, it’s a form of prebunking. Prebunking also seems to give people the tools and confidence to both counter-argue and understand why certain claims are misleading, making them more prepared to resist the influence of misinformation.
Action
Sander van der Linden offers several concrete actions to counter the spread of misinformation by ‘psychologically vaccinating’ people to make them less susceptible to it.
Repetition of truth matters, as it makes the brain process it more fluently. The more familiar something becomes, the easier it is for the brain to process. More facts are, however, not enough to protect us from the influence of misinformation.
Psychological inoculation can be used in daily life, with friends, family and even on your social media. It can be seen as a rhetorical strategy (which unfortunately means that it could also potentially be used by people spreading misinformation for the wrong reasons).
There are two forms of inoculation: Fact-based and Technique-based
Fact-based inoculation focuses on a specific issue, e.g. the claim that climate change is a hoax. It’s effective, but quite specific and doesn’t necessarily generalize to other issues. Generally speaking, it works through the following steps:
You become aware of misinformation about an issue, perhaps a claim that’s gone viral
Being aware lets you forewarn those around you, including on your social media, that they might be exposed to harmful misinformation that makes false claims (forewarning + inoculation with a weakened dose)
You also explain why the claim is in fact false - you lay out the facts (exposure to correct facts)
This targets one specific issue, but can potentially create an ‘umbrella of protection’ against related misinformation; e.g. the inoculation might motivate people to seek out more information about vaccinations or make them more likely to question other misinformation about the issue.
Technique-based inoculation focuses on identifying and exposing misinformation techniques, rather than specific facts. It has the benefit of being broader, but the downside of potentially being less effective. Generally it works through the following steps:
Forewarn people that there are actors who, in bad faith, try to manipulate us by using clever techniques to spread misinformation.
Explain one or several of these techniques, e.g. the technique of discrediting criticism of the misinformation by attacking the source of the criticism, thereby deflecting attention from the actual criticism.
Expose those you’re talking with to examples of these misinformation techniques, e.g. fake headlines (this is the inoculation)
Instead of enabling people to resist and argue against a specific piece of misinformation by providing them with the refutation in advance, technique-based inoculation works by allowing people to identify the broader misinformation technique, alerting us to the possibility that we are being misled and, as a consequence, increasing the likelihood that we reject false information.
Inoculation can be done in various creative ways, by using humor, videos or even games (as this study of the game Bad News shows)
Psychological inoculation seems to work even when someone’s already been exposed to the misinformation, although with less effect, so keep inoculating!
Just as vaccines require booster shots, so does this ‘psychological vaccine’. People tend to stay more immune to misinformation over time if they’re repeatedly and regularly exposed to inoculation, for example by a weekly fake news quiz.
There’s much more to say on the topic of misinformation, disinformation and conspiracy thinking. I plan on expanding this topic in the upcoming mid-month post for upgraded subscribers.
Upgrade to paid subscription
After some hesitation, I decided a few months ago to enable the option of becoming a paid subscriber to Climate Psyched, for those who want to and are in a position to financially support this work. Writing Climate Psyched is a labor of love and one that truly feels important, but it is nonetheless labor that takes a substantial amount of time and effort. If you do become a paid subscriber you will not only make me feel incredibly honored, you will also receive an extra post per month.
These monthly posts will still be available for free to everyone - it feels important not to exclude those who don’t have the financial space, and to acknowledge that support for someone’s work can come in many different forms.