Is there a psychological vaccine against untruths?

The Continued Influence Effect (CIE) suggests that people continue to rely on misinformation even after it has been corrected because once misinformation fills a gap in our mental model, it's hard to erase. If we accept that a widely publicised claim is false, we are left without a satisfying alternative explanation, and the original, even if incorrect, remains cognitively comfortable (Rich & Zaragoza, 2016).

The role of source credibility in belief updating

As part of my PhD research at University College London (UCL), published in Cognition (Sanna & Lagnado, 2025), I investigated how the perceived reliability of a source affects belief revision. What if people don't reject misinformation because they are incapable of updating their beliefs, but because they don't trust the correction?
These findings challenge the assumption that people cling to misinformation because they're biased or irrational. Instead, they seem to apply a consistent logic to belief updating, weighing new information based on who delivers it and on previous evidence collected about the claim.
These aren't surprising findings.

It's hard to remove misinformation if you don't have a compelling new story - and that's a problem for us arguing against BPS ideas without knowing what actually causes ME/CFS.

And it matters who delivers the attempted correction of misinformation. Again, that counts against us, as the people with authority and plenty of money to get their story out have tended to favour the BPS ideas. Having ME/CFS is often equated with irrationality, with perceiving the world incorrectly, so even an authority figure like a doctor loses credibility if they become a person with ME/CFS. And there's the issue of men often being seen as more credible - so women with ME/CFS, and men with this 'women's disease', face the extra hurdle of a lack of credibility.


If we want to tackle misinformation, we need to move beyond simply providing factual corrections. Instead, we should focus on making corrections persuasive by considering:

  • Source credibility – Are corrections coming from sources people already trust?
  • Worldview – Is the message aligned with individuals' pre-existing beliefs?
  • Alternative explanations – If misinformation is filling a knowledge gap, what better explanation can we offer?

We should also support broader efforts to foster critical thinking and what psychologists call epistemic humility: the ability to recognise our own limits and remain open to the possibility that we might be wrong (Karabegovic & Mercier, 2024).
These skills are not just about identifying errors. They build a mindset of curiosity, reflection, and self-correction; key traits in an age of information overload. Cultivating these habits early could help create a generation better equipped to navigate a complex and often misleading digital world (Lewandowsky et al., 2017).

A problem is that people with ME/CFS in positions of responsibility have often felt they need to hide the fact that they have ME/CFS in order to avoid prejudice. So the opportunity for people with that credibility to influence what others believe about ME/CFS is lost.
 
I believe that schools should provide mandatory courses in rational thinking, and how misinformation is used to manipulate people. That would be far more useful than memorizing the names and dates of political leaders. Of course the manipulation industry would oppose that. Hmmm, so would many politicians, and certainly the BPS crowd.
 
Yeah, that would involve thinking critically about scenarios, and some people wouldn’t like the conclusion.
 
I went to a liberal arts college where these skills were baked into the curriculum, and multiple courses required careful examination of evidence, learning how to see holes in arguments, etc.

Ironically, it didn't seem to make anyone more resistant to political propaganda, more open-minded, or better able to examine their own biases. The bigots remained bigots. They just got more skilled at argumentation. And I have to emphasize that the courses did what they were designed to do--we all walked away with very good skills for picking apart the biases in an argument presented to us. We were even tasked with trying to poke holes in the arguments we personally held dear. But I think it's a different beast entirely to actually apply those skills to your own biases consistently in your life.

I've seen it often enough, and yet somehow it keeps catching me utterly off guard--people who demonstrate incredible critical thinking in one domain are suddenly unable to examine even their simplest biases as soon as the context changes. They will go above and beyond to argue against you, utterly believing that they are the peak of rationality--sometimes even using the exact same logic that they soundly rejected in another context.

I wish I understood how it could be counteracted effectively. It's been an interesting exercise to look at the people who completely broke out of certain modes of propaganda and analyze how they did it. In all those cases, it seems like there's an important emotional component that I have yet to understand.
 
I really agree with what @jnmaciuch has said, and put very eloquently. The bottom line is that there is no immunity to this. Every one of us is susceptible, and every one of us does and will fall into these traps. Sure, some go off the deep end, but recognising that we’re all fallible and that you cannot beat emotion with rationality is probably the best we can do.

Interesting article, and of course context matters. As @Hutan says, we’ll be in a better place when we have better answers. But only, I think, with those who are not already convinced of certain ideas about ME/CFS. We will never change some people’s minds, and we probably shouldn’t waste our efforts trying.
 