Eryn Newman, Amy Dawel, Madeline Claire Jalbert and Norbert Schwarz* say media mythbusting can actually make false beliefs stronger.
As the COVID-19 pandemic has swept the world, politicians, medical experts and epidemiologists have taught us about flattening curves, contact tracing and growth factors.
At the same time, we are facing an “infodemic” — an overload of information, in which fact is hard to separate from fiction.
Misinformation about coronavirus can have serious consequences.
Widespread myths about “immune boosters”, supposed “cures”, and conspiracy theories linked to 5G have already caused immediate harm.
In the long term, they may make people more complacent.
Social media companies are working to reduce the spread of myths, and mainstream media and other information channels have in many cases ramped up efforts to address misinformation.
But these efforts may backfire by unintentionally increasing public exposure to false claims.
The ‘myth vs fact’ formula
News media and health and wellbeing websites have published countless articles on the “myths vs facts” about coronavirus.
Typically, articles share a myth in bold font and then address it with a detailed explanation of why it is false.
This communication strategy has been used in attempts to combat other health myths such as the ongoing anti-vaccine movement.
One reason for the prevalence of these articles is that readers actively seek them out.
The Google search term “myths about coronavirus”, for example, saw a prominent global spike in March.
Debunking false information, or contrasting myths with facts, intuitively feels like it should effectively correct myths.
But research shows that such strategies may actually backfire, by making misinformation seem more familiar and spreading it to new audiences.
Familiarity breeds belief
Cognitive science research shows people are biased to believe a claim if they have seen it before.
Even seeing it once or twice may be enough to make the claim more credible.
This bias happens even when people originally think a claim is false, when the claim is not aligned with their own beliefs, and when it seems relatively implausible.
What’s more, research shows thinking deeply or being smart does not make you immune to this cognitive bias.
The bias arises because humans are very sensitive to familiarity, but we are not very good at tracking where that familiarity comes from.
One series of studies illustrates the point.
People were shown a series of health and wellbeing claims one might typically encounter on social media.
The claims were explicitly tagged as true or false.
When participants were asked which claims were true and which were false immediately after seeing them, they usually got it right.
But when they were tested a few days later, they relied more on feelings of familiarity and tended to accept previously seen false claims as true.
Older adults were especially susceptible to this effect of repetition.
The more often they were initially told a claim was false, the more they believed it to be true a few days later.
For example, they may have learned that the claim “shark cartilage is good for your arthritis” is false.
But by the time they saw it again a few days later, they had forgotten the details.
All that was left was the feeling they had heard something about shark cartilage and arthritis before, so there might be something to it.
The warnings turned false claims into “facts”.
The lesson here is that bringing myths or misinformation into focus can make them more familiar and seem more valid.
And worse: “myth vs fact” may end up spreading myths by showing them to new audiences.
What I tell you three times is true
Repeating a myth may also lead people to overestimate how widely it is accepted in the broader community.
The more often we hear a myth, the more we will think it is widely believed.
And again, we are bad at remembering where we heard it and under what circumstances.
For instance, hearing one person say the same thing three times is almost as effective in suggesting wide acceptance as hearing three different people each say it once.
The concern here is that repeated attempts at correcting a myth in media outlets might mistakenly lead people to believe it is widely accepted in the community.
Memorable myths
Myths can be sticky because they are often concrete, anecdotal and easy to imagine.
This is a cognitive recipe for belief.
The details required to unwind a myth, by contrast, are often complicated and hard to remember.
Moreover, people may not scroll all the way through the explanation of why a myth is incorrect.
The outcome of such articles may be a sticky myth and a slippery truth.
Making the truth stick
If debunking myths makes them more believable, how do we promote the truth?
When information is vivid and easy to understand, we are more likely to recall it.
For instance, we know placing a photograph next to a claim increases the chances people will remember (and believe) the claim.
Making the truth concrete and accessible may help accurate claims dominate the public discourse (and our memories).
Other cognitive tools include using concrete language, repetition, and opportunities to connect information to personal experience, which all work to facilitate memory.
Pairing those tools with a focus on truth can help to promote facts at a critical time in human history.
* Eryn Newman is a Lecturer in the Research School of Psychology, The Australian National University. Amy Dawel is a clinical psychologist and Lecturer at The Australian National University. Madeline Claire Jalbert is a PhD Candidate in Social Psychology at the University of Southern California. Norbert Schwarz is Provost Professor of Psychology and Marketing and Co-Director of the Dornsife Mind & Society Center at the University of Southern California.
This article first appeared at theconversation.com.