OPINION: What do the following have in common? Pauline Hanson suggesting vaccines are dangerous; Pete Evans slamming sunscreen; Donald Trump telling us climate change doesn't exist; and your friend swearing that echinacea will cure your cold.
They are all incorrect claims – in some cases, outright lies. But the other thing they have in common is they are all persistent beliefs that are ingrained in a scarily high proportion of the population. What's worse, people will act on these beliefs. In 2013, when ABC's Catalyst program suggested that statins used in cholesterol management weren't effective or safe, thousands failed to get their prescriptions refilled and many continued to avoid the drugs even after the episodes were retracted.
We live in the era of fake news and we're seeing a growing number of ill-founded claims about health, science and conspiracy. Lightning-quick news cycles and algorithm-determined social media feeds exacerbate the issue.
Unfortunately, fighting back against peddlers of misinformation cannot fully undo the damage done – false claims are even more dangerous than we think.
There is no simple cure once we are exposed, because the effects cannot be fully overcome by just promoting fact. Hanging on to mistaken beliefs or fictions occurs not just when people don't want to change their minds – our brains are actually bad at updating information even when we're trying to. Using the terminology of some researchers, misinformation is really "sticky".
So, what accounts for this "stickiness"? And what kind of solvents can we use?
For starters, the media has a bias in favour of studies with exciting findings over those that fail to find an effect or simply replicate previous work. This means we end up with an unbalanced view on issues where there might also be some contrary, less exciting evidence. Because that information is not visible, we are not as reserved in our appraisal of glamorous new findings as we should be. Instead, these exciting stories – for example, the apparent success of a new fad diet, or climate findings at odds with predicted trends – ignite and spread rapidly. This bias acts as an incentive for skewed reporting in the media and among researchers.
The scientific community is working to rectify this. For example, pre-trial registration, where researchers publish what they are going to do and what results they might expect prior to studies being run, is now encouraged. This stops researchers from manipulating analyses and fudging theories to fit with their data after the fact. Importantly, it also makes it harder to hide "uninteresting" or conflicting results by pretending the studies never happened.
There is also a lack of public interest in corrections. Inaccurate front-page stories are corrected with small notices buried in unread pages of newspapers. Misinformation on TV and radio – particularly on less scrupulous programs or channels – might not be corrected at all. Any corrections that are issued generally fail to gain the traction of the more alluring news items they amend. The media must place more importance on getting it right the first time, and on giving more space to correcting mistaken reporting – no matter how boring.
This is undeniably a difficult task, especially with the dominance of online news that demands a continuous rollover of stories, resulting in a lack of time and resources for fact-checking and corrections.
In the form and presentation of corrections, we must use what we know about human cognition and its flaws. Research has shown the importance of making clear exactly which piece of dud information is being addressed – but also that repeating a myth while correcting it may reinforce it. That means striking a delicate balance: make the target clear, but don't give falsehoods much more time in the sun. Repetition of the good information, by contrast, is helpful. A myth that has been repeated often and for a long time will be called to mind very easily; to have any chance of winning out, facts need the same repetitive treatment.
Similarly, credibility is key. A credible source makes a big impact on belief. The first point here is that reputable sources must try extra hard to minimise mistakes (the vaccine myth originated with a bogus study in a top medical journal). Second, when it comes to correcting, the source has to be perceived as trustworthy and expert. Unfortunately, we have a real asymmetry in this respect: companies will pay big bucks for celebrity and expert endorsements, but these advocates are nowhere to be found when corrections come later on.
A major finding from the research is that misinformation lingers when the correction is causally unsatisfying. The brain seems to prefer an incorrect but complete picture to a hole-riddled truth. The classic scenario used in studies provides readers with causes of a warehouse fire, and then retracts them. So let's say the paper prints a story attributing the fire to arson. The research shows that, in order to correct the allegation, ruling out arson is not enough – some other plausible cause for the fire must be offered as a substitute. Again, thorough, clear reporting is crucial.
We can never fully eliminate the impact of misinformation. People and institutions in positions of influence should try harder to put out only truth, because we are much better at learning than unlearning. But there will always be those who knowingly dress fiction as fact. Science tells us how to loosen their grip on us.
Rachel Visontay works in psychology research at UNSW.
This opinion piece was first published in the Sydney Morning Herald.