Debunking Myths Doesn't Convince People
Christopher Graveson 24 February, 2015 at 11:02
Seventeen years ago, a physician in the UK published a study of twelve children who had been given the MMR (measles, mumps, rubella) vaccine. It implied a scary correlation between the vaccine and autism. But further investigation and subsequent clinical studies discredited the finding: the medical journal retracted the article, and the doctor was found to have unethical financial interests in the findings. Finally, he was stripped of his license to practice medicine.
Yet today, surveys show that as many as a third of U.S. parents still believe the discredited allegations. One in five millennials believe early childhood vaccines cause autism and 26% of parents trust a celebrity as a credible source on vaccine safety. And measles, declared officially eradicated in the U.S. in the year 2000, is back.
This has all led to a contentious debate among parents, politicians, and the medical community. But as someone who does communications for a living, I have a professional interest in this as well. Why is it so difficult to dispel rumors and debunk myths? I see four major reasons.
1. Arguing the facts doesn’t help—in fact, it makes the situation worse.
In 1979, Charles Lord performed a seminal piece of research revealing that when you show people factual, scientific evidence that they are wrong, they react badly. They accept only the evidence that fits their pre-existing views, a tendency now widely known as “confirmation bias.” Hundreds of studies since have found the same result: when you argue using facts and evidence, people generally reject or discount your evidence. Instead of changing their minds, most will dig in their heels and cling even more firmly to their originally held views. Brendan Nyhan of Dartmouth and Jason Reifler of the University of Exeter have documented an even more alarming tendency, which they call “the backfire effect”: in their study, correcting people actually increased their misperceptions.
2. Repeating the myth inadvertently popularizes it.
When you repeat a myth while trying to debunk it, you do two things. First, you introduce the myth to people who may never have heard it; as many as 40% of them then believe it. Second, by repeating the myth you unintentionally convert “false claims into recommendations,” as one study found. Its authors reported that after three days, older adults misremembered 28% of false statements as true, and once the false statements had been repeated three times, that figure jumped to 40%. So repeating wrong information, even to debunk it, backfires.
3. Affirmation works – but we rarely use it.
When someone has their facts wrong, our impulse is generally not to tell them we think they’re great. We’re more likely to go on the attack.
But Nyhan and Reifler, in their research on misperceptions and corrections, discovered that when people who are mistaken undergo a self-affirmation exercise, it increases the odds that they’ll accept the corrected information. That means you may have more success changing minds after making your audience feel good about themselves.
Part of the polarization in the vaccine debate stems from the vilification of parents who have not vaccinated their children. Instead of going on the attack, pro-vaccine advocates should reassure these parents that we know they love their children. Treating them as idiots or fringe lunatics only poisons the discussion of the science.
4. We consistently underestimate the power of narrative.
In the 1940s, Austrian psychologist Fritz Heider created what has become a legendary piece of research revealing people's need to craft narratives. Heider made a short, simple animation of two triangles, a rectangle, and a circle. All but one of his subjects read a complete drama into the animation, complete with love affairs and bullying. Humans, it seems, must have a story line, and in a void, will create one.
In the vaccine wars, the anti-vaccination movement has referenced many narratives. Each narrative is different, but each one features a protagonist and a villain, along with attempts to bury or distort the “facts.”
Model, actress, and celebrity activist Jenny McCarthy has been one of the most outspoken critics of vaccinations in the U.S. She has drawn upon a personal narrative of her son, whom she claims was rendered autistic by a vaccination but later cured through organic and holistic approaches. McCarthy leverages a very powerful technique known to social scientists as the “identifiable victim effect.”
It turns out human empathy does not scale well. We can care very deeply about a single stranger, but that empathy wanes rapidly as the group of victims grows; once they become a large number, we cease caring. By repeating her emotional story, McCarthy draws on this psycho-sociological phenomenon. And when put head-to-head with scientists talking about immunology, the story of her child will sway audiences every time. Challenged on her lack of scientific proof, McCarthy has retorted, “Evan is my science.”
Think of all the times you’ve tried to clarify something by saying, “It’s X, not Y.” Or of all the times you’ve resorted to the facts to win an argument. Or of how tempting it is to tease or belittle someone you know is in the wrong.
We all fall into these traps. But all you’re doing is reinforcing the information you don’t want people to hear, increasing their resistance to your message, and making it harder for them to hear you in the first place.
Instead, we need to dispel rumors and debunk myths without repeating the misinformation; rely on a compelling, proactive narrative; and realize that more information is not necessarily better.
This essay first appeared in Harvard Business Review (https://hbr.org/2015/02/why-