It Doesn't Matter if You're Wrong

If facts don't help, what does?


A woman named Dorothy Martin started hearing voices. At first it was dead relatives. Then, space aliens warned her about the end of the world. There was going to be a flood. She needed to round up everyone she cared about. A flying saucer was coming to save them. Martin did everything she was told. She attracted a large following. Many of them gave up all of their belongings and even sold their homes. For several nights, they waited.

The saucer never came.

Instead of giving up, Martin and her followers preached even louder. They did more interviews. They sent more statements to the press. They tried ever harder to recruit new members. They came up with excuses for why the aliens never came. Maybe they hadn't followed the aliens' instructions closely enough.

Maybe they'd done such a good job that their devotion had convinced God to spare the world and call off the catastrophic flood.

They couldn't possibly be wrong.

A social psychologist named Leon Festinger infiltrated the group, secretly recording their interactions. Afterward, he published When Prophecy Fails. Dorothy Martin was just one of hundreds of failed prophets going back centuries. Every single time, the failed prophecy only hardened the beliefs of core members and drove them deeper into their fantasies. Sometimes, the groups even gained followers by sending out more recruiters after the prophecy failed.

Festinger was fascinated by the idea that being wrong often reinforced someone's beliefs. When he moved to Stanford, his team conducted more studies to find out how people reacted when reality broke their expectations. In one case, he found that the less money you paid someone to complete boring, repetitive tasks, the more likely they were to describe the work as interesting, or to talk about how much fun they'd had. With no reward to justify the tedium, they convinced themselves they'd enjoyed it.

Festinger published another book, A Theory of Cognitive Dissonance. Here, he explained that most of us have a tendency to look for any reason to alleviate the discomfort of being wrong. We especially look for ways to minimize the gap between our values and our actions.

Since then, dozens of major studies have documented the lengths to which people will go to avoid admitting they were wrong.

People routinely seek out information that validates what they want to believe. They reject or ignore facts that contradict their beliefs. They question the credibility of anyone who challenges them. They cover up one lie or mistake with a bigger one. And they seek validation and approval for it all.

Showing someone conflicting information often backfires. In 1975, C.D. Batson conducted a study of 50 high school girls, measuring their faith and then showing them an article debunking the divinity of Jesus. The girls who accepted the article as accurate reported that their faith had grown stronger, not weaker.

It's the opposite of what you'd expect.

A major study in the Journal of Personality and Social Psychology found that when someone has a strong opinion, they accept any evidence in favor of their original belief at face value. They don't question it. Meanwhile, they subject conflicting evidence to a much higher standard, and that double standard lets them dismiss the counter-evidence.

A 2020 article by researchers in Australia found something similar. After reading about Donald Trump's immoral and illegal acts, his supporters were more likely to shift focus to the wrongdoings of Trump's opponents. Whether they believed Trump had actually done anything wrong made no difference.

It gets worse.

A 2010 study by Yale law professor Dan Kahan found that our cultural values and worldviews influence who we regard as experts and authorities in the first place. According to Kahan, you can classify most people along two axes: individualist or communitarian, and hierarchical or egalitarian. Most people tend to reject the authority of experts who don't align with their cultural orientation. The topic doesn't matter.

In Kahan's study, only 23 percent of hierarchical individualists would recommend a book by an expert who expressed belief in the urgency of climate change, even if that expert was a member of the National Academy of Sciences and a professor at an elite university. You can extrapolate from there. If someone's a hierarchical individualist, they're probably going to reject evidence that vaccines, masks, or air purifiers work. They're more likely to dismiss warnings about climate change or collapse. They aren't going to listen to any experts who come off as too liberal or communitarian.

That bias extends to the platforms and sources they trust.

Scientists have also figured out that we apply fight-or-flight responses not just to physical threats, but to facts themselves. If we encounter a piece of information that threatens us, we're likely to fight the information itself, not whatever the information warns us about. A related phenomenon, spontaneous trait transference, compounds the problem: we unconsciously pin the negative content of a message on the messenger, so we end up attacking the person warning us about a threat rather than responding to the threat itself.

A 2012 study even found that some people derive a sense of power and control from refusing to apologize, even when they know they did something wrong. As we've all observed, some people would rather suffer, or even die, than admit they made a mistake. That kind of stubbornness stems from a fragile ego.

It explains a lot.

People often don't think so much as rationalize, cherry-picking information that validates their prior beliefs and decisions. They treat new, contradictory information as a threat. Often, that threatening information only reinforces their original beliefs and pushes them to even greater lengths to confirm what they already think they know. They reject information, experts, and sources that challenge their assumptions.

Western culture rewards politicians and public figures for never giving in, never compromising, and never admitting mistakes.

It punishes the opposite.

We're conditioned to believe that admitting you were wrong is a form of weakness, even in the face of overwhelming evidence. It doesn't seem to matter how many articles and books come out. These problems persist.

Worse, we accuse each other of demonstrating these biases instead of examining our own thought processes and actions. If you show information on cognitive dissonance and confirmation bias to an anti-masker, they're just going to accuse you of failing to admit you're wrong.

The behavior of our governments and health institutions has only made things worse over the last few years.

Instead of admitting they were wrong about vaccines stopping the spread of certain diseases or reducing the risk of chronic illness, they've doubled down on their lies and misinformation. On everything from masks to clean air, they've responded with silence and tacit denial of their own misjudgments and mistakes, declining to acknowledge they were wrong even when they finally start telling the truth. These refusals have fed conspiracy theorists and further undermined trust in our institutions, making it that much harder to respond to the threats in front of us.

I've often wondered if there's any point to debunking anti-vaxxers and anti-maskers, or to continuing to share accurate information.

There is.

Lies work because they get repeated thousands of times a day. The only hope of combating them lies in repeating the truth more often and more accurately. If we give up, nothing will change. The research on belief simply explains why the work is so hard, and why it takes so much time and energy.

The truth eventually wins out. In the meantime, we need to keep repeating it to ourselves and each other so we can keep going. Imagine how much worse things would get if we conceded every space, digital and otherwise, to all the lies out there. I don't want to live in a world where truth hides under the bed.

Do you?

There's one ray of hope in these studies. When people believe they can change their behavior, they're more willing to admit they're wrong. In the case of public health or climate collapse, that's always possible.

It doesn't matter how doomed we are. We can always make the doom worse through denial and wishful thinking.

Most of us aren't looking to punish someone for being wrong. We just want them to change their behavior.

We would celebrate it.


If you appreciate my work, you can sign up below. You can also get my book, Doomer, as an ebook, paperback, or hardback.
