“The facts, while interesting, are irrelevant.” – Unknown
It should have been an easy argument. Our case was supported by indisputable facts, was logically sound, and seemed unassailable. Yet it was met with moderate hostility and was dead on arrival when presented. But it also provided lessons that still resonate with me today.
The argument was prompted by my division commander, who was pushing to accelerate the deployment timeline for Army conventional forces arriving in Afghanistan in the months following the 9/11 attacks. Frustrated with what he saw as a ploddingly slow pace, he tasked us with finding a faster way to achieve a fully operational capability on the ground sooner. After a weekend of modeling options with an intrepid transportation officer, we identified a course of action that would leverage seaborne transport, port facilities at a Gulf partner nation, and a shorter air bridge that would ultimately cut the timeline in half. In half. The facts proved it.
We put together a classified brief and presented it to the logistics team responsible for planning the entire operation. The colonel who led the team took one look at the alternative we presented, muttered something about the unavailability of ports in Pakistan (which weren’t part of our plan), dug his heels in, and shot a figurative hole in our slide deck. Our plan was wrong. End of discussion. The facts didn’t matter.
WHEN THE FACTS DON’T MATTER
When it comes to arguing with facts and reason, most of us subscribe to rational choice theory, even if we don’t actually know what it is or where it originated. Scottish economist Adam Smith first posited the theory in his timeless 1776 classic, The Wealth of Nations. Smith believed that individuals will use rational calculations to make rational choices that align with their own rational beliefs and objectives. When presented with reason, the rational individual will make rational choices.
It all makes perfect sense. Until it doesn’t.
Jen Dalton, the author of Listen: How to Embrace the Difficult Conversations Life Throws at You, explains why: “Humans are not rational so much as emotional. We tend to rationally justify our emotions and responses.” If someone’s ego is too closely aligned with their position, facts and reason mean little. Challenging that position only elicits an emotional response, and rarely a good one: “You have no idea what you’re talking about.” Assuming the response is even that polite or free from obscenity.
This phenomenon is called the backfire effect, and it plays a significant role in how we form and reinforce our beliefs on any number of issues. The backfire effect is a subtype of confirmation bias; it’s “a cognitive bias that causes people who encounter evidence that challenges their beliefs to reject that evidence” and often drives them to further reinforce their original stance. The more you argue the facts, the more ineffective your argument becomes. Eventually, it backfires altogether, and you’ve only succeeded in further strengthening someone’s emotional bond to a faulty belief.
Ever argue religion with someone? Bring up politics with co-workers? Have you dared to try and argue the efficacy of vaccinations? The need for expanding immigration? Welfare? Universal access to healthcare? Those are all areas where the emotional bonds to beliefs are so strong that any attempt to challenge them elicits the backfire effect.
Kind of like someone’s not-so-brilliant war plan.
WHAT TO DO ABOUT IT
As with many forms of cognitive bias, the backfire effect is rooted in misinformation. Flawed beliefs and perceptions typically emerge as a result of people being deeply misinformed on specific topics. They’re emotionally tied to their “facts” and won’t tolerate them being challenged. Challenging those facts stimulates the amygdala, the brain’s emotional processor; a perceived attack pushes the amygdala into overdrive, essentially emotionally hijacking the debate.
Dalton recommends engaging with other parts of the brain, the areas that “embrace collaboration and creativity.” In their book, The Enigma of Reason, cognitive scientists Hugo Mercier and Dan Sperber argue that reason didn’t develop to allow us to solve problems or draw conclusions from data. Instead, we developed reason to help us “resolve the problems posed by living in collaborative groups.” In other words, we like to be part of the solution, not part of the problem. If an argument makes us part of the problem, we’ll withdraw and close off.
Dalton continues: “Forget about trying to win an argument; if that’s your goal, you’ve already lost.” Instead, use influence and collaboration as your principal tools. Present the issue as something to work on together, an opportunity to collaborate, move forward, and learn. Ask questions. Guide the process with a little emotional intelligence. Cede some control to the other party, enabling them to feel like they are approaching the issue on an equal footing. Most importantly, do it together.
I’ve used this process with a fair degree of success to present arguments on—wait for it—social media, driving positive discussions where people could learn and grow together. It doesn’t always work, so don’t be disappointed when you’re not successful at changing someone’s political or religious beliefs. Some issues are just a bridge too far. Like war plans. When someone thinks they’re the reincarnation of Clausewitz, you’ll never change their mind.