“We don’t see things as they are. We see things as we are.” – Anaïs Nin

We’ve all been there. Sometimes you recognize it in conversation, sometimes during a meeting or briefing, or still other times as a (not so) casual observer. Cognitive bias is everywhere around us, and it impacts us on a daily basis. Even when we can’t pinpoint the type of bias we see, we recognize it for what it is—a systematic obstacle to clear thinking and decision-making. Custer’s cognitive biases drove him to lead the 7th Cavalry into the Little Bighorn. Montgomery’s biases set the foundation for Operation Market Garden. Johnson’s biases propelled us deeper and deeper into Vietnam in a vain attempt to halt the spread of communism.

Preconceived Notion or Bias?

We all suffer from biases. Sometimes bias is conscious, and we’re well aware that it influences our thinking; other times it’s unconscious, and we’re not at all cognizant of its impact. These biases shape our opinions, our beliefs, our prejudices. As often as not, our biases are not based on reason or experience, but the opposite. You hear something that triggers a bias, and it angers you. You see someone who violates a preconceived notion, and it annoys you. Someone or something infringes upon your own subjective reality and your natural inclination is to recoil.

Cognitive Bias: the List is Long

There is a laundry list of cognitive biases. Far more than I want to squeeze into a short discussion. I have my share, and I’m fairly well aware of them. I’m often guilty of stereotyping, especially when I see a pilot (“That dude is just waiting to tell me that he’s a pilot.”). I avoid casinos like the plague because I suffer from gambler’s fallacy (the more I lose, the surer I am I’ll win the next time). Fortunately, no one has ever accused me of blind spot bias. Knowing is half the battle, I guess.

Of all the biases we encounter, some are more prevalent—and dangerous—than others. Some are more recognizable, and some seem to plague every organization. And some tend to exist around every conference room table you’ve ever seen. A few weeks ago, I wrote at length about the Dunning-Kruger effect – the supremely confident idiot who’s entirely unaware of his own cognitive challenges. This is a classic example of “the less you know, the more confident you are.” In the national security community, authority bias is disturbingly common. Congress has succumbed to this every time someone has briefed them on a new strategy for Afghanistan. Part of this is probably due to the halo effect. Congress has a clear bias for some senior leaders, so much so that they often lack the capacity for critical thought when they’re basking in the glow of someone’s halo.

The Damage of Bias on Senior Leadership Decisions

When it comes to senior leaders, groupthink is another bias that’s routinely noted. An effective leader knows to surround herself with alternative perspectives so as not to fall prey to groupthink. In a world where groupthink often prevails, one of the most reassuring statements I ever heard was, “Don’t agree with me because I’m a four-star. Disagree with me because you think I’m wrong.” It’s also not uncommon to see leaders fall prey to confirmation bias—favoring information that affirms our pre-existing perceptions. The more you hear that Multi-Domain Operations is a novel concept, the less you want to hear some retiree talk about AirLand Battle. Similarly, risk-averse leaders uncomfortable with uncertainty will lean heavily on the ambiguity effect, exhibiting a tendency to avoid options where the probability of a favorable outcome is unknown. Such leaders will typically devolve into “paralysis by analysis” and delay making decisions until opportunity, like Elvis, has left the building.

Staffs Struggle with Bias, Too

Cognitive biases aren’t reserved only for leaders, however. It’s common to encounter support staff who fall prey to anchoring—the tendency to rely too heavily on the first piece of information received. “Never believe the first thing you hear” was a principle of Colin Powell’s, but it doesn’t seem to stop people from seizing on the first report and running with it. Concision bias is also relatively common among staffs, where nuance is sacrificed in favor of what can be more easily—and succinctly—explained. That nuance is often critical to effective decision making and not something that should be ignored in the name of expedience. Another favorite bias typical for staffs is automation bias. The output provided by an information system is only as reliable as the person(s) performing the input, yet many will blindly assume the infallibility of that output. If autocorrect was always right, we wouldn’t have an entire genre of memes dedicated to bad text messages. Staffs also have a habit of descending into in-group favoritism, a form of cognitive bias captured in the “us versus them” victimology that plagues coordination and relationship building efforts.

The Social Media Minefield

But nowhere is cognitive bias more visible than on social media. The laundry list of biases exhibited on any given day could consume a behavioral science researcher. The herd behavior common to those platforms is typically manifested in the bandwagon effect—the tendency to do (or believe) something because others do the same—and the availability cascade, where arguments become more plausible the more often they’re repeated. Even if they don’t make any sense. The older crowd generally lives in a world where the past is romanticized through declinism and change is resisted via status quo bias. Okay, Boomer, I’m sure Ranger School was a lot tougher before they started letting girls in. Hostile attribution bias—where someone wrongly assumes hostile intent in behaviors—is the source of much of our outrage culture and a hallmark of the social media age. In the same vein, reactive devaluation—rejecting an idea simply because it originates with a supposed rival—fuels dissonance on some platforms.

Avoid Conflict and Discontent

Ultimately, we have an annoying tendency to see what we choose to see rather than what’s actually there. Our biases—conscious or otherwise—skew our perspective and spur much of the conflict and discontent in our lives. The ability to control our biases rests solely with ourselves, and our own self-awareness—or lack of it—is our first line of defense.

Steve Leonard is a former senior military strategist and the creative force behind the defense microblog, Doctrine Man!!. A career writer and speaker with a passion for developing and mentoring the next generation of thought leaders, he is a co-founder and emeritus board member of the Military Writers Guild; the co-founder of the national security blog, Divergent Options; a member of the editorial review board of the Arthur D. Simons Center’s Interagency Journal; a member of the editorial advisory panel of Military Strategy Magazine; and an emeritus senior fellow at the Modern War Institute at West Point. He is the author, co-author, or editor of several books and is a prolific military cartoonist.