Leadership, Alligator Heads, and the Errors in Thinking that Sabotage our Work

By Brad Davis | Leadership, Values-Driven Leaders

Look out!

[Photo: an alligator]

Don’t worry, it’s not a real alligator, just a kind-of-scary picture of one. But…did you jump (it’s okay to admit it if you did)? If so, then you committed a Type 1 cognitive error, also known as a false positive. This is what happens when our brains think they detect something that in reality is not there, and it happens to us quite often. Now, for those of you who didn’t jump, what would have happened if there really had been an alligator crawling out of your computer screen to attack you? Well, by not jumping, you would have committed a Type 2 cognitive error, also known as a false negative, and (although it is a small gator) it might have cost you your life! Type 2 errors happen when our brains fail to detect something that in reality is there. Michael Shermer writes about this fascinating topic, which he calls “patternicity,” and observes that “our brains are belief engines: evolved pattern-recognition machines that connect the dots and create meaning out of the patterns that we think we see in nature.”
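
For readers who like a concrete framing, the same 2x2 logic applies to any detector, whether a brain, a smoke alarm, or a classifier. Here is a minimal sketch in Python; the threshold and “evidence” readings are invented purely for illustration.

# A minimal sketch of Type 1 and Type 2 errors in any detector.
# The threshold and readings below are invented for illustration.
THRESHOLD = 0.5  # how much evidence it takes before we "jump"

def classify(reading: float, alligator_present: bool) -> str:
    """Compare what the detector says to what is actually there."""
    detected = reading >= THRESHOLD
    if detected and not alligator_present:
        return "Type 1 error (false positive): jumped at nothing"
    if not detected and alligator_present:
        return "Type 2 error (false negative): missed a real gator"
    return "correct call"

# Scary picture, no real gator, but we jump anyway: a Type 1 error.
print(classify(reading=0.8, alligator_present=False))
# Real gator, weak signal, and we fail to jump: a Type 2 error.
print(classify(reading=0.3, alligator_present=True))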

When we think about the challenges that confront our organizations, we try to avoid the Type 1 error of leaping to unfounded conclusions based on scant information: jumping when it isn’t warranted. However, we also want to avoid the opposite situation of not jumping when we should have: the infamous (and scientifically dubious) boiling-frog scenario. Either way, the challenge is to make the right decision, always with incomplete information and often with limited time. Given those limits, how do we increase our odds of making the right call? Fortunately, most of the decisions we make in any given day are not life-and-death and do not require a split-second response. This means we usually have at least a little time to think before acting.

One famous approach to this problem is the Observe, Orient, Decide, Act Loop (a.k.a. the OODA Loop), developed by Air Force Colonel John Boyd. Without getting into the finer details of this well-known model, I’ll simply observe that Type 1 and Type 2 errors occur in the “D” phase of the loop, when we actually decide on a course of action. Avoiding these errors is the job of the aptly named “OO” phases, when we are formulating our conception of what’s happening and figuring out what options are open to us. Errors that occur in the “A” (Act) phase are errors of execution, not decision, so if you make the right call to leap away from an alligator but fall down instead of jumping, there’s not much this blog can do to help you.

Avoid Common Errors in Thinking

So, the key to avoiding Type 1 and Type 2 errors is to accurately observe and orient to your environment and your situation. The aviation community refers to this as situation awareness, or SA, and it is critically important to flying an aircraft safely and effectively. SA is a fully developed field of theory and research (see Mica Endsley’s article) and has been incorporated into other practical applications such as operating rooms and rescue operations. As important as SA is to pilots, surgeons, and first responders, it’s no less important to you; and there’s no better way to increase your SA than through effective teamwork. For a historical example of this truism, consider Diomedes, one of the heroes of Greek legend from The Iliad, who famously observed, “When two men are together, one of them may see some opportunity which the other has not caught sight of; if a man is alone he is less full of resource, and his wit is weaker.” Maverick, the more recent fictional hero of the movie Top Gun, learned the same lesson: “Never, never leave your wingman!”

It’s no mystery that teamwork is essential to operating effectively; however, there is a bias in our organizations that only certain people with specific skills, credentials, and positions are empowered to provide the observations that inform decisions. These might be forecasters, marketers, economists, strategists, and, of course, members of the C-suite. This bias is implicit in the fact that we call it “teamwork” rather than “teamlook,” as if the team exists only to do the “Act” part of the OODA Loop. In some cases this is literally true, as some leaders reserve all of the observing, orienting, and deciding for themselves and a small core of trusted advisors: their in-group. By doing this, they tell a whole army of potential wingmen (a.k.a. employees) that what they see isn’t important and that what they think about the future doesn’t matter. It takes little more than intuition to connect this attitude to the worker disengagement that surveys report year after year.

Excluding potentially valuable sources of information from our deliberations is no way to fly an aircraft, and it’s a terrible way to run an organization. When we exclude people, we not only lose potentially critical information, but we also lose protection from the other cognitive biases that lead to Type 1 and Type 2 errors. Daniel Kahneman won a Nobel Prize in economics for his work on cognitive biases and the apparently irrational behavior they can cause. He popularized this work in his bestselling book, Thinking, Fast and Slow, which describes the now-famous distinction between System 1 thinking (immediate, quick-reaction, dependent on rules of thumb, and prone to bias) and System 2 thinking (reflective, critical, deep, and typically slower). The phenomenon has been likened to us humans having two brains rather than just one.

Cognitive biases tend to be more prevalent in System 1 thinking, our fast-twitch brain, and they can be crippling limitations in our struggle to build and maintain the SA upon which our decisions are based. There is hope, however: other people can help us overcome our biases by prompting us to question our assumptions and our habits of thought. Not only can our teammates provide us with another pair of hands and eyes; it turns out they can provide us with another brain! Diversity matters here, since people with similar experiences and backgrounds are more likely to share the same biases and fall into the same cognitive traps. The axiom that “perspective is everything” is never truer than in our attempts to build and maintain situation awareness and avoid critical errors. As leaders, we increase our situation awareness, and our likelihood of making good choices, by facing our alligators as a team.

__

Brad Davis is a senior analyst with Engility Corporation who served 20 years in the U.S. Air Force. He is also a student in the Ph.D./D.B.A. Program in Values-Driven Leadership.

Alligator Photo Credit: 8bitmatt via Compfight cc
