How Normalcy Bias Led Boeing to Crash Into Disaster

Normalcy Bias

Due to the grounding of its 737 Max airplane following two deadly crashes that killed 346 people, Boeing lost $5 billion in direct revenue by the summer of 2019. The overall losses – ranging from damage to the brand to lost customers – were valued by investors at over $25 billion by March 2019. In late 2019, new revelations about problems with the 737 Max further increased Boeing’s losses. In late December, Boeing fired its CEO, Dennis Muilenburg, over the 737 Max fiasco.

 

What led to this disaster for Boeing? On the surface, it came from Boeing’s efforts to compete effectively with Airbus’s newer and more fuel-efficient airplane, the A320neo. To do so, Boeing rushed the 737 Max into production and misled the Federal Aviation Administration (FAA) to get rapid approval for the 737 Max. In the process, Boeing failed to install safety systems that its engineers pushed for and did not address known software bugs in the 737 Max, glitches that resulted in the eventual crashes.

 

The New Normal

 

However, these surface-level issues had a deeper cause. Ironically, the key to understanding Boeing’s disaster lies in the transformation of the airline industry in recent decades, which made airplanes much safer and accidents incredibly rare.

 

Boeing’s leadership suffered from what cognitive neuroscientists and behavioral economists know as the normalcy bias. This dangerous judgment error causes our brains to assume things will keep going as they have been – normally. As a result, we drastically underestimate both the likelihood of a disaster occurring and the impact if it does.

 

Boeing’s 737 Max disaster is a classic case of the normalcy bias. Boeing’s leadership felt utter confidence in the safety record of the airplanes the company had produced over the last couple of decades, and deservedly so, according to crash statistics. From their perspective, it was impossible to imagine that the 737 Max would be less safe than those other recent-model airplanes. They saw the typical FAA certification process as simply another bureaucratic hassle that got in the way of doing business and competing with Airbus, rather than as a means of ensuring safety.

 

Think it’s only big companies? Think again.

 

The normalcy bias is a big reason for bubbles: in stocks, housing prices, loans, and other areas. It’s as though we’re incapable of remembering the previous bubble, even if it occurred only a few years ago.

 

Normalcy Bias in a Tech Start-Up

 

Of course, the normalcy bias hits mid-size and small companies hard as well.

 

At one of my frequent trainings for small and mid-size company executives, Brodie, a tech entrepreneur, shared the story of a startup he founded with a good friend. They complemented each other well: Brodie had strong technical skills, and his friend brought strong marketing and sales capabilities.

 

Things went great for the first two and a half years, with a growing client list – until his friend got into a bad motorcycle accident that left him unable to talk. Brodie had to deal not only with the emotional trauma, but also with covering his co-founder’s work roles. 

 

Unfortunately, his co-founder had failed to keep good notes. He also had not introduced Brodie to his contacts at the client companies. To make matters worse, Brodie – a strong introvert – struggled with selling. Eventually, the startup burned through its cash and had to close its doors.

 

The normalcy bias is one of many dangerous judgment errors, mental blind spots resulting from how our brains are wired. Researchers in cognitive neuroscience and behavioral economics call them cognitive biases.

 

Fortunately, recent research in these fields shows how you can use pragmatic strategies to address these dangerous judgment errors in your professional life, your relationships, and other life areas.

 

You need to evaluate where cognitive biases are hurting you and others in your team and organization. Then, you can use structured decision-making methods to make “good enough” daily decisions quickly; more thorough ones for moderately important choices; and in-depth ones for truly major decisions.

 

Such techniques will also help you implement your decisions well, and formulate truly effective long-term strategic plans. In addition, you can develop mental habits and skills to notice cognitive biases and prevent yourself from slipping into them.

 

Preventing Normalcy Bias Disasters

 

With the normalcy bias in particular, it really helps to consider and address potential alternative futures that are much more negative than what you intuitively feel is likely. That’s the strategy Brodie and I explored in my coaching with him after the training session, as he felt ready to get back to the startup world.

 

While Brodie knew he wouldn’t be up to starting a new business by himself, he also wanted to avoid the previous problems. So we discussed how, from the start, he would push for creating systems and processes that would enable each co-founder to back up the other in an emergency. Moreover, the co-founders would commit to sharing important contacts from their side of the business with each other, so that relationships could be maintained if one of them was out of commission for a while.

 

So what are the broader principles here? 

 

1) Be much more pessimistic about the likelihood and impact of disasters than you intuitively feel or can easily imagine, to overcome the challenges caused by the normalcy bias.

 

2) Use effective strategic planning techniques to scan for potential disasters and try to address them in advance, as Brodie did with his plans for the new business. 

 

3) Of course, you can’t predict everything, so retain some extra capacity in your system – of time, money, and other resources – that you can use to deal with unknown unknowns, also called black swans.

 

4) Finally, if you see even a hint of a disaster, react much more quickly than you intuitively feel you should, to overcome your gut reaction’s dismissal of the likelihood and impact of disasters.

 

Key Takeaway

 

Our brains cause us to drastically underestimate both the likelihood of a disaster occurring and the impact if it does. To address this dangerous judgment error known as the normalcy bias, we need to go far beyond our intuitions in planning for catastrophes.

Questions to Consider (please share your thoughts in the comments section)

 

  • Were you ever caught out by an unexpected disaster? What happened?
  • What kind of disasters do people most tend to underestimate, in your experience? 
  • How might you help your team and organization address the normalcy bias? What are some next steps you can take to do so?

Image credit: Pixabay/PIRO4D

 


Bio: Dr. Gleb Tsipursky is on a mission to protect leaders from dangerous judgment errors known as cognitive biases. His expertise and passion lie in using pragmatic business experience and cutting-edge behavioral economics and cognitive neuroscience to develop the most effective and profitable decision-making strategies. A best-selling author, he wrote Never Go With Your Gut: How Pioneering Leaders Make the Best Decisions and Avoid Business Disasters (2019), The Truth Seeker’s Handbook: A Science-Based Guide (2017), and The Blindspots Between Us: How to Overcome Unconscious Cognitive Bias and Build Better Relationships (2020). Dr. Tsipursky’s cutting-edge thought leadership has been featured in over 400 articles and 350 interviews in Fast Company, CBS News, Time, Business Insider, Government Executive, The Chronicle of Philanthropy, Inc. Magazine, and elsewhere.

 

His expertise comes from over 20 years of consulting, coaching, and speaking and training experience as the CEO of Disaster Avoidance Experts. Its hundreds of clients, mid-size and large companies and nonprofits, span North America, Europe, and Australia, and include Aflac, IBM, Honda, Wells Fargo, and the World Wildlife Fund. His expertise also stems from his research background as a behavioral economist and cognitive neuroscientist with over 15 years in academia, including 7 years as a professor at the Ohio State University. He published dozens of peer-reviewed articles in academic journals such as Behavior and Social Issues and Journal of Social and Political Psychology.

 

He lives in Columbus, OH, and to avoid disaster in his personal life makes sure to spend ample time with his wife. Contact him at Gleb[at]DisasterAvoidanceExperts[dot]com, follow him on Twitter @gleb_tsipursky, Instagram @dr_gleb_tsipursky, Facebook, YouTube, RSS, and LinkedIn. Most importantly, help yourself avoid disasters and maximize success, and get a free copy of the Assessment on Dangerous Judgment Errors in the Workplace, by signing up for his free Wise Decision Maker Course.