Nobody wants to fail. But in highly complex organizations, success can happen only when we confront our mistakes, learn from our own version of a black box, and create a climate where it's safe to fail.

We all have to endure failure from time to time, whether it's underperforming at a job interview, flunking an exam, or losing a pickup basketball game. But for people working in safety-critical industries, getting it wrong can have deadly consequences. Consider the shocking fact that preventable medical error is the third-biggest killer in the United States, causing more than 400,000 deaths every year. More people die from mistakes made by doctors and hospitals than from traffic accidents. And most of those mistakes are never made public, because of malpractice settlements with nondisclosure clauses.

For a dramatically different approach to failure, look at aviation. Every passenger aircraft in the world is equipped with an almost indestructible black box. Whenever there's any sort of mishap, major or minor, the box is opened, the data is analyzed, and experts figure out exactly what went wrong. Then the facts are published and procedures are changed, so that the same mistakes won't happen again. By applying this method in recent decades, the industry has created an astonishingly good safety record.

Few of us put lives at risk in our daily work as surgeons and pilots do, but we all have a strong interest in avoiding predictable and preventable errors. So why don't we all embrace the aviation approach to failure rather than the health-care approach? As Matthew Syed shows in this eye-opening book, the answer is rooted in human psychology and organizational culture. Syed argues that the most important determinant of success in any field is an acknowledgment of failure and a willingness to engage with it. Yet most of us are stuck in a relationship with failure that impedes progress, halts innovation, and damages our careers and personal lives.
We rarely acknowledge or learn from failure—even though we often claim the opposite. We think we have 20/20 hindsight, but our vision is usually fuzzy. Syed draws on a wide range of sources—from anthropology and psychology to history and complexity theory—to explore the subtle but predictable patterns of human error and our defensive responses to error. He also shares fascinating stories of individuals and organizations that have successfully embraced a black box approach to improvement, such as David Beckham, the Mercedes F1 team, and Dropbox.
Publication Year: 2015
In "Black Box Thinking," author Matthew Syed advocates for broad adoption of professional aviation's approach to error. As a professional aviator, I found his thesis so obvious that I felt like I was stuck in a conversation with someone trying to convert me to a religion I already practice.
Here's the short version of his thesis: in many walks of life, it's normal to wave off mistakes as anomalies, "just one of those things," or the embarrassing errors of incompetents. In aviation, we cultivate a culture of recognizing and owning errors, then sharing lessons learned. The idea is that, assuming a baseline level of competence, any pilot is capable of making the same mistakes as any other pilot. By embracing knowledge sharing and professional honesty, we make aviation safer for everyone. Syed thinks this is great, and he believes other professions and professionals have much to learn from the practice.
As I wrote, I agree with him. I also think I need to look around at my own life and ask, "What other ideas do I take for granted that I could turn into a book?" Maybe there's a goldmine in something entitled "Try Not To Be a Dick, and Other Life Lessons," or maybe "Eat Your Vegetables and Go Easy on the Alcohol." How hard could this be?