
Failure is Data, Not Doom

15 min

The Surprising Truth About Success

Golden Hook & Introduction


Michelle: Okay, Mark. "Black Box Thinking." Five words. Go.

Mark: Hmm. Failure is data, not doom.

Michelle: Ooh, that's good. Mine is: Your ego is the enemy.

Mark: I feel personally attacked. And also intrigued. Let's get into it.

Michelle: We are diving into Black Box Thinking: Why Most People Never Learn from Their Mistakes But Some Do by Matthew Syed. And what's fascinating about Syed is that he's not your typical business guru or psychologist. He was an Olympic table tennis player, a three-time Commonwealth champion who lived and breathed high performance.

Mark: Right, so he’s seen success and failure at the absolute highest level, up close and personal.

Michelle: Exactly. He brings that athlete's mindset—the one that's obsessed with analyzing every tiny mistake to get an edge—and applies it to the worlds of business, healthcare, and our everyday lives. The book was widely acclaimed for this fresh perspective when it came out. It argues that our entire approach to failure is fundamentally broken.

Mark: Which brings us back to "your ego is the enemy." That sounds like a pretty direct accusation. Where does he start building that case?

Michelle: He starts with a story that is absolutely gut-wrenching. It’s about a woman named Elaine Bromiley.

The Tale of Two Systems: Open vs. Closed Loops


Mark: Okay, I'm bracing myself.

Michelle: Elaine was a healthy 37-year-old mother of two. She went into the hospital for a routine sinus operation. A very common, low-risk procedure. She kissed her husband and kids goodbye, expecting to be home that evening. She never came home.

Mark: Oh no. From a routine operation? What happened?

Michelle: On the operating table, the anesthesiologists ran into a problem. They couldn't get the breathing tube into her airway. This is a known, though rare, complication. It's called a "can't intubate, can't ventilate" scenario. Every anesthesiologist is trained for it. There's a clear protocol.

Mark: So there was a plan for this exact situation?

Michelle: A crystal-clear plan. When you can't get air in through the mouth, you have to perform a simple surgical incision in the neck to create a new airway. A tracheotomy. It takes minutes. But in that operating room, the two experienced doctors became fixated. They kept trying to force the tube down her throat, again and again, convinced they could solve it the way they always had.

Mark: Even though the protocol said to do something else?

Michelle: Precisely. The nurses in the room could see what was happening. They knew the protocol. One of them even fetched the surgical kit for the tracheotomy and held it out, trying to prompt the doctors. But the hierarchy in that room was so rigid. The doctors were the experts, the authority figures. The nurses felt powerless to challenge them directly. They just hinted.

Mark: And the doctors, with their egos wrapped up in being the ones who solve the problem, couldn't see the answer right in front of them.

Michelle: They were trapped. Their minds had narrowed. For ten agonizing minutes, they kept trying the same failed method while Elaine's oxygen levels plummeted. By the time they finally abandoned their approach, it was too late. She had suffered catastrophic brain damage from the lack of oxygen and died thirteen days later.
Mark: That is just devastating. And it's not really a failure of skill; it sounds more like a failure of psychology. And a failure of the system that didn't allow a nurse to just scream, "Stop! Do the surgery!"

Michelle: That's the heart of it. Syed calls this a "closed loop." It's a system that, when faced with failure, cannot or will not learn. The information was there—the dropping oxygen levels, the nurse with the kit—but the system was closed to it. The culture of blame and hierarchy meant the error was defended until it was fatal. And the most tragic part? Similar cases had happened before, and they would happen again, because the system wasn't designed to learn.

Mark: Okay, that's the "closed loop." It’s a terrifying concept. What does the opposite look like? What’s an "open loop"?

Michelle: For that, we have to look to a completely different industry: aviation. In 1978, United Airlines Flight 173 was approaching Portland, Oregon. As they prepared to land, a light indicating a problem with the landing gear came on. The gear seemed to be down, but the light suggested it wasn't locked in place.

Mark: So, a similar situation in a way. An unexpected problem in a high-stakes environment.

Michelle: Exactly. The captain, a very experienced pilot, decided to circle the airport to troubleshoot. He became completely fixated on this landing gear light. He and the crew were so absorbed in diagnosing this one problem that they failed to notice a much bigger one.

Mark: Let me guess. Fuel?

Michelle: Fuel. The flight engineer tried to warn the captain multiple times, but his warnings were too subtle, too deferential. He’d say things like, "We're getting low on fuel," but he never directly challenged the captain's authority by saying, "We are going to crash if we do not land now." The plane ran out of fuel and crashed into a suburb, killing ten people.

Mark: Wow. It's the exact same dynamic as the hospital. A fixation on one problem, a failure of communication because of hierarchy, and a tragic, preventable outcome.

Michelle: On the surface, yes. But here's where the stories diverge completely. The airline industry's reaction was revolutionary. They recovered the plane's "black box"—the flight data and cockpit voice recorders. They didn't hide the failure; they dissected it. They listened to the crew's final moments not to find someone to blame, but to understand the systemic reasons for the crash.

Mark: They treated the failure like a treasure trove of data.

Michelle: That's the perfect way to put it. And what they learned changed aviation forever. They realized the problem wasn't a bad pilot; it was a bad system of communication. In response, they invented something called Crew Resource Management, or CRM. It was a new training protocol designed to flatten the cockpit hierarchy. It empowered junior crew members to challenge the captain directly and taught captains to actively seek out input.

Mark: So they rewired the entire culture of the cockpit to be an open loop. To actively hunt for disconfirming information.

Michelle: Yes! They created a system that assumes failure is possible and builds in mechanisms to catch it. Every time there's an incident, the black box is analyzed, and the lessons are shared across the entire industry. That is why flying has become exponentially safer over the last few decades. Healthcare, by contrast, has often remained a closed loop, where errors are hidden due to fear of lawsuits and shame, and the same mistakes are repeated in different hospitals around the world.

Mark: The contrast is staggering. One industry uses failure to evolve, the other buries it. And it all comes down to whether you're willing to open the box and look at the ugly truth inside.

Michelle: And that willingness, or unwillingness, isn't just about institutions. It's deeply wired into our own psychology.

Hacking Our Brains: From Blame to Growth


Mark: Right, because it’s easy to point fingers at doctors or pilots, but I have a feeling this is something we all do. I know I've defended a bad decision long past the point of reason.

Michelle: We all have. Syed dives deep into the psychological engine that drives this behavior: cognitive dissonance. It's a term a lot of us have heard, but the book explains its power so clearly. It’s the profound mental discomfort we feel when we hold two contradictory beliefs, or when our actions conflict with our self-image.

Mark: Like, "I am a smart person," but also, "I just did a really dumb thing."

Michelle: Exactly. And that discomfort is so intense that our brain will do incredible gymnastics to resolve it. But it rarely resolves it by admitting the mistake. Instead, it re-frames the evidence. It tells a new story.

Mark: It protects the ego at all costs.

Michelle: At all costs. The book gives a chilling example from the justice system: the case of Juan Rivera, a man accused of a horrific crime in 1992. The police interrogated him for days, and eventually, he confessed. The prosecutors built their entire case around this confession. They truly believed, "Juan Rivera is guilty, and we have his confession."

Mark: But I'm sensing a 'but' here.

Michelle: A huge one. Years later, DNA evidence emerged that completely exonerated him. The DNA found at the scene belonged to someone else entirely. For any rational system, this is case closed. He's innocent.

Mark: That makes sense. The evidence is definitive.

Michelle: But for the prosecutors, this created an unbearable cognitive dissonance. Their core belief was "Rivera is guilty." The new evidence said, "Rivera is innocent." To accept the DNA would mean accepting that they, the guardians of justice, had sent an innocent man to prison. That they had made a catastrophic, career-defining error.

Mark: So what did they do? How does a brain even begin to rationalize that?

Michelle: They performed intellectual contortions. They invented new theories on the spot. "Maybe he had an accomplice whose DNA we found!" "Maybe the DNA evidence was contaminated!" They clung to the original confession, re-framing it as the ultimate truth and dismissing the physical evidence as flawed. They fought for years to keep him in prison, even as the evidence of his innocence piled up.

Mark: That's terrifying. Their brains were literally rewriting reality to avoid admitting they were wrong. It's the closed loop in a single human mind.

Michelle: It is. And while that's an extreme case, we do it all the time. When a project we championed fails, we don't say, "I had a bad idea." We say, "The market wasn't ready," or "My team didn't execute properly." We find an external cause to protect our internal self-image as a competent person.

Mark: Okay, so we're all wired for self-deception. That's... a bit depressing. How do we fight back? If aviation can build a system to overcome this, can we?

Michelle: Absolutely. And this is where the book becomes incredibly optimistic and practical. The solution is to build systems and habits that redefine failure. One of the most powerful examples is the story of Team Sky, the British professional cycling team.

Mark: Oh, I've heard about them. They went from mediocrity to dominating the Tour de France.

Michelle: They did. And their secret weapon was a philosophy called "the aggregation of marginal gains." The team's manager, Sir Dave Brailsford, believed that if you could improve every single tiny thing that goes into riding a bike by just 1%, the cumulative effect would be enormous.

Mark: So not looking for one giant breakthrough, but a hundred small ones.

Michelle: A thousand small ones! They didn't just optimize the obvious things like nutrition and tire weight. They tested which massage gel led to faster muscle recovery. They hired a surgeon to teach the riders how to wash their hands properly to avoid getting sick. They even figured out which pillow and mattress led to the best night's sleep for each rider and brought them to every hotel.

Mark: That is an obsessive level of detail. But I see the connection. Each of those is a tiny experiment. If a new pillow doesn't work, it's not a catastrophic failure. It's just a data point. You discard it and try another one.

Michelle: You've got it. They created a culture where everything was a test. There was no ego attached to the "old way" of doing things. There was only data. Does this make us faster? Yes? Keep it. No? Discard it. It’s Black Box Thinking on a bicycle. They broke down this massive, daunting goal—"Win the Tour de France"—into hundreds of manageable, testable components, and in doing so, they removed the fear of failure from the process.

Mark: That's brilliant for an elite sports team with a huge budget. But how does this translate to a school, or a small business, or just... me, trying not to mess up my life?

Michelle: That's the perfect question. And Syed shares a wonderful story that answers it. It's about a headmistress named Heather Hanbury at a high-achieving school in London. She noticed her students were academically brilliant but also terrified of failure. They were so afraid of getting things wrong that they wouldn't take creative risks.

Mark: They were optimizing for grades, not for learning.

Michelle: Precisely. So she did something radical. She instituted an annual "Failure Week."

Mark: A what? A week to celebrate failure? That sounds... counterintuitive.

Michelle: It does, but it was genius. During that week, teachers would share their biggest professional blunders. They invited successful parents and entrepreneurs to come in and talk not about their successes, but about the failures that taught them the most. They wanted to destigmatize the very idea of getting things wrong.

Mark: What was the message they were trying to send?

Michelle: Heather Hanbury put it beautifully. She said, "You’re not born with fear of failure... Very young children have no fear of failure at all. They have great fun trying new things." Her goal was to teach the students how to "fail well"—which means taking a risk, and if it doesn't work, learning from it instead of pretending it never happened or blaming someone else.

Mark: That's so powerful. It's giving them the psychological tools to handle setbacks, which is probably more important than any single grade they'll ever get. It's building a personal open loop.

Michelle: It's building resilience. It's teaching them that failure isn't the opposite of success; it's a stepping stone on the path to success.

Synthesis & Takeaways


Mark: You know, as we talk through this, it seems like the whole book boils down to a simple but profound shift in perspective. It’s about redefining failure. It’s not an indictment of your intelligence or your character. It's just... a data point. An unexpected result.

Michelle: That's the perfect summary. The book's big idea is that progress—in science, in business, in our own lives—isn't driven by brilliant, flawless people. It's driven by systems and mindsets that are brilliant at learning from flaws.

Mark: Whether that system is a literal black box in a cockpit, a philosophy of marginal gains in a cycling team, or a "Failure Week" in a school, the goal is the same.

Michelle: The goal is the same: create a feedback loop. Break the cycle of cognitive dissonance and blame that keeps us trapped in a closed loop. It’s about having the humility to accept that we might be wrong, and the courage to find out how. The book even points to data showing how this mindset impacts entire economies. The US, with a culture that's more forgiving of entrepreneurial failure, has vastly higher rates of new business creation than countries like Japan, where failure carries a heavy social stigma.

Mark: So our collective attitude towards failure has real, measurable economic consequences. That's huge. Is there a practical step, a small habit we can build to start thinking this way?

Michelle: There is. Syed talks about a technique used in business called a "pre-mortem." It's the opposite of a post-mortem. Before you start a big project, you get your team together and ask them to imagine it's six months in the future, and the project has failed spectacularly.

Mark: Oh, I like that. So you're not criticizing a current plan; you're analyzing a hypothetical disaster.

Michelle: Exactly. Then, everyone on the team has to write down all the reasons why it failed. Because it's a hypothetical failure, the ego is removed. People are free to bring up potential problems without sounding negative or like they're criticizing their boss. It's a safe, systematic way to find the flaws in your plan before they find you.

Mark: That's a fantastic takeaway. It's like a controlled experiment for failure. I'm definitely going to try that. It makes me think about my own life, too. What's a "failure" you've learned the most from? We'd love to hear your stories. Find us on our socials and share.

Michelle: I love that question. It’s about turning our past into a resource, not a regret.

Mark: This has been incredibly insightful. It's a book that doesn't just give you ideas; it changes how you see the world and yourself.

Michelle: This is Aibrary, signing off.
