
Why the Best Teams Fail Most
Golden Hook & Introduction
Olivia: A recent study found that in hospitals, the best, most effective teams report the highest number of medical errors.

Jackson: Wait, hold on. That sounds completely backward. You're telling me the A-team, the top-tier unit, is the one making the most mistakes? That can't be right.

Olivia: It feels wrong, doesn't it? But it turns out, the reason why is the secret to succeeding at, well, pretty much everything. That paradox is at the heart of Right Kind of Wrong: The Science of Failing Well by Amy Edmondson.

Jackson: And Edmondson is the perfect person to write this. She's not just a Harvard professor; she's the researcher who pioneered the concept of 'psychological safety' and was recently ranked the number one management thinker in the world.

Olivia: Exactly. And her book, which won the Financial Times Business Book of the Year award, argues that our entire approach to failure is broken. She believes we need a completely new vocabulary for it, because we're lumping everything into one scary, negative bucket.

Jackson: A new vocabulary for failure. I'm intrigued. Where do we even start with that? It feels like trying to invent new words for feeling pain.

Olivia: We start by realizing that failure isn't one thing. It's a spectrum. And on one end of that spectrum, you have what Edmondson calls a 'basic failure.'
The Failure Spectrum: Why Not All 'Wrong' is Created Equal
Jackson: Okay, 'basic failure.' That sounds... well, basic. What does that look like in the real world?

Olivia: It looks like a single, preventable mistake in a well-understood process. Think of a chef forgetting a key ingredient in a recipe they've made a thousand times. But sometimes the stakes are higher. Edmondson tells the story of a Citibank employee who, through a series of confusing interface clicks, accidentally wired nine hundred million dollars to the wrong people.

Jackson: Nine hundred... million? My heart just stopped for that person. That's not a mistake, that's a career-ending, company-endangering catastrophe.

Olivia: It was a colossal error. But it was fundamentally a basic failure. There was a known process for wiring money, and a deviation from that process occurred. It was preventable. That's the key. These are the failures we should work hard to eliminate with checklists, better design, and more focus.

Jackson: Right, so that's the kind of wrong we should feel bad about. It's a slip-up in a zone where we should know better. What's the next type?

Olivia: The next type is a 'complex failure.' This is where things get messy. A complex failure isn't caused by one single mistake. It's the result of a perfect storm of multiple factors coming together in an unpredictable way. Think of the Swiss cheese model: each slice of cheese has holes, and a failure only happens when the holes in all the slices line up.

Jackson: I think I see. So a basic failure is one person messing up a recipe, but a complex one is the whole kitchen catching fire because the fire alarm battery was dead, the sprinkler was faulty, and a grease trap hadn't been cleaned, all at the same time.

Olivia: That's a perfect analogy. Edmondson points to disasters like the Boeing 737 MAX crashes. It wasn't just one engineering flaw. It was a combination of flawed software design, a lack of pilot training on the new system, regulatory pressures, and a corporate culture that may have discouraged engineers from speaking up. No single person was the sole cause; the system itself failed.

Jackson: That's a much scarier kind of failure, because it feels like you can't control it. It's a cascade.

Olivia: Exactly. And trying to find one person to blame in a complex failure is not only wrong, it's dangerous, because it stops you from fixing the underlying systemic issues. But this brings us to the third type, which is the most radical and exciting idea in the book.

Jackson: Let me guess, the 'intelligent failure'?

Olivia: You got it. An intelligent failure is what Edmondson calls the 'right kind of wrong.' These are the failures that are not only acceptable, but desirable.
Intelligent Failure: The Art of Failing on Purpose for Progress
Jackson: Desirable failure. That feels so counterintuitive. We're all taught from childhood to avoid failure at all costs. How can a failure possibly be good?

Olivia: It's good when it happens in new territory. An intelligent failure has four key attributes. First, it takes place in uncharted territory, where you can't know the outcome in advance. Second, it's a credible opportunity to advance toward a goal. Third, it's informed by all available knowledge, meaning you've done your homework. And fourth, it's as small as possible while still giving you valuable information.

Jackson: Okay, so it's not just failing randomly. It's a calculated experiment. A hypothesis.

Olivia: Precisely. Think of scientific research. Edmondson profiles a chemistry lab at Emory University where the lead scientist, Dr. Jennifer Heemstra, tells her students, "We're going to fail all day." She estimates that 95% of their experiments don't work as planned. But each 'failure' provides new information. It tells them one more way that doesn't work, which gets them closer to the way that does.

Jackson: So in that context, a failed experiment isn't a mistake, it's just... data.

Olivia: It's the highest-octane fuel for discovery! The most famous example is the story of the Post-it Note at 3M. In 1968, a scientist named Spencer Silver was trying to create a super-strong adhesive for building airplanes. And he failed. Miserably. Instead, he accidentally created the opposite: a ridiculously weak adhesive that was barely sticky.

Jackson: A total failure. He had one job: make strong glue. And he made weak glue.

Olivia: Measured against his original goal, yes, it was a failure. He could have thrown it out and been ashamed. But he was curious. He didn't know what it was for, but he knew it was strange. For years, he went around 3M showing his 'solution without a problem' to anyone who would listen. The culture at 3M allowed for that kind of curiosity.

Jackson: And let me guess, someone else saw a use for it?

Olivia: Years later! Another 3M scientist, Art Fry, was singing in his church choir and getting annoyed that the paper bookmarks in his hymnal kept falling out. He suddenly remembered his colleague's weird, weak adhesive. What if he could use it to make a bookmark that would stick lightly to the page without damaging it? He experimented, and the Post-it Note was born from that initial 'failure.'

Jackson: Wow. So an intelligent failure isn't just about the experiment itself; it's about having a system that can recognize the value in an unexpected outcome.

Olivia: That is the absolute key. And it brings up a really important critique that readers have raised about this book.

Jackson: I was wondering about this. This sounds great for 3M or Google's Moonshot Factory, where they literally have an award for "heroic failure." But is this 'right to fail' a privilege that only big, wealthy companies or tenured professors have? What about a small business owner or a freelancer, where one failure could be financially fatal?

Olivia: It's a fantastic and crucial point. Edmondson acknowledges that the freedom to fail is not distributed equally. And that's exactly why her other huge idea, the one that started her entire career, is so critical. You can't have intelligent failure, or even learn from basic and complex failures, without it.
The Human System: Why We Fail to Fail Well
Jackson: And that big idea is psychological safety, right? The concept she's famous for.

Olivia: It is. And the story of how she discovered it is the perfect illustration of everything we've been talking about. It goes right back to that hospital study I mentioned at the beginning.

Jackson: The one where the best teams were making the most mistakes. I still can't wrap my head around that.

Olivia: Well, neither could she! She was a young PhD student, and this was her big project. Her hypothesis was simple and obvious: better teamwork on a hospital ward would lead to fewer medication errors. She collected the data, ran the stats, and the result came back not just wrong, but the complete opposite of what she predicted. The correlation was strong and positive: better teams had higher error rates.

Jackson: I would have panicked. I'd think my entire study was garbage.

Olivia: She did! She wrote in her journal, "I was not just wrong. I was completely wrong." She thought she wasn't cut out for a PhD. But then, after the initial shock, she got curious. She paused and asked a different question: what if the data wasn't measuring the rate of making errors, but the rate of reporting them?

Jackson: Whoa. So the data wasn't measuring mistakes, it was measuring trust.

Olivia: Exactly! The 'better' teams weren't making more mistakes; they were the teams where a nurse felt safe enough to raise a hand and say, "Hey, I think I just gave the wrong dose," or "Can someone double-check this with me?" without fearing they'd be shamed, blamed, or fired. The teams with fewer reported errors weren't more perfect; they were more afraid. They were hiding their mistakes.

Jackson: That's a chilling thought in a hospital setting. And it completely reframes the problem. The goal isn't to have zero errors reported. The goal is to have 100% of errors reported so the system can learn.

Olivia: That is the essence of psychological safety: a shared belief that the team is safe for interpersonal risk-taking. It's the foundation for learning from all three types of failure. Without it, basic failures get hidden until they become catastrophes. Complex failures never get investigated, because everyone is too busy pointing fingers. And intelligent failures? They never even get started, because who would risk trying something new if they'll be punished when it doesn't work out?

Jackson: It connects everything. You can't have the 'right kind of wrong' if you don't have the right kind of culture. And it explains why organizations like the one that built the 737 MAX can have so many smart people and still head toward disaster: the warnings get silenced by fear.

Olivia: Precisely. The system was designed for silence, not for learning.
Synthesis & Takeaways
Jackson: So when you boil it all down, the problem isn't failure itself. The real enemy is our fear of failure.

Olivia: That's the core of it. And Edmondson's work shows this isn't a soft, 'feel-good' idea. It's a hard, operational necessity. From preventing plane crashes to creating the next billion-dollar product, the common denominator is creating systems where it's safe to be human and fallible.

Jackson: It's about building a system that assumes we will fail, and then optimizes for learning and recovery, rather than assuming we should be perfect and punishing us when we're not.

Olivia: Perfectly put. It's a fundamental shift from a performance mindset, 'I must look good and be right,' to a learning mindset, 'I must grow and get better.'

Jackson: So the one thing our listeners could do this week is reframe one small 'failure.' Instead of just feeling bad, ask: was this a basic, complex, or intelligent failure? What can I actually learn from it?

Olivia: I love that. A tiny shift in perspective. And we'd love to hear how it goes. Share your stories of 'failing well' with the Aibrary community on our social channels. It's a conversation worth having.

Jackson: This is Aibrary, signing off.