
Think Like The Enemy
12 min
How to Succeed by Thinking Like the Enemy
Golden Hook & Introduction
Joe: Alright Lewis, I'm going to say a phrase, and you tell me the first thing that comes to mind. Ready? "Red Team."
Lewis: Okay... sounds like the bad guys in a laser tag game, or maybe the B-squad in gym class. Definitely not the winners.
Joe: Perfect. That's exactly what most people think. And it's why today's book is so counter-intuitive. It argues the 'Red Team' is the most important one in the room.
Lewis: The losing team is the most important? I'm intrigued.
Joe: Today we're diving into Red Team: How to Succeed by Thinking Like the Enemy by Micah Zenko.
Lewis: And this isn't some armchair theorist. Zenko is a senior fellow at the Council on Foreign Relations. He's talked to the actual people inside the CIA, the military, and Fortune 500 companies who do this for a living. He's seen how the sausage gets made, or in some cases, how it spectacularly fails to get made.
Joe: Exactly. And the core idea of red teaming, of thinking like the enemy, isn't some new-age business fad. It's an ancient practice. Have you ever heard of the "Devil's Advocate"?
Lewis: Of course. It's that person in a meeting who shoots down every idea just for the sake of it. The professional pessimist.
Joe: That's the modern, watered-down version. But the original role was far more serious, and it came from the Vatican.
The Devil's Advocate Principle: Why You Can't Grade Your Own Homework
Lewis: The Vatican? What does the Pope have to do with this?
Joe: Everything. In the early days of the Catholic Church, making someone a saint was a messy, decentralized process. A local community would just decide, "Hey, Bob was a great guy, let's make him a saint!" There was no quality control.
Lewis: So you could have Saint Bob of the Local Pub. I like it.
Joe: Pretty much. The Church realized this was a problem for its central authority. So, in the 13th century, they formalized the process. They created a court to review candidates for sainthood. And to ensure rigor, they created a formal, paid position: the Advocatus Diaboli. The Devil's Advocate.
Lewis: A paid position to argue against making someone a saint? That's a wild job description.
Joe: His entire job was to be the opposition. To take all the evidence of a candidate's miracles and virtues and systematically tear it apart. He was the designated skeptic, the official red team. The goal was to ensure that only the most deserving candidates made it through.
Lewis: Wow, so the Church basically invented corporate oversight. But what happened to it?
Joe: It worked for centuries. But in 1983, Pope John Paul II eliminated the office to streamline the process. And the result? Critics called it a "saint factory." The number of canonizations skyrocketed, and some argued the process was devalued. An "inflation produced a devaluation," as one historian put it.
Lewis: They removed the one person whose job was to say "Are we sure about this?" and things went off the rails. That sounds... familiar.
Joe: It's the perfect illustration of the book's central theme: you cannot grade your own homework. We are all, as individuals and as institutions, terrible at seeing our own flaws. We rationalize, we get confirmation bias, we fall into groupthink.
Lewis: Okay, I get the concept. But give me a modern, real-world example. Something outside of saints and popes.
Joe: How about General Motors? For a decade, they knew the ignition switch in the Chevy Cobalt was faulty. It could shut the engine off while driving, disabling the airbags, power steering, everything.
Lewis: That's terrifying.
Joe: It was. And internally, they knew. But instead of fixing it, they started a linguistic campaign. They trained employees not to use words like "defect" or "safety issue." Instead, a defect became "does not perform to design." A safety problem became "has potential safety implications."
Lewis: They were red teaming their own language to avoid liability. They were thinking like the enemy, but the enemy was their own legal department.
Joe: Precisely. They were grading their own homework and giving themselves an A, right up until the point where 119 people were dead and the company was facing hundreds of millions in compensation. They failed to have an independent voice, a real red team, to say, "Stop talking about the wording and fix the damn car."
Lewis: But that sounds like pure corporate evil, not just a blind spot. Is it really about not seeing the problem, or just not wanting to see it because it'll cost money?
Joe: It's both. And that's the point. The biases, whether they're cognitive or financial, are always there. A red team's job is to be the external force that makes it impossible to ignore the truth, no matter how inconvenient or expensive that truth is. It's the institutional version of that one brutally honest friend who tells you your new haircut is terrible. You might not like hearing it, but you need them.
In the Arena: When Red Teams Win and When They're Sabotaged
Lewis: Okay, so I see the why. But this sounds like a recipe for getting fired. Does it ever actually work in really high-stakes environments, like the military? I mean, who's going to tell a four-star general their plan is bad?
Joe: It's funny you ask, because the military provides us with both the most spectacular failure and the most stunning success of red teaming. Let's start with the failure, a war game called Millennium Challenge 2002.
Lewis: Sounds like a video game.
Joe: It cost $250 million, so a bit more than your average PlayStation title. The goal was to test the US military's new high-tech, network-centric way of fighting. They had the "Blue Team," representing the US, and the "Red Team," playing the role of a rogue Middle Eastern state.
Lewis: And they hired someone to lead this Red Team?
Joe: They hired a retired Marine Lieutenant General named Paul Van Riper. A legendary, old-school, unconventional thinker. They told him, "Your job is to think like the enemy. This is free play. You have the ability to win."
Lewis: So what did he do?
Joe: He did exactly what they asked. He knew the Blue Team would be listening to all his electronic communications, so he didn't use any. He used motorcycle messengers to deliver orders, just like in World War I. He used a fleet of small civilian boats and planes to track the US naval fleet. And on the second day of the exercise, he launched a massive, coordinated surprise attack.
Lewis: And how did that go?
Joe: He sank most of the US fleet. In the simulation, sixteen warships were at the bottom of the virtual Persian Gulf. The equivalent of 20,000 US service members were dead. The whole thing was over in about ten minutes. It was, as the book says, a "significant butt-kicking."
Lewis: Hold on. They hired him to think like the enemy, he did, he won, and they basically hit the reset button and told him he was cheating? That's insane!
Joe: It's the perfect example of institutional ego. The war game wasn't designed to find flaws; it was designed to prove the new concept worked. Van Riper's success was an inconvenient truth. He quit in protest, and the exercise was "scripted" to a US victory. It was a sham.
Lewis: That's incredibly depressing. It feels like the whole idea is doomed to fail if the people in charge don't actually want to hear the bad news.
Joe: It can be. But now let's look at when it goes right. Fast forward to 2011. The hunt for Osama bin Laden. The CIA finds a compound in Abbottabad, Pakistan, and they think a "high-value target" is inside. But the evidence is all circumstantial. There's no photo, no voice recording. Nothing definitive.
Lewis: So it's a huge gamble.
Joe: A massive one. The CIA director, Leon Panetta, told President Obama the confidence level was maybe 60-40. Some analysts put it as low as 40%. A coin flip. So what did they do? They red-teamed it. And red-teamed it again.
Lewis: How? Did they bring in someone like Van Riper to argue against it?
Joe: Even better. They created multiple, independent red teams with opposing goals. One team of analysts was tasked with building the strongest possible case that bin Laden was in the compound. A completely separate team was tasked with building the strongest case that he wasn't there—that it was a decoy, or a different terrorist, or just a rich, eccentric recluse.
Lewis: Ah, so instead of one red team versus the 'main' team, they had red teams fighting each other. That removes the ego. It's not about one person being right or wrong; it's about stress-testing the argument from every possible angle to get to the best answer.
Joe: Exactly. They were forcing themselves to confront every doubt, every alternative explanation. It didn't make the decision easy, but it gave President Obama the highest possible confidence in the intelligence he had. It allowed him to understand the full spectrum of risk and possibility. And that's red teaming at its absolute best.
The Rules of Engagement: How to Red Team Without Starting a Civil War
Lewis: That makes so much sense. The difference between the war game and the bin Laden raid is night and day. So what's the secret? How do you create a system that works like the bin Laden raid and not the Millennium Challenge disaster?
Joe: That's the key question, and it's what separates a successful red team from a disaster. Zenko lays out some best practices, almost like commandments for this stuff. And the first and most important one is: "The Boss Must Buy In."
Lewis: The leadership has to actually want to hear the truth.
Joe: They have to be willing to hear the bad news and act on it. And the most tragic example of what happens when they don't is the story of the pre-9/11 FAA red team.
Lewis: Oh no. This sounds bad.
Joe: It's heartbreaking. For years before the 9/11 attacks, the FAA had a small, dedicated red team. Their job was to test airport security. They would try to smuggle fake bombs, knives, and guns past security checkpoints.
Lewis: And I'm guessing they were successful?
Joe: Horrifyingly so. They had a success rate of over 90% at major US airports. They documented everything, wrote detailed reports, and sent them up the chain of command. They were screaming that the system was broken. One of the red team leaders, Bogdan Dzakovic, later testified that the system wasn't a failure, it was "designed for failure."
Lewis: And what did the FAA leadership do?
Joe: They ignored it. They buried the reports. They told the red team to tone it down. In some cases, they even tipped off airports that a test was coming, so the airport could pass. The bosses didn't buy in. They didn't want to hear it. And we all know how that story ends.
Lewis: That's just devastating. They were literally screaming that the house was on fire, and no one listened. It makes you wonder about the people on these teams. Who are these people? Zenko calls them 'fearless skeptics with finesse.' They sound like a nightmare to manage.
Joe: They can be! He says red teamers are often "misfit toys." They're loners, mavericks, sometimes a bit arrogant. They're the people who are wired to question authority and see the world differently. That's their superpower.
Lewis: So you need to find people who are willing to be unpopular.
Joe: Exactly. There's a great quote in the book from a global health expert who compares the best red teamers to the ronin samurai in feudal Japan. Because they had no master, they were, in his words, "free to tell the shogun that he was an idiot."
Lewis: I love that. You need someone who is loyal to the truth, not to the hierarchy. Someone who is willing to be the skunk at the garden party because they know it's the only way to keep everyone safe.
Synthesis & Takeaways
Joe: That's the perfect way to put it. Loyalty to the truth.
Lewis: So, after all this, what's the one thing we should take away? Is this just a lesson for huge organizations like the CIA and GM?
Joe: No, the deep insight here is that red teaming isn't just a process, it's a form of intellectual humility. It's the discipline of asking, "How could I be wrong?" whether you're a CEO, a general, or just planning a family vacation. The most dangerous blind spot is the one you don't know you have.
Lewis: It's about actively seeking out the flaws in your own thinking, rather than just waiting for reality to point them out for you, usually in the most painful way possible.
Joe: Exactly. It's about having the courage to invite the Devil's Advocate to the table, to listen to what they have to say, and to be willing to change your mind.
Lewis: It makes you wonder, what's the one assumption in your own life or work that you've never really challenged?
Joe: That's a great question. And a slightly terrifying one.
Lewis: It is! We'd love to hear your thoughts. Find us on our socials and let us know. What's your biggest unchallenged assumption? It could be big or small.
Joe: This is Aibrary, signing off.