
Red Team
How to Succeed By Thinking Like the Enemy
Introduction
Narrator: In 1587, the Roman Catholic Church formalized a role that had existed for centuries: the Advocatus Diaboli, or the Devil's Advocate. For nearly 400 years, whenever a candidate was proposed for sainthood, it was this person's job to argue against them. They were a designated dissenter, tasked with rigorously challenging the evidence, questioning the miracles, and finding every possible flaw in the case. Their purpose was not to be obstructive, but to ensure the integrity of the process. Then, in 1983, Pope John Paul II eliminated the position to streamline canonizations. The result? A critic noted that the subsequent "inflation produced a devaluation," as the church became a veritable "saint factory." By removing the institutionalized skeptic, the system lost its most vital check against bias. This historical anecdote reveals a timeless organizational vulnerability: the danger of unchecked consensus. In his book, Red Team: How to Succeed By Thinking Like the Enemy, author Micah Zenko argues that this same vulnerability plagues modern institutions, from corporations to intelligence agencies. He reveals that the solution lies in a modern-day Devil's Advocate, a structured process known as red teaming.
You Cannot Grade Your Own Homework
Key Insight 1
Narrator: The central premise of Zenko's work is that organizations are inherently incapable of objectively evaluating their own performance and strategies. This isn't due to a lack of intelligence, but to a web of cognitive and organizational biases that create powerful blind spots. The most dangerous of these is the simple fact that, as Zenko puts it, "you cannot grade your own homework."
A chilling example of this principle is the story of General Motors' faulty ignition switch. For a decade, GM knew that the switch in its Chevrolet Cobalt could shut off the engine while driving, disabling power steering, brakes, and airbags. Yet, no recall was issued. This wasn't a simple oversight; it was a systemic failure rooted in a culture that discouraged bad news. An internal investigation later revealed that employees were formally trained to soften their language in reports, replacing words like "defect" with "does not perform to design" and "safety" with "has potential safety implications." The organization was so focused on avoiding legal liability that it rationalized away a problem that ultimately led to at least 119 deaths, cost the company hundreds of millions of dollars, and resulted in the firing of fifteen senior managers.
This same dynamic played out within the CIA's post-9/11 detention and interrogation program. The very personnel and contractors running the program—those with a vested interest in its continuation—were the ones tasked with assessing its effectiveness. Unsurprisingly, their internal reviews consistently concluded the program was highly effective and necessary. Requests from National Security Advisor Condoleezza Rice for an independent "red team" analysis were ignored by senior CIA officials. The organization was grading its own homework, and in doing so, it failed to critically assess the ethical and practical failings of its own methods.
The Three Pillars of Red Teaming
Key Insight 2
Narrator: Red teaming is not a single activity but a collection of methods designed to challenge assumptions and identify vulnerabilities. Zenko outlines three core techniques that form the pillars of this practice: simulations, vulnerability probes, and alternative analyses.
Simulations allow an organization to test its plans against a thinking adversary. A powerful example is the planning for the 2011 US Navy SEAL raid on Osama bin Laden's compound. The SEALs didn't just practice the mission; they ran countless realistic simulations to test every conceivable "what if?" scenario. When one of their helicopters crash-landed inside the compound on the night of the raid—a potential mission-ending catastrophe—the team didn't panic. They had already planned and trained for that exact contingency, allowing the mission to proceed successfully.
Vulnerability probes test an institution's defenses by actively trying to break them. The Government Accountability Office (GAO) has conducted numerous such probes. In one series of tests, undercover investigators successfully smuggled radioactive material across US borders. In another, they smuggled bomb components into ten out of ten targeted federal buildings. These probes aren't meant to embarrass, but to reveal weaknesses in a controlled manner before a real adversary can exploit them.
Alternative analyses challenge the dominant narrative by introducing different perspectives. After 9/11, the CIA's Red Cell was created specifically for this purpose. In 2010, while conventional wisdom focused on terrorism as an external threat, the Red Cell produced a memo with a startling title: "What if Foreigners See the United States as an ‘Exporter of Terrorism’?" This counterintuitive analysis forced policymakers to consider how US actions might be perceived abroad, a perspective that mainline analysis, mired in its own assumptions, was unlikely to produce.
The Six Commandments of Effective Red Teaming
Key Insight 3
Narrator: For red teaming to be effective, it must be more than a token exercise. Zenko identifies six best practices, or "commandments," that are crucial for success. The first and most important is that the boss must buy in. Without top-level support, red teams will be under-resourced, marginalized, or ignored. The tragic story of the pre-9/11 FAA red team serves as a stark warning. For years, this team conducted vulnerability probes and consistently found that they could smuggle fake bombs and weapons past airport security with alarming ease. Their reports, however, were systematically ignored by FAA leadership. The system, as one red teamer later testified, was "designed for failure," a failure that culminated in the attacks of September 11, 2001.
Second, a red team must be outside and objective, while inside and aware. It needs enough independence to avoid "going native" and adopting the organization's biases, but enough institutional knowledge to understand the culture and access information. Global health expert Gregory Pirio compared the best red teamers to the masterless ronin samurai of feudal Japan, who were "free to tell the shogun that he was an idiot."
Third, the team must be staffed with fearless skeptics with finesse. Red teamers are often mavericks and contrarians—one instructor described his unit as "the land of misfit toys." This is their greatest strength, but they must also have the communication skills to deliver their critical findings without being dismissed.
The final three commandments are intertwined: have a big bag of tricks (using diverse methods), be willing to hear bad news and act on it, and red team just enough, but no more. The willingness to act is paramount. A white-hat hacker told Zenko about a Fortune 100 technology firm where his team had identified the same critical network vulnerability for over a decade. Despite being given a clear plan for how to fix it, the firm simply never did. For that company, red teaming was a pointless exercise in generating reports that gathered dust.
The Perils of Prostituted Red Teaming
Key Insight 4
Narrator: Perhaps the greatest danger to red teaming is not that it will be ignored, but that it will be corrupted. An organization can go through the motions of a red team exercise not to challenge its assumptions, but to validate them. The most infamous example of this is the Millennium Challenge 2002 (MC '02), a massive, $250 million war game designed to test the US military's new "transformation" concepts.
The exercise pitted a high-tech US "Blue" force against a "Red" force, representing a rogue Middle Eastern state, led by retired Marine Lieutenant General Paul Van Riper. Van Riper, a brilliant and unconventional thinker, refused to play by the script. He used low-tech methods like motorcycle couriers to avoid electronic surveillance and, in a stunning opening move, launched a massive, preemptive strike that overwhelmed the Blue fleet, sinking 16 warships. In about ten minutes, the Red team had won.
But the exercise controllers couldn't accept this outcome. It didn't validate the concepts they wanted to prove. So, they simply pressed a reset button. The "sunken" ships were re-floated, and Van Riper's forces were hobbled with new restrictions, such as being forbidden from shooting down Blue aircraft. The rest of the exercise became a scripted march to a predetermined Blue victory. Van Riper quit in disgust, later stating that the whole thing was "prostituted; it was a sham intended to prove what they wanted to prove." MC '02 became a case study in how not to red team, demonstrating that a red team without freedom of action is merely a tool for reinforcing an organization's most dangerous blind spots.
Conclusion
Narrator: The single most important takeaway from Red Team is that challenging assumptions is not a natural act for large organizations; it is a discipline that must be cultivated with courage and structure. The default state for any institution is consensus, comfort, and a quiet confidence in its own methods. Red teaming is the necessary antidote to this complacency. It is a formal recognition that the smartest people in the room can still be wrong, especially when they all agree.
Ultimately, the book's challenge extends beyond a set of techniques. It asks a fundamental question of leaders everywhere: Are you willing to hear bad news? The real test of an organization's strength isn't its ability to create a plan, but its willingness to have that plan torn apart by a trusted critic. The future doesn't belong to the organizations that have all the right answers, but to those that have the courage to relentlessly question them.