
The 9-Megaton Wrench

12 min

Golden Hook & Introduction


Michael: The United States military has officially lost six nuclear bombs. They are still missing to this day. That's not the plot of a summer blockbuster; it's a historical fact. And it sets the stage for the story we're telling today—a story about a bomb they didn't lose, but very nearly detonated by complete accident.

Kevin: Hold on, for real? Lost-lost? As in, they're just sitting somewhere at the bottom of an ocean or buried in a swamp and we have no idea where?

Michael: That's exactly right. They're called "Broken Arrow" incidents. And while those stories are chilling, the book we're diving into today focuses on something almost more frightening: a near-disaster that happened right on American soil, in plain sight. We're talking about Command and Control by Eric Schlosser.

Kevin: Oh, I know that name. Isn't he the guy who wrote Fast Food Nation? That book that ruined french fries for me forever?

Michael: The very same. He's an incredible investigative journalist, and what he did for the fast-food industry, he does here for the nuclear arsenal. He has this unique talent for digging into these massive, hidden systems that run our world and showing us the terrifying cracks in the foundation. The book was even a finalist for the Pulitzer Prize, which tells you the level of detail we're dealing with.

Kevin: Wow. So he went from Big Macs to nuclear bombs. That's quite a career jump.

Michael: It is, but the core skill is the same: uncovering hidden truths. And the central question Schlosser poses is bone-chillingly simple: how do you deploy weapons of mass destruction without being destroyed by them?

Kevin: That sounds like a riddle with no answer. So where does a story like this even begin?

Michael: It begins in the most mundane way possible. In a concrete hole in the ground, in rural Arkansas, with a young airman and a socket wrench.

The Illusion of Control: The Damascus Accident


Kevin: A socket wrench. You're telling me a regular tool from a standard-issue toolbox is the villain of this story?

Michael: It's the catalyst. Picture this: it's September 1980. Deep in the countryside near a tiny town called Damascus, Arkansas, there's a Titan II missile silo. This isn't just any missile; the Titan II is a monster. It's an Intercontinental Ballistic Missile, over 100 feet tall, carrying the most powerful nuclear warhead in the US arsenal at the time—the W53.

Kevin: Okay, put that in perspective for me. How powerful is 'most powerful'?

Michael: The W53 warhead had a yield of nine megatons. To give you a comparison, the bomb dropped on Hiroshima was about 15 kilotons. This was 600 times more powerful. It was designed not just to destroy a city, but to wipe a hardened, underground Soviet command bunker off the face of the Earth. It was a city-killer, an apocalypse in a can.

Kevin: Whoa. And this thing is just sitting in a field in Arkansas?

Michael: Sitting in a heavily fortified underground silo, yes. And on this particular day, a couple of young Air Force technicians are doing routine maintenance. One of them, a senior airman named David Powell, is on a platform high up inside the silo, right next to the missile. He's using a heavy socket wrench to tighten a pressure cap.

Kevin: I think I see where this is going, and I don't like it.

Michael: He's working in a tight space. The socket, which weighs about eight pounds, slips from the wrench. It falls. It tumbles down the side of the missile, a drop of about 70 feet. Then the crew hears a loud thud as it hits the base of the missile. And then, a hissing sound.

Kevin: Oh man. What did it hit?

Michael: It punctured the missile's first-stage fuel tank. Now, this is the critical part. The Titan II used liquid propellants that were hypergolic. This means the moment the fuel and the oxidizer come into contact with each other, they ignite instantly. No spark needed. They are also incredibly toxic and corrosive. So now you have thousands of pounds of volatile rocket fuel pouring into the bottom of the silo.

Kevin: This is the 'human fallibility' thing in action, isn't it? It's not some grand, complex sabotage plot. It's just… a guy dropped a tool. It's gravity.

Michael: Exactly. It's the most human mistake imaginable. But it collides with what Schlosser calls 'technological complexity.' This system is so powerful and so volatile that there's no room for simple human error. The technicians immediately grasp the gravity of the situation. They suit up to go back in and check the damage, but the vapor levels are off the charts. The entire silo is now a bomb waiting for a spark.

Kevin: But surely there were safety systems, right? Alarms, automatic vents, fail-safes?

Michael: There were. But the situation was escalating in a way that defied the manuals. They tried to vent the silo, but it wasn't working fast enough. The pressure was building. For eight agonizing hours, the command teams on the surface tried to figure out what to do. They evacuated the surrounding area, but they were in uncharted territory. No one had ever written a procedure for "What to do when a socket wrench punctures your ICBM."

Kevin: So they're just improvising, while sitting on top of a nine-megaton warhead? That's horrifying.

Michael: It is the absolute definition of the illusion of control. We build these systems with immense confidence, with binders full of procedures. We believe we have mastered the technology. But then an eight-pound piece of metal falls, and all those binders become useless. The situation becomes, as Schlosser puts it, an "unlikely event that becomes unavoidable."

Kevin: So what happened after those eight hours?

Michael: The inevitable. The volatile fuel vapors finally ignited. A massive explosion ripped through the silo. The force was so immense it blew the 740-ton concrete and steel launch door—a structure designed to withstand a nearby nuclear blast—right off its hinges and tossed it hundreds of feet into the air.

Kevin: And the warhead?

Michael: The explosion ejected the entire second stage of the missile, with the W53 nuclear warhead still attached, out of the silo. It flew up into the air and landed in a ditch about 100 yards away. One of the airmen sent to check on the situation was killed, and 21 others were injured. The silo was completely destroyed.

Kevin: That is absolutely insane. But the warhead… it didn't detonate?

Michael: It didn't. And that's the one sliver of good news in this whole nightmare. The safety mechanisms on the warhead itself, the things designed to prevent it from exploding in a fire or an impact, held. A few switches worked as designed. The United States, and specifically the state of Arkansas, avoided a nuclear catastrophe not because of brilliant planning or perfect control, but because of a few last-ditch safety features and an enormous amount of dumb luck.

Kevin: Wow. That story is terrifying. It really drives home the point about human error. But it also makes me wonder, why? Why were we using something so ridiculously dangerous in the first place? What was the logic behind having a liquid-fueled missile that could be punctured by a common tool?

Michael: That's the perfect question. And it leads us directly to the second core idea of the book: the impossible, unwinnable paradox at the heart of all nuclear strategy.

The 'Always/Never' Paradox


Michael: The reason the Titan II was so dangerous is the same reason the military loved it. Schlosser calls this the 'Always/Never' paradox. A nuclear weapon has to be designed to always work, instantly, on command. But it also has to be designed to never detonate accidentally or without authorization.

Kevin: And those two goals are in direct conflict with each other.

Michael: Completely. To be ready 'always,' the Titan II needed to be able to launch within minutes of an order. This was the height of the Cold War, and the fear was a surprise Soviet first strike. We needed to be able to launch our missiles before they were destroyed in their silos. That required using that hypergolic liquid fuel, which was incredibly unstable but kept the missile in a constant state of readiness.

Kevin: It's like keeping your car engine running at redline, 24/7, just in case you need to peel out of the driveway at a moment's notice.

Michael: That's a perfect analogy. It's incredibly effective if you need to go, but it puts an immense, constant strain on the system and dramatically increases the chance of a catastrophic failure. A solid-fuel missile, which is what we primarily use now, is much more stable and safe. It's like a cold engine. It takes a bit longer to get going, but it's not going to explode while it's sitting in the garage.

Kevin: So, we were basically forced to choose between being ready for war and being safe from ourselves. And the choice was to lean into readiness, and just cross our fingers about the safety part.

Michael: That was the gamble of the Cold War. The 'always' part of the paradox often won out over the 'never' part. The military needed absolute certainty that if the President gave the order, the missile would fly. The people who designed the safety systems, on the other hand, were trying to build in layers of protection to prevent an accident. Schlosser's book is filled with stories of the tension between these two groups—the operators and the safety engineers.

Kevin: It sounds like a philosophical tug-of-war with the fate of the world as the rope.

Michael: It was. And Schlosser, with his journalistic background, is brilliant at showing this isn't just an abstract technical problem. He tells the stories of the ordinary servicemen, the bomber pilots, the missile commanders, the maintenance crews like David Powell. These were often very young men, in their late teens and early twenties, who were given the responsibility of managing these apocalyptic weapons on a daily basis. They were the ones living on the front line of the 'Always/Never' paradox.

Kevin: That adds such a human element. It's not just generals in a war room, but kids in a concrete hole trying not to drop a wrench.

Michael: And that's why the book is so powerful and was so widely acclaimed. It takes this high-level strategic dilemma and shows you what it felt like on the ground. It reveals how secrecy and bureaucracy often made things worse. Warnings from engineers about design flaws were sometimes ignored because admitting a flaw would mean taking a powerful weapon system offline.

Kevin: So the illusion of control wasn't just a psychological bias, it was an institutional necessity. You had to project confidence, even if you knew the whole thing was held together with duct tape and hope.

Michael: Precisely. And the Damascus incident was the ultimate proof. It was a moment where the 'never' part of the equation almost failed catastrophically. The system wasn't perfectly safe. It was vulnerable to the simplest, most predictable kind of failure: human error.

Synthesis & Takeaways


Kevin: So when you put it all together, the dropped wrench and this 'always/never' idea, what's the big takeaway? Is the message that nuclear weapons are just a terrible idea?

Michael: I think Schlosser's conclusion is a bit more nuanced and, in a way, more alarming. The Damascus story isn't just a historical anecdote; it's a parable for our relationship with complex, high-risk technology. The final, terrifying insight is that we were saved by luck. The warhead was violently thrown from the silo, subjected to a massive explosion and fire, and it didn't detonate because a few simple, analog safety switches worked.

Kevin: We got lucky. One of those switches fails, and the history of the 20th century looks very different.

Michael: Exactly. And Schlosser's ultimate point, which is just as relevant today as it was in 1980, is that any system that relies on luck to prevent the apocalypse is a fundamentally broken system. We still have thousands of nuclear weapons. The technology has improved, the command systems are more sophisticated, but the two core risk factors remain: human fallibility and technological complexity. There will always be another dropped wrench.

Kevin: That's a chilling thought. It makes you wonder how many other 'dropped wrenches' we've never heard about. How many other near-misses were quietly covered up? How many times have we just gotten lucky?

Michael: It's a deeply sobering question. Schlosser's work is a powerful reminder that the danger hasn't passed just because the Cold War ended. The weapons are still here. The paradox is still here. And the illusion of control is as tempting as ever.

Kevin: It really makes you think. This story is going to stick with me for a while.

Michael: It should. It's a story we all need to hear. We'd love to know what our listeners think. Does hearing about the Damascus accident change how you view these systems? Find us on our social channels and join the conversation.

Kevin: This is Aibrary, signing off.
