
Bombing Agrabah
The Campaign Against Established Knowledge and Why It Matters
Golden Hook & Introduction
Olivia: A recent poll asked Americans if they would support bombing Agrabah. Nearly a third of Republicans and a fifth of Democrats said yes. The only problem? Agrabah is the fictional kingdom from Disney's Aladdin.

Jackson: You are kidding me. Please tell me you're kidding. So a significant chunk of the population is ready to declare war on a cartoon? That's… both hilarious and deeply, deeply unsettling.

Olivia: It's the perfect, if terrifying, entry point into what we're talking about today. It's this phenomenon of having a strong opinion without any underlying knowledge. And it's the central crisis explored in the book The Death of Expertise: The Campaign Against Established Knowledge and Why It Matters by Tom Nichols.

Jackson: The Death of Expertise. That title alone feels so relevant it hurts.

Olivia: It really is. And Nichols is a fascinating guy to write this. He's not just some academic in an ivory tower; he's a long-time professor of national security, has worked on Capitol Hill, and—get this—he's a five-time undefeated Jeopardy! champion.

Jackson: Wow, okay. So when an actual, certified genius and expert on knowledge talks about the death of expertise, you should probably listen. So where does this all start? Why does it feel like everyone is so confidently wrong about everything now?

Olivia: Well, Nichols argues it's not just simple ignorance. It's something more active, more aggressive. It's a fundamental rejection of the very idea that one person's knowledge can be more valid than another's. And it's why our conversations, online and in real life, have become so completely exhausting.
The Diagnosis: Why Conversation Became Exhausting
Jackson: Okay, that hits home. That feeling when you're in an argument, you present facts, data, evidence… and the other person just says, "Well, that's just your opinion." It's maddening.

Olivia: Exactly. Nichols would say that feeling is a symptom of a much larger societal sickness. He points to a couple of key psychological drivers. The first is a cognitive bias called the Dunning-Kruger effect.

Jackson: I've heard of this! Isn't that the thing where the people who are the worst at something think they're the best?

Olivia: Precisely. It's a paradox of incompetence. The skills you need to be good at something are the very same skills you need to recognize how bad you are at it. So people with very low ability in a subject don't just perform poorly; they lack the self-awareness to even realize it. They're blissfully, and often loudly, unaware of their own ignorance.

Jackson: That is terrifying. So the Dunning-Kruger effect is like a superpower for spreading bad ideas? The less you know, the louder and more confidently you shout about it.

Olivia: It's a huge part of it. And it's supercharged by its evil twin: confirmation bias. That's our natural human tendency to seek out and favor information that confirms what we already believe, and to ignore anything that challenges it. We're not looking for truth; we're looking for backup.

Jackson: Right, we're not detectives; we're lawyers building a case for a conclusion we've already reached.

Olivia: Perfect analogy. And when you combine the Dunning-Kruger effect with confirmation bias, you get a dangerous cocktail. You have people who don't know what they're talking about, who are incapable of realizing they don't know, and who are actively searching for any scrap of information, no matter how flimsy, to prove themselves right.

Jackson: And this is where something like the anti-vaccination movement comes from, isn't it?

Olivia: It's the textbook case study Nichols uses. It's a tragic story of how this works in the real world. It all started in 1998 with a study by a British researcher, Andrew Wakefield, published in a prestigious medical journal, The Lancet. It linked the MMR vaccine to autism.

Jackson: I remember the fallout from that. It was everywhere.

Olivia: It was. The problem was, the study was a complete fraud. It had a tiny sample size of just 12 children, the research was manipulated, and it was later revealed that Wakefield had been paid by a lawyer who was planning to sue vaccine manufacturers. But the damage was done. The idea was out there.

Jackson: And it was a simple, scary story that parents could latch onto.

Olivia: Exactly. It gave a face to a fear. And the internet, which was just becoming a household tool, acted as a massive accelerant. Parents, terrified for their children, found each other in online forums. Celebrities like Jenny McCarthy became powerful advocates, using their platforms to spread the message. They had emotional, personal stories.

Jackson: And the actual experts—the scientists, the doctors—what were they doing?

Olivia: They were trying to fight back with data, with large-scale studies showing no link whatsoever. But a spreadsheet is never as compelling as a mother's tearful story. The scientific community did its job—they investigated, they found the fraud, and The Lancet fully retracted Wakefield's paper in 2010. He lost his medical license.

Jackson: But it was too late. The idea had already gone viral.

Olivia: Completely. The belief persists to this day, and the consequences are real. We've seen outbreaks of measles, a disease that was nearly eradicated, in communities with low vaccination rates. It's a direct outcome of rejecting overwhelming expert consensus in favor of a comforting, but dangerously wrong, narrative. It's the Dunning-Kruger effect and confirmation bias playing out on a national health scale.

Jackson: It's that quote from the book, isn't it? "My ignorance is just as good as your knowledge." Except in this case, that ignorance can be lethal.

Olivia: That's the core of it. And what's truly paradoxical, and what Nichols dives into next, is that the very institutions we built to combat ignorance—our universities and our information systems—are now, in many ways, pouring fuel on the fire.
The Accelerants: How Our Institutions Are Pouring Fuel on the Fire
Jackson: Wait, that feels completely backward. How can more education and more information possibly make us dumber?

Olivia: It's a fantastic question, and it's one of the most counterintuitive parts of the book. Let's start with higher education. Nichols argues that for many colleges, the mission has shifted from education to customer service.

Jackson: The student is the customer, and the customer is always right.

Olivia: You got it. With the intense competition for tuition dollars, many universities are more focused on keeping students happy than on challenging them. They build luxury dorms and lazy rivers, and they hand out inflated grades. The focus becomes validating the student's feelings and ensuring they have a pleasant "experience," rather than forcing them through the difficult, sometimes ego-bruising, process of actual learning.

Jackson: So you're paying to be told you're right, not to actually learn? That's wild. It's like an expensive confirmation-bias machine.

Olivia: That's a great way to put it. It creates a sense of intellectual entitlement. Nichols tells this incredible, and frankly infuriating, story about a student on Twitter. She was a pre-med student, and she tweeted asking for help on an assignment about the chemical weapon Sarin, but she called it "Sarin gas."

Jackson: Okay, a minor mistake.

Olivia: Well, an actual chemical weapons expert, a guy named Dan Kaszeta, gently replied to her, offering to help and pointing out that Sarin is a liquid, not a gas, and that it's a proper noun, so it should be capitalized. A simple, expert correction.

Jackson: And how did she respond?

Olivia: With absolute fury. She told him he was mansplaining, said she didn't need his help, and was incredibly insulting. She, a student, was so confident in her own half-knowledge that she couldn't bear to be corrected by a world-leading expert. That's the "customer is always right" mentality in action. It creates a fragile arrogance.

Jackson: Wow. And you know, it's interesting, because this book was a bestseller, but it's also pretty polarizing. Some readers found Nichols' tone a bit elitist or curmudgeonly, like he's just an old professor yelling "get off my lawn" at the internet. But hearing a story like that, you can almost understand his exasperation.

Olivia: You can. And that brings us to the second accelerant: the internet itself. The internet is the ultimate enabler of this mindset. It gives us the illusion of knowledge without the hard work of learning. The writer Nicholas Carr had a brilliant analogy: "Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski."

Jackson: I love that. We're not deep-diving into subjects anymore. We're just skimming headlines, grabbing disconnected facts, and thinking that's the same as understanding.

Olivia: And the internet doesn't care if the facts are true. Nichols cites Sturgeon's Law, a maxim from the science fiction author Theodore Sturgeon, which states that "ninety percent of everything is crap." That law applies perfectly to the internet. There's some brilliant information out there, but it's buried under a mountain of garbage, misinformation, and conspiracy theories.

Jackson: So the internet is like a library where 90% of the books are filled with lies, and they're all mixed together with no card catalog.

Olivia: And everyone is their own librarian, with no training. This creates a world where, as Nichols puts it, we're moving from "uninformed" to "misinformed" and now to "aggressively wrong." We're not just empty vessels; we're vessels filled with junk, and we're fiercely protective of that junk.

Jackson: Okay, I get all that. It's a bleak picture. But let's be honest, Olivia. Experts are wrong sometimes. What about that? Doesn't that justify at least some of the distrust?

Olivia: That is the million-dollar question. And Nichols confronts it head-on. He agrees that experts are fallible, and that healthy skepticism is crucial for a democracy. But he makes a critical distinction between that and the weaponized ignorance we see today.
The Stakes: When Experts Are Wrong and Why Democracy is at Risk
Jackson: I'm glad he goes there, because if he didn't, the whole argument would feel one-sided. I mean, I remember being told for years that eggs were going to give me a heart attack, and now they're a health food.

Olivia: Absolutely. And Nichols would call that an "ordinary failure of science." Science is a process of correction. We learn more, we update our conclusions. That's a sign the system is working, not that it's broken. He tells this amazing story that actually celebrates the power of the layperson to correct an expert.

Jackson: Oh, I like the sound of this. A little hope.

Olivia: A little. It's about a historian named Richard Jensen, a very distinguished professor, who published research in 2002 arguing that the famous "No Irish Need Apply" signs in 19th-century America were basically a myth—a story Irish-Americans told themselves. And for over a decade, the historical community largely accepted this.

Jackson: Okay, so the expert consensus was that these signs didn't really exist.

Olivia: Right. Until 2015, when an eighth-grader named Rebecca Fried decided to check for herself for a history project. She didn't have access to university archives, but she had Google. She started searching old newspaper databases. And she found them. First a few, then dozens, then hundreds of actual, documented "No Irish Need Apply" ads.

Jackson: An eighth-grader? With Google?

Olivia: An eighth-grader with Google and a curious mind. She ended up publishing her findings in a peer-reviewed academic journal, effectively debunking the work of a tenured historian. It's a fantastic story about how skepticism and accessible information can work.

Jackson: That's incredible. So what's the problem, then? Why not just let everyone challenge everything?

Olivia: Because, Nichols argues, there's a universe of difference between Rebecca Fried, who did the hard work of research to correct a fact, and someone who just feels that experts are wrong. The danger isn't skepticism; it's the collapse of trust into pure cynicism, which can then be exploited for political gain.

Jackson: And that's where this stops being an academic debate and starts getting really scary.

Olivia: This is where the stakes become democracy itself. Look at the Brexit campaign in the UK. One of its leaders, Michael Gove, famously said, "I think people in this country have had enough of experts." It was a core part of their strategy: dismiss the economists, the scientists, the policy wonks, and appeal directly to emotion and identity.

Jackson: And it worked.

Olivia: It worked. It was the same playbook Donald Trump used in his 2016 campaign. He'd say things like "The experts are terrible," and celebrate his own lack of knowledge. He famously said, "I love the poorly educated." He understood that if you can convince people that the system is rigged and the experts are all liars, then their ignorance isn't a liability; it's a badge of honor. It proves they're not one of "them."

Jackson: Whoa. So it's a political strategy. You make people distrust the referees—the experts, the journalists—so they'll only trust you. And if your voters don't know what the nuclear triad is, they can't call you out for not knowing either. It's a feedback loop of ignorance.

Olivia: It's a death spiral, as Nichols calls it. Citizens disengage and become more ignorant. Politicians exploit that ignorance. That makes citizens even more cynical and distrustful of expertise. And suddenly we're no longer having a debate about the best way to solve a problem. We're having a fight about whether facts are real.
Synthesis & Takeaways
Jackson: It really is a death spiral. It feels like we're losing the ability to have a productive public conversation about anything important, from climate change to public health.

Olivia: And that's the ultimate danger. A functioning democracy depends on a shared reality, a common set of facts that we can debate. When that foundation cracks, everything built on top of it becomes unstable. It's not just about being right or wrong in an argument; it's about whether our society can solve big, complex problems if we can't even agree on what the problems are.

Jackson: So what's the way out? Do we just wait for a disaster to snap us out of it? The book sounds pretty pessimistic.

Olivia: Nichols is definitely a realist, and some critics do find him pessimistic. He worries that it might take a major crisis to force a change. But he does offer a path forward, and it's not about experts winning more arguments. It's about humility, on both sides.

Jackson: Humility. That feels like the opposite of everything we've just been talking about.

Olivia: It is. For experts, he says, it's about being better, more humble communicators. It's about admitting when they're wrong, staying in their lane, and remembering that their job is to advise, not to rule. They are servants of a democratic society, not its masters.

Jackson: And for the rest of us? The non-experts?

Olivia: For us, it's about cultivating our own intellectual humility. It's about resisting the urge to have an immediate, unshakeable opinion on everything. It's about having the courage to say those three scary words: "I don't know."

Jackson: "I don't know, but I'm willing to learn."

Olivia: Exactly. Maybe the first step isn't to win the argument but to ask a better question. To move from the certainty of the Jet Ski to the curiosity of the scuba diver. To be more like that eighth-grader, Rebecca Fried, and less like the angry student on Twitter. It's a small shift, but it might be the only thing that can pull us out of the spiral.

Jackson: A little more curiosity, a little less certainty. I think I can get on board with that.

Olivia: This is Aibrary, signing off.