
Weaponized Lies


How to Think Critically in the Post-Truth Era

Introduction

Narrator: In December 2016, a 28-year-old man named Edgar Welch drove 350 miles to Washington, D.C., armed with a semiautomatic weapon. He walked into a family pizzeria called Comet Ping Pong and fired his rifle. Patrons and employees scrambled for cover, terrified. Welch wasn't a random madman; he was on a mission. He had read online that the pizzeria was the headquarters of a child-sex ring run by Hillary Clinton. He was there to "self-investigate." Of course, the story was a complete fabrication, a lie that had metastasized across social media. Welch found no evidence, but the danger he created was terrifyingly real. This incident is a stark reminder of the world we now inhabit, where falsehoods can be weaponized with devastating consequences.

How do we arm ourselves against this onslaught of misinformation? In his book, Weaponized Lies: How to Think Critically in the Post-Truth Era, neuroscientist Daniel J. Levitin provides a field manual for navigating this new reality. He argues that the best defense is not to retreat from information, but to develop the critical thinking skills necessary to evaluate it, whether it comes in the form of numbers, words, or expert claims.

Statistics Are Interpretations, Not Facts

Key Insight 1

Narrator: Before diving into complex analysis, Levitin argues that our first line of defense against bad data is a simple plausibility check. Numbers can feel authoritative, but they are often just interpretations, and a quick "back-of-the-envelope" calculation can reveal when they are absurd. For instance, a widely reported statistic once claimed that 150,000 girls and young women die from anorexia each year in the United States. This number sounds tragically high and is emotionally potent. However, a quick check of public data reveals a startling fact: the total number of deaths from all causes for girls and women in that age group is only about 55,000 per year. The claim is not just wrong; it's impossible.
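The back-of-the-envelope check here is nothing more than comparing the claimed figure against a known upper bound. A minimal sketch, using the rough figures quoted above:

```python
claimed_anorexia_deaths = 150_000  # the widely reported claim
all_cause_deaths = 55_000          # approx. annual deaths from ALL causes for that demographic

# Deaths from one cause can never exceed deaths from all causes combined.
plausible = claimed_anorexia_deaths <= all_cause_deaths
print(plausible)  # False: the claim is impossible on its face
```

No statistics training required; a single comparison against public data settles it.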

This same principle applies to claims of exponential growth. A story might claim that the number of marijuana smokers in California has doubled every year for the past 35 years. Starting with just one smoker, this doubling would result in over 17 billion smokers—more than twice the population of the entire planet. These claims are designed to bypass our logic by leveraging the authority of numbers. Levitin shows that we don't need to be statisticians to debunk them; we just need to pause, engage our common sense, and ask a simple question: "Does this number even make sense?"
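The doubling claim can be sanity-checked the same way. This sketch counts how many annual doublings it takes for one smoker to exceed twice the world's population (the population figure is a rough assumption, not from the book):

```python
world_population = 8_000_000_000  # rough current figure (assumption)

smokers = 1
doublings = 0
while smokers <= 2 * world_population:
    smokers *= 2
    doublings += 1

print(doublings, smokers)  # 34 doublings -> 17,179,869,184 smokers
```

Well before the claimed 35 years are up, the count passes 17 billion, more than twice the planet's population, so the claim collapses.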

Averages and Graphs Lie by Omission

Key Insight 2

Narrator: One of the most common ways numbers are used to mislead is through the use of averages. The word "average" can refer to the mean, median, or mode, and each can tell a very different story. Levitin illustrates this with the example of a small startup. The company has four programmers earning $70,000, one receptionist earning $50,000, and three founders who each take a $170,000 salary. The mean salary for the whole company is a respectable $105,000. However, the median—the middle value—is only $70,000. A dishonest founder could use the higher mean to attract new talent, while a disgruntled employee could use the lower median to argue for a raise. The lie is not in the number itself, but in what is omitted.
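Recomputing from the salaries listed above makes the gap between the two "averages" concrete:

```python
from statistics import mean, median

# Four programmers, one receptionist, three founders (salaries from the example)
salaries = [70_000] * 4 + [50_000] + [170_000] * 3

print(mean(salaries))    # 105000 -- what the founder quotes
print(median(salaries))  # 70000.0 -- what the typical employee actually earns
```

Both numbers are "the average"; which one gets quoted depends on who is doing the persuading.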

This deception becomes visual with graphs. A classic example is the truncated y-axis. In 2012, Fox News displayed a graph showing what would happen if the Bush tax cuts expired: the top tax rate would rise from 35% to 39.6%. By starting the graph's vertical axis at 34% instead of zero, the bar representing the new rate appeared nearly six times as tall as the original, visually suggesting an increase of roughly 500%. In reality, it was a modest 13% increase. As Levitin points out, graphs are powerful because our brains are wired to see patterns, but that same wiring makes us vulnerable to visual manipulation. The most important question to ask when looking at a graph is often about what is not being shown.
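The distortion is easy to quantify: compare the true percentage increase with the ratio of bar heights as drawn on the truncated axis.

```python
old_rate, new_rate = 35.0, 39.6  # top tax rate, percent
axis_start = 34.0                # the truncated y-axis baseline

true_increase = (new_rate - old_rate) / old_rate * 100
visual_ratio = (new_rate - axis_start) / (old_rate - axis_start)

print(round(true_increase, 1))  # 13.1 -- the actual percent increase
print(round(visual_ratio, 1))   # 5.6  -- how many times taller the new bar looks
```

Same data, two wildly different impressions; only the axis baseline changed.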

Correlation Is Not Causation, and Samples Are Often Flawed

Key Insight 3

Narrator: The human brain is a pattern-matching machine, which can lead us to see connections where none exist. This is the foundation of the "correlation does not imply causation" rule. Levitin points to the humorous work of Tyler Vigen, who found a strong statistical correlation between the number of films Nicolas Cage releases each year and the number of people who drown in pools. Does watching Nicolas Cage movies make people want to go for a fatal swim? Or do tragic drownings inspire him to make more films? Of course not. It's a spurious correlation, a coincidence. Yet, we see this logical fallacy everywhere, from news reports to advertising, where two co-occurring events are presented as a cause-and-effect relationship.

The data itself is often built on a shaky foundation. Levitin highlights the infamous 1936 Literary Digest poll that predicted Alf Landon would soundly defeat Franklin D. Roosevelt for president. The magazine polled millions of people—a massive sample size. But their sample was drawn from their own subscriber lists, telephone directories, and car registrations. In the midst of the Great Depression, this meant they polled a disproportionately wealthy segment of the population, which was more likely to vote Republican. Roosevelt won in a landslide. The poll failed not because the sample was too small, but because it wasn't representative. This is a crucial lesson: the quality of a sample is far more important than its size.

Expertise Is Narrow and Alternative Explanations Are Crucial

Key Insight 4

Narrator: In a complex world, we must rely on experts. But Levitin cautions that expertise is highly domain-specific. He tells the story of William Shockley, a brilliant physicist who won the Nobel Prize for co-inventing the transistor. Later in life, he used his scientific credibility to promote racist eugenics theories. People assumed his genius in physics translated to sociology and genetics, but it did not. Expertise in one area does not grant expertise in all areas.

Furthermore, we must actively seek out alternative explanations for any claim. Levitin uses the fascinating case of identical twins separated at birth who, upon reuniting, discover they share dozens of bizarrely specific habits. The immediate conclusion is that these traits must be genetic. But what's the alternative explanation? Social psychologists point out that we lack a control group. We haven't compared them to two unrelated strangers who happen to look identical and were raised apart. It's possible that people who look alike are simply treated similarly by the world, shaping their personalities and habits in similar ways. Without considering this alternative, we fall for the most obvious explanation, which may not be the correct one.

Science Is a Process of Updating Probabilities

Key Insight 5

Narrator: Many people think of science as a collection of immutable facts, but Levitin argues it's better understood as a process of Bayesian reasoning—systematically updating our beliefs as new evidence comes in. Nothing is ever 100% certain; science provides probabilities. He shares a deeply personal story about his dog, Shadow, who developed a growth on his bladder. The vets suspected a highly aggressive cancer with a grim prognosis and recommended invasive procedures.

Instead of accepting this at face value, Levitin and his wife acted like scientists. They researched the prior probability—the likelihood of a dog of Shadow's age and breed having this cancer. They evaluated the evidence, the risks of surgery, and the potential benefits of different treatments. They found that the survival statistics were skewed because many owners euthanize their pets, cutting the data short. They discovered that a simple anti-inflammatory drug, piroxicam, sometimes helped. They chose a less invasive diagnostic test and started the drug. The test was inconclusive, but Shadow's health dramatically improved. He lived happily for another 161 days, far longer than expected, without undergoing painful surgery or chemotherapy. They made the best decision they could by weighing probabilities and updating their beliefs, not by seeking an impossible certainty.
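The Bayesian reasoning Levitin describes can be sketched in a few lines. The numbers below are purely illustrative, not the actual figures from Shadow's case: a hypothetical prior, test sensitivity, and false-positive rate are assumed to show how a test result updates a belief rather than delivering certainty.

```python
# Illustrative numbers only -- not the actual figures from Levitin's account.
prior = 0.05        # hypothetical base rate of this cancer for the breed and age
sensitivity = 0.90  # hypothetical P(positive test | cancer)
false_pos = 0.20    # hypothetical P(positive test | no cancer)

# Bayes' rule: P(cancer | positive) = P(positive | cancer) * P(cancer) / P(positive)
p_positive = sensitivity * prior + false_pos * (1 - prior)
posterior = sensitivity * prior / p_positive

print(round(posterior, 3))  # 0.191
```

Even a positive result on a fairly sensitive test leaves the probability well under 50% when the prior is low, which is exactly why weighing base rates beats reacting to a single alarming finding.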

Conclusion

Narrator: The single most important takeaway from Weaponized Lies is that critical thinking is not a passive skill but an active, ongoing responsibility. In an era where information is both abundant and unregulated, the burden of verification has shifted from institutions to individuals. We can no longer afford to be passive consumers of information.

This shift shouldn't be seen as a burden, but as an empowerment. By learning to perform a quick plausibility check, to question the framing of an average or a graph, to distinguish correlation from causation, and to weigh probabilities like a scientist, we arm ourselves against manipulation. The next time you see a startling statistic or a shocking headline, don't just accept or reject it. Ask: Is it plausible? What's being left out? What's the alternative explanation? Investing a few moments in this process is the price of clarity in a post-truth world, and it is a price well worth paying.
