
How Not to Be Wrong

The Power of Mathematical Thinking

Introduction

Narrator: During the height of World War II, American military officials faced a deadly puzzle. Their bombers were returning from missions over Europe riddled with bullet holes, and they needed to add armor to protect the planes. But armor is heavy, and too much would make the planes sluggish and inefficient. The question was where to put it. The data seemed obvious: the returning planes had the most damage on the fuselage, the wings, and the tail gunner's station, while the engines were relatively untouched. The logical conclusion was to reinforce the areas that were hit most often.

But a mathematician named Abraham Wald, who had fled Nazi persecution in Europe, saw the problem differently. He told the military to put the armor where the bullet holes weren't. His profound insight was that the military was only looking at the planes that had made it back. The absence of bullet holes on the engines of the returning planes was the most important data point, because it meant that planes hit in the engine didn't return at all. This is the kind of thinking at the heart of Jordan Ellenberg's book, How Not to Be Wrong: The Power of Mathematical Thinking. Ellenberg argues that mathematics is not a collection of abstract rules but a powerful, real-world tool for reasoning—a pair of X-ray specs that reveals the hidden structures beneath the chaotic surface of our world.

Mathematics as a Tool for Deeper Reasoning

Key Insight 1

Narrator: When students in a math class ask the age-old question, "When am I going to use this?", teachers often provide unconvincing, and sometimes dishonest, answers about future career needs. Ellenberg argues that this misses the point entirely. For most people, the direct application of a definite integral or a trigonometric formula will be rare. Instead, he proposes a better analogy: mathematics is like the weight training a soccer player does. A player never lifts a barbell during a game, but the strength, flexibility, and insight gained from those drills are what make them a better athlete.

Similarly, learning mathematics builds crucial cognitive muscles. It trains the mind in logical reasoning, problem-solving, and analytical thinking. It’s a science of not being wrong, with techniques hammered out over centuries of argument and hard work. This reframes the value of math away from rote computation and toward a method of understanding the world more deeply and soundly. It's a way of thinking that, once learned, can be applied to everything from financial decisions to political debates, even if you never solve for x again.

The Danger of Survivorship Bias

Key Insight 2

Narrator: The story of Abraham Wald and the missing bullet holes is a powerful illustration of a common mental error known as survivorship bias. This is the fallacy of drawing conclusions based only on the examples that have survived a selection process, while ignoring those that did not. The military officials were looking at the surviving planes and concluding that the most-hit areas needed more armor. Wald, thinking like a mathematician, asked a crucial question: "What assumptions are you making?" He realized the data from the planes that were shot down was missing, and that missing data was the key to the puzzle. The engines on the returning planes were clean not because they were never hit, but because a single hit to the engine was catastrophic.

This bias appears everywhere. For example, when we analyze the performance of mutual funds, we often only look at the funds that are still in business. But this ignores the many funds that performed so poorly they had to shut down. One study found that once these "dead" funds were counted, the average ten-year return fell to 134.5%, roughly 8.9% per year, well below the far healthier figure computed from the survivors alone. By focusing only on the survivors, we get a dangerously optimistic view of reality. Mathematical thinking teaches us to always ask: what am I not seeing?
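
To see the effect in miniature, here is a small simulation, not from the book and with invented fund parameters, in which the worst performers quietly drop out before anyone computes the average:

```python
import random

random.seed(0)

# Simulate ten years of annual returns for 1,000 hypothetical funds.
# Funds whose value falls below 70% of the starting amount are "closed"
# and vanish from the survivors-only average. All numbers are made up
# purely to illustrate the bias, not to reproduce the study cited above.
N_FUNDS, YEARS, CLOSE_BELOW = 1000, 10, 0.70

all_returns, surviving_returns = [], []
for _ in range(N_FUNDS):
    value, closed = 1.0, False
    for _ in range(YEARS):
        value *= 1 + random.gauss(0.05, 0.20)   # noisy ~5% average year
        if value < CLOSE_BELOW:
            closed = True
            break
    total_return = value - 1.0
    all_returns.append(total_return)
    if not closed:
        surviving_returns.append(total_return)

avg = lambda xs: sum(xs) / len(xs)
print(f"Average return, survivors only: {avg(surviving_returns):+.1%}")
print(f"Average return, all funds:      {avg(all_returns):+.1%}")
# The survivors-only average is noticeably higher, because the worst
# performers were removed from the sample before we ever looked at it.
```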

The Fallacy of Linear Thinking

Key Insight 3

Narrator: A common error in reasoning is to assume that relationships are linear—that if a little of something is good, more must be better. Ellenberg demonstrates how this simple assumption can lead to absurd conclusions. For instance, a 2008 study used linear regression to predict that, based on current trends, 100% of Americans would be overweight by 2048. This "obesity apocalypse" makes for a great headline, but it's a flawed projection because it assumes the rate of increase will continue in a straight line indefinitely, which is biologically and socially nonsensical. Not every curve is a line.
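
A quick sketch of the fallacy, with invented numbers rather than the study's actual data: fit a straight line to data that follow a leveling-off curve, then extrapolate and watch the prediction sail past 100%.

```python
import numpy as np

# "Not every curve is a line": generate data from a saturating trend,
# fit a straight line to it, then extrapolate. All numbers here are
# invented for illustration; this is not the 2008 study's actual data.
years = np.arange(1970, 2011)                       # observed period
share = 0.8 / (1 + np.exp(-(years - 1990) / 15))    # levels off below 100%

slope, intercept = np.polyfit(years, share, 1)      # straight-line fit

for y in (2020, 2048, 2080):
    linear = slope * y + intercept
    curve = 0.8 / (1 + np.exp(-(y - 1990) / 15))
    print(f"{y}: straight line predicts {linear:.0%}, saturating curve says {curve:.0%}")
# The fitted line keeps climbing past 100%, while the curve that actually
# generated the data flattens out: the apocalypse lives in the extrapolation.
```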

This same fallacy appears in political and economic debates. The Laffer Curve, famously sketched on a napkin at a meeting with Dick Cheney and Donald Rumsfeld, illustrates that the relationship between tax rates and government revenue is nonlinear. At a 0% tax rate, the government gets no revenue. At a 100% tax rate, no one would work, so the government would also get no revenue. The rate that maximizes revenue must therefore lie somewhere in between. While the curve was used to argue that tax cuts would increase revenue during the Reagan administration—a claim that didn't pan out—its core insight remains valid. It shows that assuming a simple, linear "more is better" or "less is better" approach to complex issues like taxation or social welfare is a recipe for being wrong.
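
As a toy illustration, not an economic model, one can write revenue as the product of the tax rate and an assumed shrinking tax base and see that it peaks somewhere in the interior:

```python
import numpy as np

# A toy Laffer-style curve: revenue is the product of two opposing effects,
# the rate itself and a tax base that shrinks as the rate rises. The base
# response below is an arbitrary assumption chosen only for illustration.
rates = np.linspace(0.0, 1.0, 101)
base = 1.0 - rates**2          # assumed response of economic activity to the rate
revenue = rates * base

best = rates[np.argmax(revenue)]
print(f"Revenue is zero at 0% and at 100%, and peaks near a {best:.0%} rate here.")
# Where the peak sits depends entirely on the assumed base response; the
# only robust lesson is that neither "always cut" nor "always raise" can be
# unconditionally revenue-increasing, because the relationship is not a line.
```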

The Perils of Statistical Significance

Key Insight 4

Narrator: In 2009, a neuroscientist named Craig Bennett presented a poster at a conference that seemed to show he had found brain activity in a dead Atlantic salmon. The salmon was shown pictures of humans expressing emotions, and an fMRI scan appeared to detect a response. The study was, of course, a joke, but it made a serious point. With the massive amounts of data generated by modern tools like fMRI, it's easy to find patterns in random noise. Bennett's fMRI scan divided the salmon's brain into thousands of tiny regions called voxels. By pure chance, a few of those voxels showed a correlation with the images. Without correcting for these multiple comparisons, one could conclude that a dead fish can read minds.
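
A rough sketch of the same trap, using simulated noise rather than Bennett's actual fMRI data: test thousands of noise-only "voxels" at p < .05 and watch hundreds of them come up "significant" until the comparisons are corrected for.

```python
import random
from statistics import NormalDist

random.seed(1)

# Illustrative sketch of the multiple-comparisons problem (not Bennett's
# actual analysis): run a two-sample test on thousands of pure-noise
# "voxels" and count how many look "significant" by chance.
N_VOXELS, N_SCANS, ALPHA = 8000, 30, 0.05
z = NormalDist()

def fake_voxel_pvalue() -> float:
    """Two-sample z-test on pure noise: the null hypothesis is true by construction."""
    a = [random.gauss(0, 1) for _ in range(N_SCANS)]
    b = [random.gauss(0, 1) for _ in range(N_SCANS)]
    diff = sum(a) / N_SCANS - sum(b) / N_SCANS
    se = (2 / N_SCANS) ** 0.5
    return 2 * (1 - z.cdf(abs(diff) / se))

pvals = [fake_voxel_pvalue() for _ in range(N_VOXELS)]
hits = sum(p < ALPHA for p in pvals)
bonferroni_hits = sum(p < ALPHA / N_VOXELS for p in pvals)
print(f"'Significant' voxels at p < .05: {hits} of {N_VOXELS} (pure noise!)")
print(f"After Bonferroni correction:     {bonferroni_hits}")
# Roughly 5% of noise-only voxels clear the uncorrected threshold, which is
# how a dead salmon can appear to respond to photographs of human emotions.
```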

This highlights a major pitfall in science: the misuse of statistical significance. The infamous "Bible Code" controversy operated on a similar principle. Researchers claimed to find hidden messages about modern events by looking for equidistant letter sequences in the Torah. The results seemed statistically significant, but critics showed that the researchers had too much "wiggle room"—flexibility in how they chose names and dates. When that wiggle room was removed, the effect vanished. This demonstrates that a p-value of less than .05 is not a magical stamp of truth. Without rigorous methodology and a healthy dose of skepticism, statistics can be used to prove almost anything.

Embracing Principled Uncertainty

Key Insight 5

Narrator: In a world that craves certainty, mathematics offers something more valuable: a principled way of being uncertain. Ellenberg contrasts the binary predictions of political pundits with the probabilistic forecasts of analysts like Nate Silver. During the 2012 election, many pundits declared the race a toss-up, while Silver's model consistently gave President Obama a high probability of winning. Silver wasn't claiming to know the future; he was quantifying his uncertainty. His approach, rooted in Bayesian inference, updates beliefs based on new evidence and prior knowledge.

This method avoids the trap of false certainty. For example, a prosecutor might tell a jury there's a one-in-a-million chance that an innocent person's DNA would match a sample from a crime scene. But this is the wrong question. The jury needs to know the probability that the defendant is innocent given the DNA match. That calculation requires considering the prior probability—how likely was it that this person was the culprit in the first place? By ignoring prior probabilities, we risk being swayed by seemingly impressive statistics that are ultimately misleading. True mathematical thinking isn't about eliminating doubt, but about understanding and navigating it.
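
Here is a back-of-the-envelope Bayes calculation for the DNA scenario; the one-in-a-million match probability comes from the example above, while the prior pool sizes are assumptions chosen only to show how much they change the answer.

```python
# Posterior probability of guilt given a DNA match, via Bayes' theorem.
# The priors below are hypothetical; the point is their outsized effect.
P_MATCH_IF_INNOCENT = 1e-6   # chance a random innocent person matches
P_MATCH_IF_GUILTY = 1.0      # assume the true culprit always matches

def p_guilty_given_match(prior_guilty: float) -> float:
    """P(guilty | DNA match) from the prior and the two match probabilities."""
    joint_guilty = prior_guilty * P_MATCH_IF_GUILTY
    joint_innocent = (1 - prior_guilty) * P_MATCH_IF_INNOCENT
    return joint_guilty / (joint_guilty + joint_innocent)

# Defendant found by trawling a database of, say, 5 million people,
# with no other evidence: the prior is 1 in 5 million.
print(f"Database trawl:     {p_guilty_given_match(1 / 5_000_000):.1%}")
# Independent evidence had already narrowed it to one of 20 suspects:
print(f"One of 20 suspects: {p_guilty_given_match(1 / 20):.1%}")
# The same "one in a million" match yields anything from a long shot to
# near-certainty; ignoring the prior is what makes the statistic misleading.
```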

Conclusion

Narrator: The single most important takeaway from How Not to Be Wrong is that mathematical thinking is not an esoteric skill reserved for scientists and engineers; it is a fundamental extension of common sense that can be used to sharpen our reasoning and avoid common errors in judgment. It is the art of being less wrong. The book dismantles the idea that math is about finding a single, precise answer and replaces it with a more powerful vision: math as a tool for understanding structure, questioning assumptions, and quantifying uncertainty.

The ultimate challenge the book leaves us with is to apply this mindset to our own lives. It asks us to be wary of arguments that sound too simple, to question the data we're shown and, more importantly, the data we're not, and to become more comfortable with the idea that "I'm not sure" can be a more rigorous and honest answer than a confident but baseless assertion. In a world overflowing with information and misinformation, the power to not be wrong is more critical than ever.
