
Why We Fear the Wrong Things

14 min

How to Make Good Decisions

Golden Hook & Introduction


Mark: A single government health warning in the UK, based on a "100% risk increase," led to an estimated 13,000 extra abortions in one year. The warning was statistically true, but dangerously misleading. Today, we're finding out why, and how to spot the same tricks being used on us.

Michelle: Hold on, 13,000? From one warning? That sounds impossible. How could a single piece of health advice cause that much chaos?

Mark: It's not just possible, it’s a perfect example of a problem that affects all of us, every single day. This comes from the work of Gerd Gigerenzer, a German psychologist and a director at the Max Planck Institute, in his book Risk Savvy: How to Make Good Decisions.

Michelle: Gigerenzer. I’ve heard that name. He’s a bit of a maverick, isn’t he?

Mark: Absolutely. He pushes back hard against the popular idea that humans are just hopelessly irrational. He argues the problem isn't our brains; it's that we live in a risk-illiterate society. And the consequences, as we just heard, are devastating.

Michelle: Okay, you have my full attention. A risk-illiterate society. Let's start there. How does one misleading number lead to thousands of abortions?

The Illusion of Certainty & The Dangers of Misleading Numbers


Mark: It all comes down to a simple statistical trick. In 1995, the UK's Committee on Safety of Medicines issued an urgent warning to doctors and the media. They announced that the newer, third-generation contraceptive pill doubled the risk of thrombosis, or life-threatening blood clots. A 100% increase in risk.

Michelle: A hundred percent! I mean, hearing that, I would panic. I’d stop taking it immediately. That sounds terrifyingly clear.

Mark: And that’s exactly what happened. Women across Great Britain stopped taking the pill in droves. But here’s the part of the number they didn't emphasize. The risk didn't double from something high to something catastrophic. The absolute risk increased from 1 in 7,000 women to 2 in 7,000 women.

Michelle: Oh. Wow. So it went from incredibly rare to... still incredibly rare.

Mark: Precisely. The "100% increase" is what we call a relative risk. It sounds huge and makes for a great headline. But the absolute risk increase was just one single extra case for every 7,000 women. Meanwhile, the risk of getting a blood clot from pregnancy itself is much higher: about 6 in 7,000.

Michelle: That is infuriating. So women, trying to make a safe health decision, abandoned a very low-risk contraceptive and ended up with unwanted pregnancies, which carried an even higher risk of the very thing they were trying to avoid!

Mark: Exactly. The result was an estimated 13,000 additional abortions in England and Wales the following year, plus a surge in teen pregnancies. All because of a number that was technically true but presented in the most misleading way possible. Gigerenzer argues this isn't a rare case; it's the standard playbook.

Michelle: It feels like a magic trick, designed to deceive. Why would health authorities even do that?

Mark: Gigerenzer points to a few things. First, the media loves scary numbers. "Risk Doubles!" is a much better story than "Slightly More People Might Face a Tiny Risk." But also, the committee was likely engaging in defensive decision-making. If they didn't sound the alarm as loudly as possible and someone died, they'd be blamed. So they created panic to protect themselves.

Michelle: And this isn't just a one-off thing with a single pill scare, is it?

Mark: Not at all. Gigerenzer gives another perfect, everyday example: the weather forecast. When a forecaster says there's a "30 percent chance of rain," what does that actually mean?

Michelle: I have no idea! I always assumed it meant it would be drizzling for about 30% of the day, or maybe that 30% of my neighborhood would get rain.

Mark: And you're not alone! He found that people in Berlin thought the same things. But in New York, people had a different interpretation: they thought it meant that on 30% of the days when that exact forecast is made, it will rain.

Michelle: Okay, that's completely different. Which one is right?

Mark: The New York interpretation is what the forecasters actually mean. But they never tell you! They just say "30 percent chance" and let our brains fill in the rest. Gigerenzer's simple rule is to always ask the question: percent of what? It’s the key that unlocks the trick. A 100% increase of what? A 30% chance of what? Without that reference class, the number is meaningless.

Michelle: That’s so simple but so powerful. It feels like we're being set up to fail. We're not just bad at math; we're being fed confusing information. But what about situations where the risk isn't a confusing statistic, but a real, visceral fear? Like after a major disaster.
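For readers who want to see the arithmetic rather than just hear it, here is a minimal Python sketch of the pill-scare numbers from this episode (the 1-in-7,000, 2-in-7,000, and 6-in-7,000 figures come from the discussion above; the variable names are ours):

```python
# Relative vs. absolute risk, using the figures cited in the episode.
baseline_risk = 1 / 7000   # thrombosis risk on the older pill
new_risk = 2 / 7000        # thrombosis risk on the third-generation pill
pregnancy_risk = 6 / 7000  # thrombosis risk from pregnancy itself

# What the headline reported: the relative increase.
relative_increase = (new_risk - baseline_risk) / baseline_risk

# What patients actually needed: the absolute increase.
absolute_increase = new_risk - baseline_risk

print(f"Relative increase: {relative_increase:.0%}")  # the scary "100%"
print(f"Absolute increase: {absolute_increase * 7000:.0f} extra case per 7,000 women")
print(f"Pregnancy itself is {pregnancy_risk / new_risk:.0f}x riskier than the new pill")
```

The same headline-grade "100%" and the reassuring "1 extra case per 7,000" describe one and the same change; only the reference class differs, which is exactly why "percent of what?" matters.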

Risk vs. Uncertainty & Why We Fear the Wrong Things


Mark: That's the perfect question, because it leads to Gigerenzer's next crucial idea: the difference between risk and uncertainty. He says we often confuse the two, and that's where our biggest mistakes happen.

Michelle: What’s the difference? They sound like the same thing.

Mark: He makes a very sharp distinction. Risk is a world where you know all the possible outcomes and their probabilities. Think of a casino game, like roulette. You know exactly the odds of landing on red or black. You can calculate the risk.

Michelle: Okay, a known set of rules.

Mark: Right. But uncertainty is a world where you don't know all the outcomes, and you certainly don't know their probabilities. This is the real world. Think about predicting the stock market, or who you should marry, or what to do in a crisis. You can't calculate the odds.

Michelle: So most of life is uncertainty, not risk.

Mark: Exactly. And the tools for dealing with them are different. For a world of risk, you can use statistics and probability. For a world of uncertainty, you need something else: good intuition and simple, robust rules of thumb, or what he calls heuristics.

Michelle: Can you give an example of that in action?

Mark: The best one is the "Miracle on the Hudson." In 2009, US Airways Flight 1549 hit a flock of geese just after takeoff, and both engines failed. Captain "Sully" Sullenberger had just a few minutes to decide what to do. He was in a world of pure uncertainty.

Michelle: I remember that story. It was incredible. He landed the plane in the Hudson River, and everyone survived.

Mark: But how did he decide he could make it to the river but not back to the airport? He didn't pull out a calculator and run complex physics equations. He used a simple rule of thumb that pilots learn. Copilot Jeffrey Skiles explained it later. He said it's a visual rule: if you're looking at a potential landing spot and it's rising in your windshield, you won't make it. If it's descending in your windshield, you'll fly over it. You need to find a spot that stays constant in your view.

Michelle: Wow. That's it? A simple visual trick?

Mark: A simple rule for an uncertain world. In that moment, a complex calculation would have been useless and deadly. A smart heuristic saved 155 lives. That’s Gigerenzer's point: in uncertainty, simple is often better.

Michelle: That’s a powerful story of getting it right. But what happens when our intuition gets it horribly wrong in the face of uncertainty?

Mark: That brings us to the tragic flip side of the coin: the aftermath of 9/11. The attacks created immense fear and uncertainty. And how did people react? They stopped flying.

Michelle: Of course they did. It felt like the most dangerous thing in the world. I remember that feeling. Driving feels safe, like you're in control.

Mark: And that feeling of control is an illusion. In the twelve months after the 9/11 attacks, so many Americans chose to drive instead of fly that fatal traffic accidents surged. Gigerenzer cites the data: an estimated 1,600 additional Americans lost their lives on the road in that year, trying to avoid the risk of flying.

Michelle: Oh my god. That’s more than half the number of people who died in the attacks themselves.

Mark: It is. Gigerenzer calls this the terrorists' "second strike." They didn't have to lift another finger; our own fear did the work for them. This happens because of what he calls "dread risk." Our brains are evolutionarily wired to have an outsized fear of events that are low-probability but kill many people at once in a spectacular way. We're not wired to fear the slow, steady, one-by-one danger of car accidents, even though it's statistically far deadlier.

Michelle: So our gut feeling, our intuition, which worked so well for Captain Sully, completely failed the general public after 9/11.

Mark: Because it was triggered by the wrong instinct. The fear of dread risk is a powerful, ancient button in our brains. And it's a button that's pushed not just by terrorists, but by institutions that want to control us, which leads right into the final big idea.

Defensive Decisions & The Power of a Positive Error Culture


Michelle: Okay, so this fear of dread risk and the desire to feel in control... how does that play out in our institutions?

Mark: It creates what Gigerenzer calls a culture of defensive decision-making. This is where a person or an organization knows what the best option is, but chooses an inferior one simply to protect themselves from blame if something goes wrong.

Michelle: That sounds like it happens everywhere. Choosing the 'safe' option to cover your own back, even if it's not the best choice for the company or the client.

Mark: It's an epidemic, especially in medicine. Gigerenzer quotes an airline risk manager who said, "If we had the safety culture of a hospital, we would crash two planes a day."

Michelle: That is a terrifying statement. What’s the difference?

Mark: Aviation has a positive error culture. When a plane nearly crashes, it's seen as a learning opportunity. The black box is analyzed, pilots are debriefed without punishment, and the entire system is updated to prevent it from happening again. They embrace error to get safer.

Michelle: And medicine?

Mark: Medicine, he argues, often has a negative error culture. Errors are hidden because of the fear of lawsuits and blame. A doctor who makes a mistake is more likely to be punished than studied. And this leads to defensive medicine: ordering tons of unnecessary tests and suggesting invasive procedures, not because it's best for the patient, but to create a paper trail of protection against a potential lawsuit.

Michelle: So doctors are making decisions based on fear, not just science.

Mark: A survey in Pennsylvania found that 93% of doctors in high-risk specialties admitted to practicing defensive medicine. The most powerful story in the book on this topic is about Dr. Peter Pronovost at Johns Hopkins Hospital. He was horrified by the number of patients dying from simple bloodstream infections caused by central line catheters.

Michelle: Is that common?

Mark: At the time, it was causing up to 28,000 deaths a year in US ICUs. So Pronovost created a ridiculously simple five-step checklist for doctors to follow when inserting a line. The steps were basic: wash hands, clean the patient's skin, use sterile drapes, wear a mask and gown, and put a sterile dressing on after.

Michelle: That sounds like stuff they should have been doing anyway.

Mark: Exactly! But in the rush of the ICU, steps were being skipped. So he made the checklist and asked the hospital to do something radical: empower the nurses to stop a doctor, any doctor, if they saw them skip a step.

Michelle: I can imagine that went over well with the surgeons.

Mark: You can imagine. But the administration agreed. And the results were staggering. In his ICU, the line infection rate dropped from 11 percent to zero. Zero! In the next 15 months, they had only two infections. His program, when rolled out across Michigan, is estimated to have saved 1,500 lives and $100 million in just 18 months.

Michelle: That's a miracle. So every hospital in the world adopted it immediately, right?

Mark: That's the tragic part. No. He faced immense resistance. Doctors felt it was insulting to their expertise. Hospitals were slow to change. It was a perfect example of institutional ego and a negative error culture fighting against a simple, life-saving solution. It wasn't about a lack of knowledge; it was about a culture of blame and hierarchy.

Synthesis & Takeaways


Michelle: Wow. So after all this, being tricked by stats, being driven by the wrong fears, and being treated by doctors who are afraid of being sued, it feels a bit hopeless. What's the one thing we should actually do to become 'risk savvy'?

Mark: Gigerenzer boils it down to two simple, powerful questions. First, when you hear a statistic, especially a percentage, always ask: 'Percent of what?' That simple question cuts right through the trick used in the pill scare.

Michelle: Okay, that's one. What's the other?

Mark: Second, when you're facing a big decision, especially in medicine, and you feel the doctor might be practicing defensively, ask them: 'What would you do if it were your mother?' or your child. He tells a story about his own mother, where the doctor recommended a treatment but, when asked that question, admitted he would tell his own mother to wait. It cuts through the self-protection and gets to their true, expert opinion.

Michelle: That’s brilliant. It reframes the relationship from a legal liability to a human one.

Mark: Exactly. Gigerenzer's ultimate point is that becoming risk savvy isn't about becoming a professional statistician. It's about having the courage to think for yourself, to question authority, and to reclaim your own judgment from a world that often profits from our fear and confusion. It's about realizing that sometimes the simplest questions are the most powerful tools we have.

Michelle: It makes you wonder: where have you seen these tricks or these defensive decisions in your own life? At work, at the doctor's office, in the news you read. Once you see the pattern, you can't unsee it. Let us know your stories. We'd love to hear them.

Mark: This is Aibrary, signing off.
