
Brain Bugs & Tribal Minds

14 min

What It Is, Why It Seems Scarce, Why It Matters

Golden Hook & Introduction


Michelle: Mark, if you had to describe the average person's ability to be rational, what's the first image that comes to mind?
Mark: A squirrel trying to cross a busy street. Lots of frantic energy, a few near-misses, and a 50/50 chance it ends with a tragic, illogical splat.
Michelle: That is... surprisingly accurate, and exactly what we're talking about today. It feels like we're living in an age of unprecedented scientific knowledge, yet we're surrounded by conspiracy theories, fake news, and bizarre beliefs.
Mark: Right, the squirrel has a smartphone with GPS, but it's still running into traffic. It's a paradox.
Michelle: It's the central paradox explored in the book we're diving into today: Rationality: What It Is, Why It Seems Scarce, Why It Matters by Steven Pinker.
Mark: Ah, Pinker. The famous cognitive psychologist from Harvard. I've heard this book got a lot of praise for its clarity, but also some criticism for being maybe a bit too optimistic about us squirrels.
Michelle: Exactly. As a cognitive psychologist, he's perfectly positioned to give us the user's manual for the human brain's reasoning software. And he starts by pointing out all the places where the code is a little... buggy.
Mark: Or, as he might argue, not buggy at all, just designed for a completely different operating system.
Michelle: Precisely. And he kicks things off by showing us just how flawed our intuitions can be, often with puzzles that seem deceptively simple.

The Rationality Paradox: Our Brain's Surprising Bugs


Michelle: Let's start with one of his classic examples, a logic puzzle called the Wason Selection Task. I'll give you the easy version first. Mark, imagine you're a bouncer at a bar. Your only job is to enforce one rule: "If a person is drinking alcohol, they must be over 21."
Mark: Okay, I can handle that. I've been training my whole life for this.
Michelle: You look out at four patrons. Patron one is drinking a beer. Patron two is drinking a Coke. Patron three is clearly an old man, let's say 70 years old. And patron four is a teenager, maybe 16. To enforce the rule, whose ID do you absolutely have to check?
Mark: Easy. The person drinking beer, to make sure they're old enough. And the 16-year-old, to make sure they're not drinking alcohol. The Coke drinker and the old man are irrelevant.
Michelle: Perfect. You, and most people, get that right instantly. Our brains are wired for social contracts, for catching cheaters. Now, let me give you the exact same logic puzzle, but with abstract symbols, the way it was originally designed.
Mark: Uh oh. I feel a trap.
Michelle: The rule is: "If a card has a vowel on one side, then it has an even number on the other." You see four cards on a table. They show: A, D, 4, and 7. Which cards do you need to turn over to see if the rule is being violated?
Mark: Hmm. Okay. Well, 'A' for sure, to see if there's an even number on the back. And... maybe the '4'? To see if there's a vowel on the other side?
Michelle: And that's where most people go wrong. The correct answer is 'A' and '7'.
Mark: Wait, why the 7?
Michelle: Because if you turn over the 7 and find a vowel on the other side, the rule "If vowel, then even number" is broken. Turning over the '4' doesn't matter. The rule doesn't say that only vowels can have even numbers on the back.
Mark: Wow. That is the same logic, but my brain completely short-circuited. Why is that so much harder?
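The card logic can be checked mechanically. Here is a minimal Python sketch (the `must_flip` helper is our illustration, not from the book): a card needs flipping only if its visible face could conceal a violation of "if vowel, then even number."

```python
# Wason Selection Task, rule: "if a card has a vowel on one side,
# it has an even number on the other." Only faces that could hide
# a violation (a vowel paired with an odd number) must be checked.

def must_flip(face: str) -> bool:
    """Return True if the visible face could conceal a rule violation."""
    if face.isalpha():
        # A vowel must be flipped: an odd number behind it breaks the rule.
        # A consonant can never break the rule, whatever is on the back.
        return face.upper() in "AEIOU"
    # An even number can't break the rule (the rule is one-directional).
    # An odd number must be flipped: a vowel behind it breaks the rule.
    return int(face) % 2 == 1

cards = ["A", "D", "4", "7"]
print([c for c in cards if must_flip(c)])  # ['A', '7']
```

The '4' drops out for the same reason Michelle gives: the rule says nothing about what may sit behind an even number.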
Michelle: Pinker argues it's because our minds didn't evolve to solve abstract logic puzzles. They evolved to navigate social worlds—to spot someone drinking beer who might be underage. The "bug" isn't a flaw in our hardware; it's that we're running the wrong software for the problem.
Mark: So our rationality is context-dependent. It's not a universal tool we can just apply to anything.
Michelle: Exactly. And it gets even weirder when we move from logic to probability. Are you familiar with the Monty Hall problem?
Mark: The game show one with the doors and the goats? Yeah, I know this one. You pick a door, the host shows you a goat behind another door, and you have the choice to stick with your door or switch.
Michelle: So what do you do? Stick or switch?
Mark: You stick. Or switch. It doesn't matter. There are two doors left, one has a car, one has a goat. It's a 50/50 shot. Simple.
Michelle: And that, Mark, is what nearly everyone, including PhDs in mathematics, thought when the puzzle was first popularized. And it is completely, spectacularly wrong.
Mark: Come on. How can it not be 50/50? There are two doors!
Michelle: This is the beauty of it. Pinker explains it perfectly. Let's scale it up. Imagine there are 1,000 doors. You pick Door #1. You have a 1-in-1000 chance of being right.
Mark: Okay, a very small chance.
Michelle: Now, Monty Hall, who knows where the car is, opens 998 other doors, showing you a goat behind every single one. He leaves just one other door closed—say, Door #752. He asks you, "Do you want to stick with your original 1-in-1000 shot, Door #1, or switch to Door #752?"
Mark: Oh. Well, when you put it like that... I'm switching to Door #752. Immediately.
Michelle: Why?
Mark: Because my initial choice was almost certainly wrong. By opening all those other doors, Monty has basically concentrated all the remaining probability—that 999-out-of-1000 chance—onto that one other door he left closed.
Michelle: Exactly!
And the logic is identical for the three-door problem. Your initial pick has a 1/3 chance of being right. The other two doors combined have a 2/3 chance. When Monty reveals a goat, he isn't changing the initial odds. He's giving you new information. He's concentrating that 2/3 probability onto the single remaining door. You should always switch.
Mark: My brain just broke a little bit. It feels so wrong, but the 1000-door example makes it perfectly clear. Our intuition about probability is just terrible.
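If the intuition still resists, the claim is easy to verify by simulation. A short Python sketch (our illustration, not Pinker's): play many rounds and compare the stick and switch strategies. The key observation is that when Monty leaves exactly one other door closed, switching wins precisely when the first pick was wrong.

```python
import random

def play(switch: bool, n_doors: int = 3) -> bool:
    """One round of Monty Hall; returns True if the player wins the car."""
    car = random.randrange(n_doors)
    pick = random.randrange(n_doors)
    # Monty, who knows where the car is, opens all but one of the other
    # doors, never revealing the car. So the single door he leaves closed
    # is the car whenever the first pick was a goat.
    if switch:
        return pick != car  # switching wins iff the first pick was wrong
    return pick == car      # sticking wins iff the first pick was right

trials = 100_000
stick = sum(play(switch=False) for _ in range(trials)) / trials
change = sum(play(switch=True) for _ in range(trials)) / trials
print(f"stick ≈ {stick:.3f}, switch ≈ {change:.3f}")  # ≈ 0.333 vs ≈ 0.667
```

Passing `n_doors=1000` reproduces Michelle's scaled-up version: sticking wins about 0.1% of the time, switching about 99.9%.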

The Social Hijacking of Reason: Myside Bias


Mark: Okay, so our individual brains have these weird glitches. We're bad at abstract logic and even worse with probability. But it gets way worse when we get into groups, right? That's where the real irrationality seems to explode.
Michelle: That's the next major theme in Pinker's book. He argues that some of the most dangerous irrationality isn't due to a lack of intelligence, but to something he calls the "myside bias."
Mark: What exactly is 'myside bias'? Is it just being stubborn or having confirmation bias?
Michelle: It's deeper than that. Confirmation bias is when you look for evidence that supports your belief. Myside bias is when you use your entire reasoning faculty—your intelligence, your logic, your creativity—not to find the truth, but to justify a conclusion that benefits your "side" or your tribe. It's about winning the argument for your team.
Mark: So reason becomes a weapon for tribal warfare, not a tool for discovery.
Michelle: Precisely. And this leads to what Pinker calls a "Tragedy of the Rationality Commons." For any individual, it can be perfectly rational to signal loyalty to their group by believing what the group believes. But when everyone does this, the collective—our society—becomes dangerously detached from reality.
Mark: That sounds terrifyingly familiar. Do we have a real-world example of this?
Michelle: Pinker uses one of the most bizarre and tragic examples from recent history: the "Pizzagate" conspiracy theory.
Mark: Oh man, I remember this. This was wild.
Michelle: For those who don't know, in 2016, a completely baseless rumor started online. It claimed that a pizzeria in Washington D.C., Comet Ping Pong, was the headquarters of a child sex trafficking ring run by high-level Democratic party officials, including Hillary Clinton.
Mark: And there was zero evidence for this. None. It was pure fiction, born on the darkest corners of the internet.
Michelle: Yet, because it confirmed the myside bias of people who already believed that Democrats were evil and corrupt, it spread like wildfire. People weren't evaluating the evidence; they were embracing a story that fit their tribe's narrative. And it had horrifying real-world consequences.
Mark: Right, the man who decided to take matters into his own hands.
Michelle: Exactly. A man from North Carolina, Edgar Maddison Welch, believed the conspiracy so fervently that he drove hundreds of miles to the pizzeria armed with an AR-15 rifle. He walked in and fired shots, determined to "self-investigate" and rescue the non-existent child prisoners.
Mark: It's a miracle no one was killed. But it's the perfect, chilling example. This wasn't a failure of intelligence; Welch was smart enough to plan a trip and acquire a weapon. It was a failure of rationality, completely hijacked by his tribal identity.
Michelle: Pinker would call this "expressive rationality." Welch's belief wasn't about a factual assessment of the world. It was a performance, a way of expressing his identity as a soldier for his side. And when that happens on a mass scale, our shared reality starts to crumble.

Forging a Better Mind: The Tools and Institutions of Reason


Michelle: It sounds bleak. Our brains have glitches, and our social instincts turn reason into a weapon. But Pinker is ultimately an optimist. He argues we're not doomed to be irrational squirrels. We've developed powerful tools to overcome these glitches.
Mark: Okay, I need some hope here. What are these tools? Are we all supposed to go get PhDs in logic?
Michelle: Not at all. Sometimes, the fix is surprisingly simple. It's about reframing the problem to work with our cognitive strengths instead of against them. Let's go back to probability. Pinker gives a classic example from medical diagnosis that trips up even doctors.
Mark: I'm ready to be wrong again. Hit me.
Michelle: Okay. The probability that a woman of a certain age has breast cancer is 1%. That's the base rate. If a woman has breast cancer, the probability that she gets a positive mammogram is 90%. That's the sensitivity. And if a woman does not have breast cancer, the probability she still gets a positive mammogram—a false positive—is 9%.
Mark: Okay, so the test is pretty good, but not perfect.
Michelle: Right. Now, a woman from this group gets a positive mammogram. What is the probability that she actually has breast cancer?
Mark: Hmm. The test is 90% accurate... so... maybe 80 or 90 percent? It seems high.
Michelle: That's what most doctors guess. The correct answer is 9%.
Mark: Nine?! How is that possible? That's shockingly low.
Michelle: It's because we neglect the base rate. The disease is very rare to begin with. Our brains latch onto the 90% accuracy and ignore the 1% prevalence. But watch what happens when we reframe this using what Pinker calls "natural frequencies."
Mark: Okay, I'm listening.
Michelle: Imagine 1,000 women. Based on the 1% prevalence, how many have cancer?
Mark: Ten.
Michelle: Right. And of those 10 women with cancer, the test is 90% sensitive, so how many get a correct positive result?
Mark: Nine.
Michelle: Now for the other 990 women who don't have cancer. The false positive rate is 9%. How many of them get an incorrect positive result?
Mark: Uh... 9% of 990 is... about 89.
Michelle: Exactly. So, in total, we have 9 true positives and 89 false positives. That's 98 women who tested positive. Of those 98, how many actually have cancer?
Mark: Nine. So 9 out of 98. Which is... wow, just over 9%.
Michelle: See? The moment we stop using confusing percentages and start using concrete numbers of people, the problem becomes intuitively simple. A ten-year-old can solve it. This is one of Pinker's key arguments: we can become more rational by presenting information in a way our minds can actually process.
Mark: That's incredible. So a huge part of promoting rationality isn't about making people smarter, but about communicating information better.
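The natural-frequency arithmetic maps directly onto a few lines of Python. This sketch uses the episode's numbers (1% prevalence, 90% sensitivity, 9% false-positive rate); it is our illustration, not code from the book.

```python
# Base-rate arithmetic in natural frequencies: out of 1,000 women,
# how many positive mammograms actually indicate cancer?
population = 1000
prevalence = 0.01           # base rate: 1% have breast cancer
sensitivity = 0.90          # P(positive | cancer)
false_positive_rate = 0.09  # P(positive | no cancer)

with_cancer = population * prevalence                    # 10 women
true_positives = with_cancer * sensitivity               # 9 correct positives
without_cancer = population - with_cancer                # 990 women
false_positives = without_cancer * false_positive_rate   # ~89 false positives

# P(cancer | positive): true positives as a share of all positives
p_cancer_given_positive = true_positives / (true_positives + false_positives)
print(f"{p_cancer_given_positive:.1%}")  # ≈ 9.2%
```

The last line is just Bayes' rule in disguise: the 90% sensitivity is swamped by the 89 false positives that the huge cancer-free majority produces.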

Synthesis & Takeaways


Mark: So it seems like rationality isn't a switch you just flip on or off. It's a skill, a muscle we have to train. And even then, we need the right environment—the right institutions, the right way of talking about problems—to even use it properly.
Michelle: Exactly. Pinker's ultimate point, and this is where the book really lands its punch, is that rationality is a moral commitment. It's about valuing truth over tribal loyalty. It's about having the humility to admit you might be wrong.
Mark: And that's why he talks about institutions. It's not just on us as individuals.
Michelle: Right. Institutions like science, with its rules of peer review and falsifiability, or a free press with fact-checkers, or a democracy with checks and balances—these are our collective attempts to build a "rationality commons." They are systems designed to protect us from our own worst cognitive instincts, from our myside bias. They force us to confront other perspectives.
Mark: So while the book is full of these fun and mind-bending puzzles, the real message is much bigger. It's a defense of the entire Enlightenment project.
Michelle: It is. And it's a call to action. Pinker argues that these institutions are fragile and under attack, and we have a responsibility to defend them. The book received some criticism for this optimistic, pro-Enlightenment stance, with some reviewers feeling it downplays the ways reason itself can be used to justify oppression. But Pinker's core argument is that abandoning reason is far more dangerous.
Mark: So maybe the one thing listeners can take away is, next time they feel that rush of certainty or that flash of anger at an opposing viewpoint, just to pause and ask: "Am I trying to understand this, or am I just trying to win for my team?"
Michelle: That's a perfect takeaway. It's about shifting from the mindset of a soldier, defending a position, to the mindset of a scout, trying to map the territory as accurately as possible.
Mark: I like that. The scout mindset. It's a great image to hold onto.
Michelle: We'd love to hear your thoughts. What's the most irrational thing you've seen someone believe, and why do you think they believed it? Let us know on our socials. We're always curious. This is Aibrary, signing off.
