
Why Your Brain Loves Lies

13 min

Falsehoods and Free Speech in an Age of Deception

Golden Hook & Introduction


Michael: A massive MIT study tracked 126,000 stories on Twitter and found that falsehoods spread six times faster than the truth. It wasn't bots doing the spreading. It was us.

Kevin: Whoa, six times faster? And it's our fault? That feels… personal. It’s like we’re biologically programmed to hit the ‘share’ button on nonsense.

Michael: It’s a deeply human problem, and that’s the central challenge Cass R. Sunstein tackles in his book, Liars: Falsehoods and Free Speech in an Age of Deception.

Kevin: Cass Sunstein. I know that name. Isn't he the Nudge guy? The behavioral economics guru?

Michael: The very same. And what makes him the perfect person to write this is that he's not just a top Harvard law professor; he was also the head of regulation for the White House under Obama. He’s seen how lies can cripple policy from the inside out. He’s not just an academic; he’s been in the trenches.

Kevin: Okay, so if a guy like that, who has seen the damage firsthand, is writing about this, he must have a solution. If lies are so dangerous, why don't we just ban them? What's the problem?

Michael: Ah, that is the billion-dollar question, and the answer is one of the most unsettling and important paradoxes in a free society. It’s the reason we have to protect speech we hate.

The Free Speech Paradox: Why We Must Protect Lies to Save the Truth


Kevin: Unsettling paradoxes are my favorite kind. Lay it on me. Why do we have to protect obvious, harmful lies?

Michael: Let me tell you a story. It’s about a man named Xavier Alvarez. In 2007, at a public water district meeting in California, he stood up to introduce himself.

Kevin: A water district meeting. The pinnacle of high drama.

Michael: You'd think. But Alvarez announced, "I'm a retired Marine of 25 years. I retired in 2001. Back in 1987, I was awarded the Congressional Medal of Honor."

Kevin: Oh, no. Don't tell me.

Michael: It was a complete lie. He'd never even served in the military, let alone won the nation's highest award for valor. And this wasn't a one-off thing. The guy was a serial liar. He’d claimed he was a professional hockey player, that he’d rescued the American ambassador during the Iranian hostage crisis. It was absurd.

Kevin: That’s not just a lie; it’s an insult to everyone who has ever served. It’s stolen valor. There was even a law against it, right? The Stolen Valor Act. So they threw the book at him.

Michael: They did. He was prosecuted under the Stolen Valor Act. But the case went all the way to the Supreme Court. And in United States v. Alvarez, the Court made a shocking decision. They ruled that his lie, as offensive as it was, was protected by the First Amendment. The law was struck down.

Kevin: Hold on. He lied about the highest military honor, a lie that causes real pain and devalues the sacrifice of others, and the Supreme Court said that's okay? That feels fundamentally wrong.

Michael: It feels wrong to almost everyone. But the Court’s reasoning is the bedrock of free speech in America. They worried that if the government gets the power to punish someone for telling a lie about themselves, it gets to be the arbiter of truth. It could create a "Ministry of Truth" straight out of Orwell's 1984.

Kevin: The classic slippery slope argument. If they can punish this lie, they can punish any lie.

Michael: Precisely. The justices argued that giving the government the power to compile a list of subjects you can't lie about has no clear limiting principle. What’s next? Lying about your college GPA? Lying about your political affiliations? The fear is that a government could use that power to punish dissenters. Imagine a president using a "truth law" to prosecute journalists who publish critical, but maybe slightly inaccurate, stories.

Kevin: Right. So you protect the lie you hate to prevent the government from punishing a truth it hates.

Michael: That's the paradox in a nutshell. It’s what the court calls creating "breathing space" for ideas. This goes back to another landmark case, New York Times v. Sullivan in 1964. The court ruled that for a public official to win a libel suit, they have to prove the publisher knew the statement was false or acted with "reckless disregard" for the truth.

Kevin: They called it "actual malice."

Michael: Exactly. The court acknowledged that "erroneous statement is inevitable in free debate." If you want a robust, fearless press and a public that can criticize its leaders, you have to tolerate some falsehoods. Otherwise, the fear of a lawsuit—what’s called a "chilling effect"—would cause everyone to self-censor. Important truths would go unspoken.

Kevin: So we’re stuck. To protect the truth, we have to let lies run free, at least to a certain extent. I get the legal logic, but it still doesn't explain why we're so bad at handling it. Why do those lies spread so fast in the first place? Why are we so gullible?

The Human Glitch: Why Our Brains Are Wired to Believe Falsehoods


Michael: That's the second piece of the puzzle Sunstein lays out, and it’s all about our psychology. He starts with a simple concept: the "truth bias."

Kevin: Truth bias? What’s that?

Michael: It’s our default setting as humans. We’re wired to believe that what we hear is true. It’s cognitively easier to accept information than to question it. Think about every conversation you have. You don't fact-check every sentence your friend says; you just assume they're being truthful. It's a social lubricant.

Kevin: Oh, I know this. It's like when a car salesman tells you, "This is a fantastic deal, we're losing money on this one," and for a split second, a tiny, gullible part of your brain actually believes him.

Michael: Sunstein tells a very similar story! He was buying a Toyota Camry, and the salesman said, "It's a slow Saturday, so I'll give you a big break." Sunstein bought the car. Later, he joked with the salesman, "Glad I could help you out on a slow day," and the salesman, forgetting his lie, just scoffed and said, "Saturday is always our best day."

Kevin: That’s brilliant. And infuriating. But that’s a small, one-on-one lie. How does that scale up to the internet?

Michael: It scales up through what social scientists call "cascades." There are two types. The first is an informational cascade. This happens when you see a lot of people saying or doing something, so you assume they must know something you don't.

Kevin: Right, that's the restaurant with the long line. You get in line too, assuming the food must be great, even though you have zero actual information. You’re just trusting the crowd.

Michael: Exactly. The second type is even more powerful: a reputational cascade. This is when you publicly agree with the crowd, not because you think they're right, but because you don't want to be the weirdo who disagrees. You value your social standing more than you value expressing your private doubts.

Kevin: So you stay silent or even agree with something you know is wrong just to avoid being ostracized. That sounds like every family Thanksgiving dinner.

Michael: And it's the engine of social media. When a falsehood starts spreading, the informational cascade makes it look credible, and the reputational cascade pressures people to join in or shut up. Then you add the final ingredient: group polarization.

Kevin: Let me guess. When we get into groups with people who already agree with us, we all become more extreme in our beliefs.

Michael: You nailed it. Sunstein cites studies showing that when a group of like-minded people discusses an issue, they don't just reinforce their beliefs; they end up at a more extreme version of their original position. Their confidence skyrockets, and their tolerance for opposing views plummets.

Kevin: So our social media feeds are basically designed to be group polarization machines. They feed us what we already believe, surround us with people who agree, and make us more radical. That’s terrifying.

Michael: It is. And it explains that MIT study. Falsehoods are often more novel and emotionally charged than the truth. They trigger surprise, fear, or disgust. So they get shared, the cascades kick in, the groups polarize around them, and suddenly a lie has lapped the truth six times over.

Kevin: Okay, this is bleak. The law can't stop lies because it would create a Ministry of Truth, and our brains are basically petri dishes for growing them. This is where the conversation gets really thorny. If the 'marketplace of ideas' is broken and our brains are biased, what can we actually do?
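The informational cascade Michael describes is a well-studied sequential-choice model in social science. The toy simulation below is a rough sketch of that dynamic, not anything from Sunstein's book: each agent gets one private signal, sees every earlier agent's public choice, and sides with whichever source of evidence is heavier. The function name and the simple tally rule are illustrative assumptions.

```python
# Toy informational cascade: each agent weighs the crowd's public choices
# against one private signal (0 or 1). Once the crowd's lead exceeds one
# vote, private signals stop mattering and everyone falls in line.
def simulate_cascade(signals):
    choices = []
    for signal in signals:
        lead = sum(1 if c == 1 else -1 for c in choices)  # crowd tilt so far
        own = 1 if signal == 1 else -1                    # private evidence
        total = lead + own
        if total > 0:
            choices.append(1)
        elif total < 0:
            choices.append(0)
        else:
            choices.append(signal)  # tie: trust your own signal
    return choices

# Two early agents who saw "true" lock everyone in, even the three
# later agents whose private signals all said "false".
print(simulate_cascade([1, 1, 0, 0, 0]))  # [1, 1, 1, 1, 1]
```

The point of the sketch is the restaurant-line intuition: agent three onward behaves exactly like agents who saw the opposite signal, so an observer learns nothing new from their choices, and the crowd's apparent unanimity is far stronger evidence than it deserves to be.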

Taming the Digital Hydra: Finding Solutions


Michael: This is where Sunstein gets practical, but also, as some critics have pointed out, quite controversial. He argues that just because the government's hands are tied doesn't mean nothing can be done. But first, we have to accept that speech can cause profound, measurable harm.

Kevin: You mean beyond just hurting someone's feelings or reputation.

Michael: Far beyond. Sunstein highlights a stunning study from the early days of the COVID-19 pandemic. Researchers at the University of Chicago looked at the viewership of two Fox News shows: Tucker Carlson Tonight and Hannity.

Kevin: Okay, two shows on the same network. What's the difference?

Michael: In February 2020, Tucker Carlson was already taking the virus very seriously, warning his audience that a pandemic was coming. In contrast, Sean Hannity was downplaying it, comparing it to the flu and telling his viewers not to panic. He called it a hoax by the media to bludgeon the president.

Kevin: I remember that. The split was really stark.

Michael: The researchers compared counties where Hannity was more popular to counties where Carlson was more popular. The results were chilling. A one-standard-deviation increase in a county's Hannity viewership, relative to Carlson's, was associated with roughly 30% more COVID-19 cases by mid-March and 21% more deaths by the end of the month.

Kevin: That's staggering. You can draw a direct line from a host's words to people dying. So we have proof that certain speech leads to catastrophic harm. What does Sunstein say we should do about that?

Michael: He proposes a framework. The key is to look at two things: the speaker's state of mind and the magnitude of the harm. Did the speaker know they were lying? And is the harm severe and imminent? A lie told knowingly that will cause immediate, serious harm—like falsely shouting fire in a theater—is the easiest case for regulation. A statement made negligently that might cause diffuse harm later is the hardest.

Kevin: So for the COVID example, it’s tricky. Hannity might argue he genuinely believed it wasn't a threat, so he wasn't technically "lying" in his own mind.

Michael: Exactly. And that's why Sunstein argues the government should mostly stay out of it. Instead, he puts a huge amount of responsibility on private institutions.

Kevin: You mean Facebook, YouTube, Twitter...

Michael: Yes. He argues they have the right and the responsibility to act. They aren't bound by the First Amendment in the same way the government is. They can, and should, take down defamatory content. They should label misinformation. He’s particularly concerned about deepfakes—doctored videos—because our brains are so wired to believe what we see. He argues those can be regulated more aggressively than just words.

Kevin: But isn't having Facebook or YouTube decide what's true just trading a government 'Ministry of Truth' for a corporate one? A Silicon Valley one? That feels just as dangerous.

Michael: That is the core criticism of his argument, and Sunstein acknowledges it. It’s a trade-off. Do you trust the government, or do you trust Mark Zuckerberg? He argues that private platforms, for all their flaws, are more responsive to public pressure and market forces. He also proposes more modest legal ideas, like a "right to demand correction," where a person who has been lied about can force a publication or platform to issue a prominent retraction.

Kevin: That seems reasonable. It's not censorship; it's just setting the record straight. It’s more speech, not less.

Michael: That's the principle. The goal isn't to silence the lie but to amplify the truth and give it a fighting chance. It's about designing systems, both legal and social, that make it harder for lies to thrive and easier for truth to catch up.
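Sunstein's two-factor framework lends itself to a small decision table. The sketch below is my illustrative rendering of the two axes the hosts describe, not code or wording from the book; the labels and the coarse two-way split are assumptions.

```python
# Toy sketch of the two axes discussed above: did the speaker know the
# statement was false, and is the resulting harm serious and imminent?
# A knowing lie with imminent, serious harm is the easiest case for
# regulation; an honest error with diffuse, delayed harm is the hardest.
def regulability(knowing_lie: bool, harm_serious_and_imminent: bool) -> str:
    if knowing_lie and harm_serious_and_imminent:
        return "easiest case for regulation"   # e.g. falsely shouting fire
    if not knowing_lie and not harm_serious_and_imminent:
        return "hardest case for regulation"   # honest error, diffuse harm
    return "intermediate case"                 # one factor cuts each way

print(regulability(True, True))    # easiest case for regulation
print(regulability(False, False))  # hardest case for regulation
```

The Hannity hypothetical lands in the intermediate cells: the harm was severe, but a sincere (if reckless) belief weakens the state-of-mind factor, which is exactly why the hosts call the case tricky.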

Synthesis & Takeaways


Kevin: So after all this, it feels like we're walking a tightrope over a canyon. On one side is a world overrun by destructive lies, and on the other is a world of censorship and state-controlled truth.

Michael: That's the perfect image for it. And ultimately, Sunstein's argument is that we can't legislate our way to a 'truthful' society. The law can only step in at the edges, to prevent the most catastrophic harms—the libel that destroys a life, the fraud that empties a bank account, the health misinformation that kills. The real battle is cultural and personal.

Kevin: Right. The takeaway isn't to wait for a law to fix our information diet. It’s about being aware of our own 'truth bias,' our tendency to fall into cascades, and the echo chambers we live in. Maybe the first step is just asking, 'Could I be wrong about this?'

Michael: And to demand more from the platforms we use every day. Not necessarily censorship, but transparency, context, and mechanisms for correction. It’s about building a stronger immune system against falsehoods, both within ourselves and within our society.

Kevin: A great place to end. It’s not a simple fix, but a call for a kind of intellectual and civic hygiene.

Michael: A perfect way to put it. A question to leave our listeners with: Does protecting lies feel like a necessary evil to you, or a fatal flaw in our system? We'd love to hear what you think. Let us know your thoughts on our social channels.

Michael: This is Aibrary, signing off.
