
The Digital Truth Serum

13 min

Golden Hook & Introduction


Michelle: Mark, I have a simple question for you. How many condoms do you think are sold in the U.S. each year?
Mark: Oh, wow. Uh... based on what people say? Billions, right? At least a billion and a half. That feels like the number you hear in health class.
Michelle: That's what surveys suggest. The reality? Fewer than 600 million. That massive, nearly one-billion-condom gap is where our story begins today.
Mark: Wait, what? That's a billion-condom lie! What is going on? Who is lying, and why?
Michelle: Exactly! That's the central question in Seth Stephens-Davidowitz's book, Everybody Lies. And what's fascinating is that the author isn't a psychologist; he's a Harvard-trained economist and a former Google data scientist. He realized that while we lie to surveys, to our doctors, even to ourselves... we tell Google everything.
Mark: The search bar as a confessional. I love it. It’s where you ask the questions you’re too embarrassed to ask another human being.
Michelle: It’s our digital truth serum. And Stephens-Davidowitz argues it’s the most important dataset ever collected on the human psyche. It reveals a hidden, often shocking, version of who we really are.

The Digital Truth Serum: Uncovering Our Hidden Selves


Mark: Okay, the condom example is a great start. But give me another one. Where else does this digital truth serum expose a major lie we tell ourselves as a society?
Michelle: How about a presidential election? Let's go back to 2016. All the polls, all the experts, were saying Donald Trump had very little chance of winning. The conventional wisdom was that his controversial statements would sink him.
Mark: I remember that vividly. It felt like a foregone conclusion. The data seemed clear.
Michelle: The survey data seemed clear. But Stephens-Davidowitz was looking at a different dataset: Google searches. And he saw something disturbing. He found that the strongest search-based predictor of Trump's support in a given area wasn't searches for "jobs" or "immigration." It was the rate of racist searches, specifically those including the n-word.
Mark: Whoa. Hold on. You’re saying that the number of people typing that word into Google was a better predictor of votes for Trump than traditional polling?
Michelle: In many areas, yes. The data showed a direct correlation. In places where those hateful searches were more common, Trump consistently outperformed his polls. It was a measure of a hidden "darkness and hatred," as the author calls it, that people would never, ever admit to a pollster on the phone. But they would anonymously ask Google.
Mark: That is deeply unsettling. It’s like lifting up a rock and seeing what’s squirming underneath. It makes you question what other societal beliefs are just a thin veneer over a much uglier reality.
Michelle: It does. But the truth serum doesn't only reveal darkness. It also reveals our hidden vulnerabilities and private pains. Take marriage, for instance.
Mark: Ah, another topic people are famously honest about.
Michelle: Precisely. If you look at surveys, married couples report having sex about once a week. But if you look at Google, searches for "sexless marriage" are 3.5 times more common than searches for "unhappy marriage" and a staggering 8 times more common than "loveless marriage."
Mark: Wow. So the biggest silent problem in marriages isn't a lack of love or happiness, but a lack of intimacy? That's a secret pain millions of people are carrying, and they're only comfortable admitting it to a blank search bar.
Michelle: It’s a perfect example of the digital truth serum at work. People curate their lives on social media to look perfect, they tell surveyors what they think is the "normal" answer, but they tell Google the truth about their fears, their insecurities, and their deepest desires.
Mark: It’s like we have two selves: the public-facing, curated self, and the Google-search self. And this book is basically the first real biography of that second, more honest self.
Michelle: That’s a great way to put it. And it’s not just about what we type. The book argues that the real revolution is in reimagining what data even is.

Reimagining Data: From Horse Hearts to Human Smiles


Mark: Okay, so search data is one thing. It's words, it's text. But the book argues we can find data in even weirder places, right? This isn't just about Google.
Michelle: Not at all. This is where the book gets really creative and, honestly, feels like a real-life adventure story. It's about seeing the world with new eyes and realizing that data is everywhere, if you know how to look. The best example is the incredible story of the racehorse American Pharoah.
Mark: The one that won the Triple Crown! I know the name, but I don't know the data story behind it.
Michelle: Well, get ready, because it’s amazing. The world of horse racing has always been driven by intuition: a horse's bloodline, its physique, how it runs. It’s an art, not a science. But a man named Jeff Seder, a data scientist, decided to challenge that. He went to horse auctions with a portable ultrasound machine.
Mark: An ultrasound? What was he looking for?
Michelle: He was measuring things no one else was. He measured spleen size, the composition of muscle fibers, and dozens of other internal metrics. He built a massive dataset correlating these hidden biological markers with actual race performance. And he found one variable that predicted success better than anything else.
Mark: Let me guess... it wasn't the horse's star sign.
Michelle: Not quite. It was the size of its left ventricle, the heart's main pumping chamber. A larger, more powerful left ventricle meant more oxygenated blood could get to the muscles during a race. It was a massive predictor of stamina and speed.
Mark: That's incredible. So it's like a biological engine spec.
Michelle: Exactly. So, flash forward to a major horse auction in 2013. A wealthy owner named Ahmed Zayat has hired Seder to evaluate horses. Seder's team analyzes all the horses for sale and comes back with a shocking recommendation. They tell Zayat not to buy a single one. Instead, they give him a desperate plea: "You absolutely, positively cannot sell horse number 85."
Mark: And horse number 85 was Zayat's own horse that he was planning to sell?
Michelle: Yes! Seder’s data showed that this horse had an exceptionally large and powerful left ventricle. It was, according to his model, a potential superstar. Based on this one, bizarre data point, Zayat did something almost unheard of: he bought back his own horse.
Mark: And let me guess... horse number 85 was American Pharoah.
Michelle: That's the one. Eighteen months later, he became the first horse in over three decades to win the Triple Crown, earning millions. It’s the perfect illustration of the book's point: find a field where the old methods are lousy, like horse scouting, and revolutionize it by finding a new, unconventional type of data.
Mark: It’s like a real-life Moneyball, but for horse racing, and the weird stat isn't on-base percentage, it's... heart size? That's wild. So what other weird data sources are out there?
Michelle: The book is full of them! Scientists have analyzed decades of high school yearbook photos to show how the act of smiling in pictures evolved over the 20th century—it wasn't always a thing! Others use the brightness of nighttime satellite photos to measure economic growth in developing countries where official data is unreliable. It’s about being creative. Words, pictures, bodies, light—it can all be data.
Mark: This is all fascinating, but it's also a bit terrifying. If we can measure everything, and this data gives us so much power to predict things... what are the limits? What shouldn't we do with this power?
Michelle: And that is the crucial third part of the book. The power is immense, but the perils are just as real.

The Powers and Perils of Big Data


Mark: Right, because this isn't just about winning horse races or understanding history. This power can be used on us. So what are the big ethical tripwires here?
Michelle: The author breaks the power of Big Data into a few categories, but one of the most impactful is A/B testing. It's the engine of the modern internet. It’s why websites are so good at getting us to click, to buy, to stay.
Mark: That’s when they show different versions of a webpage to different users to see which one performs better, right?
Michelle: Exactly. And the results can be stunning. During Obama's 2008 campaign, his team A/B tested the main button on their website. They tried different phrases like "Sign Up" or "Join Us." The winning phrase was "Learn More." That one small change was estimated to have raised an additional $60 million in donations.
Mark: Sixty million dollars from changing one button's text? That's an insane amount of influence. But that feels like the positive side of it. What's the dark side of A/B testing?
Michelle: The dark side is that the same technique can be used to make platforms pathologically addictive. A former Google design ethicist is quoted in the book saying, "There are a thousand people on the other side of the screen whose job it is to break down the self-regulation you have." They A/B test colors, notifications, and layouts to find the perfect combination that keeps you scrolling, that hijacks your attention. It's why "Facebook" is one of the top-ten addictions people search for on Google.
Mark: So the same tool that can optimize a political campaign for good can also optimize a social network for addiction. That's a heavy thought. But what about predicting something really serious, like a crime?
Michelle: This is where we get to the most profound ethical dilemma in the book. The author tells the story of a 22-year-old man named James Stoneham. Over three weeks, he made a series of Google searches: "how to murder somebody," "murder law," and searches for his ex-girlfriend, Adriana Donato. Then, he invited her for a drive and murdered her.
Mark: Oh my god. That's horrifying. Hold on. Should Google have reported him? This opens a Pandora's box of pre-crime. It touches on the big criticisms I've read about this kind of thinking—that it's a step away from a surveillance state.
Michelle: It's the ultimate question. The author argues that targeting individuals based on their searches is ethically fraught and dangerous. For every one person like Stoneham, there are thousands of people who search for dark things out of morbid curiosity, or because they're a crime writer, or for a million other reasons. The number of false positives would be overwhelming.
Mark: So we'd be arresting innocent people based on their thoughts. It's literally thought-crime.
Michelle: Exactly. But the book proposes a different path. We can use the data at an aggregate level. If a city sees a sudden, sharp spike in searches for "kill Muslims," the police department could be alerted. Not to arrest individuals, but to increase patrols in Muslim neighborhoods or near mosques. It's about using the data to allocate resources to prevent crime, not to punish thoughts.
Mark: That’s a fine line to walk. It’s the difference between using a weather forecast to prepare for a storm versus arresting someone for thinking about rain. But it relies on corporations and governments using this power responsibly, which... let's be honest, is a huge 'if'.
Michelle: It is. The book also tells the story of how casinos use data to find a gambler's "pain point"—the exact amount of money they can lose before they get scared and stop coming back. They use data to keep you right on the edge of ruin, but not over it, to maximize their profit. It's a chilling use of this predictive power.
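The A/B test Mark and Michelle describe boils down to showing two button variants to comparable groups and checking whether the difference in click-through rate is bigger than chance would explain. Here is a minimal sketch using a standard two-proportion z-test; the visitor and click counts are invented for illustration (the Obama campaign's raw numbers aren't in the book in this form):

```python
import math

def ab_test_z(clicks_a, n_a, clicks_b, n_b):
    """Two-proportion z-test: how many standard errors apart
    are the click rates of variant A and variant B?"""
    p_a = clicks_a / n_a
    p_b = clicks_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: 20,000 visitors see each button.
# "Sign Up" converts at 5.0%, "Learn More" at 6.0%.
z = ab_test_z(1000, 20000, 1200, 20000)
print(round(z, 2))  # well above the ~1.96 cutoff for 95% significance
```

With samples this large, even a one-percentage-point difference is decisive, which is why sites that run thousands of visitors through each variant can confidently pick winners from tiny tweaks.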

Synthesis & Takeaways


Michelle: And I think that brings us to the core of it all. The book's real message isn't just that "everybody lies." It's that for the first time in human history, we have a tool that can cut through the lies and show us the world as it actually is, not just as we present it or wish it to be.
Mark: It’s a mirror that doesn't flatter. It shows us our hidden racism, our secret anxieties, our unfulfilled desires. And that's both a gift and a huge responsibility. The data itself isn't good or bad; it's what we do with it that matters.
Michelle: Precisely. This new, raw honesty can be used to build more addictive slot machines, or it can be used to identify where child abuse is spiking during a recession when official reports are failing. It can be used to sell us more stuff, or it can be used to understand which educational programs are actually helping kids learn. The choice is ours.
Mark: The book is incredibly optimistic about the potential for data to revolutionize social science and improve our lives, but it’s also a stark warning. It’s like we’ve been given a superpower, and we’re still figuring out the instruction manual.
Michelle: And the author himself ends on a wonderfully data-driven, self-aware note. He cites a study showing that most readers of dense non-fiction books never actually finish them. So, he says, he's not going to labor over a grand conclusion.
Mark: He literally says, "Too few of you, Big Data tells me, are still reading." That's hilarious. He's practicing what he preaches right to the very last sentence.
Michelle: It's the perfect ending. So the data shows us our collective blind spots, our hidden prejudices, our secret pains. The real question this book leaves me with is... now that we know, what are we going to do about it?
Mark: A profound question to end on. This is Aibrary, signing off.
