
Data: The New Asbestos
Why and How You Should Take Back Control of Your Data
Golden Hook & Introduction
Olivia: A study by Latanya Sweeney, a Harvard professor, found that 87% of Americans can be uniquely identified by just three pieces of data: their birth date, gender, and ZIP code. Three simple facts. That's how fragile our anonymity really is.

Jackson: Wow, 87 percent... that's basically everyone. That's a terrifyingly low bar for being exposed. It feels like you could find that information on a birthday card someone threw away.

Olivia: Exactly. And that's the reality at the heart of the book we're diving into today: Privacy Is Power by Carissa Véliz. It's a book that argues we've been sleepwalking into a surveillance crisis.

Jackson: I've heard this book is intense. Almost like a manifesto.

Olivia: It is. And Véliz isn't just a journalist; she's an Associate Professor of Philosophy and Ethics at Oxford's Institute for Ethics in AI. She's looking at this not just from a tech perspective but from a deep, ethical one, which is probably why the book was named an Economist Book of the Year. She argues that our data isn't just being collected; it's being weaponized.

Jackson: Okay, so where is all this data even coming from? I mean, beyond the obvious stuff like social media. I think most of us have a vague sense we're being tracked, but the sheer scale of it feels abstract.

Olivia: That's the perfect question, because Véliz makes it anything but abstract. She paints a picture that is chillingly specific.
The Toxicity of Your Data Trail
Olivia: She tells this story, a kind of "Smart Home Privacy Nightmare," that follows a person through a single, ordinary day. It starts the moment you wake up. You check your phone, and instantly, the manufacturer, your mobile company, and a dozen apps log that you're awake.

Jackson: Right, that part I can believe. My phone definitely knows my sleep schedule better than I do.

Olivia: But then it gets weirder. Your smartwatch has been tracking your every toss and turn, and it sends your heart rate data to your phone. Your smart electric toothbrush has an app that dings you for not brushing often enough.

Jackson: Hold on, my toothbrush is judging my dental hygiene? Come on. That feels like a step too far. Is that a real thing?

Olivia: It absolutely is. And it gets worse. You step outside and your neighbor's smart doorbell, with its facial recognition camera, logs your face. That footage might even be reviewed by Amazon employees to train their AI. You get in your car, and it's gathering data on your location, your speed, even your taste in music. By the time you get to work, dozens of entities have built a detailed profile of your morning routine without you ever clicking "I agree."

Jackson: That's... a lot. It feels less like convenience and more like being a lab rat in a digital cage you've paid to build around yourself. But what's the actual harm? So what if my car knows I like 80s pop?

Olivia: This is the core of her argument. The harm comes when that data is treated not as benign information but as a toxic substance. Véliz uses the analogy of asbestos. For decades, we used asbestos everywhere. It was cheap, it was useful, and it was considered a miracle material. And all the while, it was silently poisoning people. She says personal data is the asbestos of our time.

Jackson: That's a powerful analogy. Asbestos caused cancer, a hidden, slow-moving disease. What's the digital equivalent of that?

Olivia: The digital equivalent is the complete ruin of a person's life. The most devastating example she gives is the Ashley Madison data breach. This was a dating site for people seeking extramarital affairs. In 2015, hackers released the entire customer database online: names, addresses, credit card numbers, personal preferences.

Jackson: Oh, I remember that. It was brutal.

Olivia: It was a catastrophe. People were extorted. They received emails saying, "Pay us in bitcoin, or we'll tell your spouse." Marriages ended. People lost their jobs. And tragically, several people died by suicide. Their data, which they thought was private, became a toxic weapon used against them. And Véliz's point is that every piece of data we shed, from our location to our search history, has the potential to become just as toxic in the wrong hands.

Jackson: So the "I have nothing to hide" argument is just naive. You might not have anything to hide today, but the data collected on you could be used against a future version of you in a context you can't even imagine yet.

Olivia: Precisely. You're not just giving away information. You're giving away power over your own life. And that brings us to the real reason why everyone is so desperate to collect it.
Privacy as the New Currency of Power
Jackson: Okay, I get that it's toxic for individuals, but why are companies so obsessed with collecting it? It can't just be for better-targeted ads for socks, right? The risk-reward seems way out of balance if it's just about advertising.

Olivia: That's what they want you to think. Véliz argues that the ad-supported model was just the beginning. She traces it back to Google in the early 2000s. They were a great search engine, but they weren't making much money. They realized that the data users left behind, their search queries and the links they clicked, wasn't just "data exhaust." It was gold dust.

Jackson: Because they could use it to figure out what people wanted and sell that insight to advertisers.

Olivia: Exactly. They created AdWords and AdSense, and suddenly user data became the fuel for a multi-billion-dollar industry. But here's the crucial pivot Véliz makes: the real product was never just ads. The real product is influence. It's the ability to predict and shape human behavior.

Jackson: That sounds like a line from a sci-fi movie. How does that actually work in practice?

Olivia: The most famous example is Cambridge Analytica. They didn't just use Facebook data to show people ads for Donald Trump or Brexit. They built psychological profiles on millions of voters. They targeted anxious people with fear-mongering messages. They targeted traditionalists with messages about heritage. They weren't just selling a product; they were selling a customized reality, designed to nudge behavior on a massive scale. That's a form of soft power: manipulation and seduction.

Jackson: Right, using our own personalities against us. But that's a private company that ultimately went bankrupt. What happens when a government decides to use this kind of power? That feels like the next logical, and much scarier, step.

Olivia: And that is where the argument becomes truly chilling. Véliz points to China's Social Credit System as the ultimate expression of data as hard power. It's a system where every piece of data about a citizen, what they buy, who their friends are, whether they jaywalk, is fed into an algorithm that calculates a "trustworthiness" score.

Jackson: A trustworthiness score? Like a credit score for your entire life?

Olivia: Exactly. If your score is high, you get perks: discounts on loans, shorter hospital waiting times. But if your score is low, because you play too many video games, associate with other low scorers, or post dissenting opinions online, you are punished. You can be banned from buying plane or train tickets. Your kids might be barred from attending certain schools. Your face can be displayed on public billboards as a "discredited citizen."

Jackson: Whoa. That's not nudging. That's direct, algorithmic social control. It's taking the business model of Google and turning it into a tool of state-enforced obedience.

Olivia: It's the ultimate fulfillment of the book's title. Privacy is power. When you lose your privacy, you don't just lose secrets; you lose autonomy. You lose the ability to make choices without being judged, punished, or manipulated by an invisible system. You hand over the power to shape your life to a corporation or a government.

Jackson: This feels huge. It feels almost insurmountable. What can one person even do against giants like Google or an entire government armed with this technology? It's easy to feel completely hopeless.

Olivia: It is, and Véliz acknowledges that feeling of helplessness. But she argues that this is precisely why we have to shift our thinking from individual convenience to collective resistance. And that resistance can start in very small, very clever ways.
Pulling the Plug: From Personal Resistance to Collective Action
Olivia: The final part of the book is a call to action, but it's not just a list of privacy apps to download. It's a philosophical shift. She argues that we need to start treating our privacy as something to be actively defended, not passively surrendered.

Jackson: So what does that look like in the real world? Beyond just using a different web browser.

Olivia: One of my favorite examples she gives is a technique called "obfuscation." It's the deliberate act of creating confusing or misleading data to interfere with surveillance. She tells this story about teenagers who were worried about colleges and employers snooping on their social media.

Jackson: A very valid fear these days. What did they do?

Olivia: Instead of deleting their accounts, a group of them started sharing a single Instagram account. They would all post to it. It became impossible for an outsider to tell which post belonged to which person. They created noise. They polluted the data stream, making it useless for surveillance.

Jackson: That's brilliant! So it's about being clever, not just giving up. It's a small act of digital rebellion. I love that. It feels empowering, not just defensive.

Olivia: Exactly. It's about reclaiming a little bit of agency. But she stresses that individual actions, while important, aren't enough. The real fight is collective. She compares it to the environmental movement. One person recycling is good, but it won't solve climate change. You need systemic change, you need regulations, you need a cultural shift where we all agree that polluting is wrong.

Jackson: So my decision to share a photo of my kid online isn't just about my family's privacy. It's contributing to a larger system that normalizes surveillance for everyone.

Olivia: That's the heart of it. Your data is never just about you. It implicates your friends, your family, your community. When you give your DNA to a genealogy site, you're also giving away information about all your relatives, who never consented. Privacy, she argues, is a team sport.

Jackson: That completely reframes it. It's not a personal preference, like choosing a brand of coffee. It's a civic duty.

Olivia: It is. And that's why she ends the book by painting a stark picture of two possible futures, forcing us to choose which one we want to build.
Synthesis & Takeaways
Olivia: On one path, she describes the surveillance society we've been talking about. A world where your emotional reactions to news articles are monitored, where algorithms decide if you're worthy of a job or a loan, where every mistake you make is permanently etched into your data profile. It's a world optimized for control, not for human flourishing.

Jackson: A world without second chances, really. Where there's no room for error, for growth, for just being human.

Olivia: Exactly. But then she paints the other path. A world where privacy is respected as a fundamental right. Where you can have a private conversation without fearing it's being recorded. Where you can explore new or controversial ideas online without being flagged. Where you can make mistakes, learn from them, and move on. A world where technology serves us, not the other way around.

Jackson: And her point is that we're at the crossroads right now. We have to actively choose that second path, or we will passively slide into the first.

Olivia: Yes. She says widespread surveillance is fundamentally incompatible with a free, democratic society. It has to go. And that choice isn't up to tech CEOs or politicians alone. It's up to us. It starts with our own actions and our own demands.

Jackson: It makes you think... what small act of digital rebellion can you commit this week? It doesn't have to be a grand gesture.

Olivia: Maybe it's as simple as questioning one app's permissions before you click "accept." Or having a conversation with a friend about this very topic. That's where it starts. Véliz leaves us with this powerful thought: "You are not a product to be turned into data and fed to predators for a price. You are not for sale. You are a citizen, and you are owed privacy. It's your right."

Jackson: A powerful and necessary reminder.

Olivia: This is Aibrary, signing off.