The Sticky Data Trap

11 min

Human Rights in the Digital Age

Golden Hook & Introduction

Olivia: Here’s a wild stat, Jackson: Amazon’s Ring partners with over 1,800 US law enforcement agencies.
Jackson: Wow. Okay, that's a lot. Sounds like a pretty effective way to deal with all those porch pirates.
Olivia: You’d think so. But the most chilling story about it involves a high school math teacher who, on a simple ten-minute walk to the grocery store, realized fifty different doorbells were filming her. That single realization started a war in her own home.
Jackson: A war? Over a doorbell? That sounds intense. You can’t just say that and not explain.
Olivia: I won't! This exact tension is at the heart of Wendy H. Wong's incredible book, We, the Data: Human Rights in the Digital Age. It’s one of those books that completely reframes how you see the world.
Jackson: Wendy H. Wong. I’m curious, is she a tech CEO? A Silicon Valley insider?
Olivia: That’s what makes this book so powerful. She’s not. Wendy H. Wong is a distinguished political scientist, a professor who specializes in human rights and global governance. She’s looking at this not as a technical problem, but as a human problem. And her perspective is so sharp that the book was a finalist for the Lionel Gelber Prize, a major award for books on international affairs.
Jackson: Okay, so this isn't your typical "tech will save us" or "tech will doom us" narrative. This is something different.
Olivia: Exactly. It's about what it means to be human when every part of our lives is being turned into data.

The Invisible Chains: How 'Sticky Data' Erodes Our Fundamental Rights

Olivia: Wong kicks things off with a fictional but deeply relatable family she calls the "Madeups." They buy a Ring doorbell for convenience and security, just like millions of other people. The dad, Jason, loves it. He feels safer, he's catching porch pirates on the neighborhood app, he feels like a good citizen.
Jackson: I get that. If my packages are getting stolen, I'm all for more cameras. Isn't Jason just being practical here?
Olivia: He thinks he is. But then his wife, Claire—the math teacher—has this epiphany. While walking her usual route, she starts counting the video doorbells. She counts fifty. Fifty cameras recording her, her kids, the mail carrier, people walking to the local clinic. She realizes they’ve all opted into creating what the book calls "the largest civilian surveillance network the US has ever seen."
Jackson: Huh. When you put it like that, it does sound a little creepy. It’s not just your camera; it’s everyone’s camera.
Olivia: Precisely. For Claire, it stops feeling like security and starts feeling like oppression. The convenience of seeing who's at the door suddenly comes at the cost of her sense of community and freedom. She feels watched, constantly. The argument that erupts between her and Jason is really a debate our entire society is having: where is the line between safety and surveillance?
Jackson: And there’s no easy answer to that.
Olivia: There isn't. And Wong argues the problem is even deeper than we think because of a concept she calls "data stickiness." Data isn't just collected; it sticks to you. It’s permanent, it’s linked across different systems, and it's often about the most mundane parts of your life.
Jackson: What do you mean, "sticks to you"?
Olivia: Let me give you a real, heartbreaking example from the book. A single mother named Rafaela Aldaco had a minor battery conviction from when she was eighteen, but she did her community service, and the record was officially expunged. Wiped clean. She was trying to get a fresh start and was accepted into a housing program.
Jackson: That’s great. A second chance.
Olivia: But when she went to get the key, they told her the offer was rescinded. A private tenant screening company called RentGrow had run a background check. And in their database, her expunged conviction was still there. It had "stuck" to her. She lost the apartment, and when she sued, the courts sided with the company.
Jackson: Wait, even though it was legally expunged? That’s horrifying. It’s like digital glitter. Once it’s out there, you can never, ever get rid of it all, and it just follows you forever, showing up in the worst possible places.
Olivia: That is the perfect analogy. Digital glitter. And that’s the essence of "sticky data." It undermines our autonomy—our ability to move on and make new choices—and our dignity. Rafaela did everything right to clear her name, but the data outlived the legal truth.

The 'Right to Be Forgotten' Paradox

Jackson: But hold on, what about deleting it? I’ve heard about the "right to be forgotten" in Europe. Can’t you just tell Google to get rid of the glitter?
Olivia: Ah, you’ve just walked right into the book's next brilliant, and deeply ironic, point. The "right to be forgotten" is a fascinating case study in why individual control is a flawed dream. Wong tells the story of the man who started it all, a Spaniard named Mario Costeja González.
Jackson: The hero who fought Google and won?
Olivia: The very same. Years ago, he had some financial trouble, and an ad for his property auction was published in a newspaper. More than a decade later, that old, irrelevant, and reputation-damaging ad was still the top Google result for his name. So he sued. And in 2014, the European Court of Justice ruled in his favor, establishing this landmark "right to be forgotten."
Jackson: A huge win for the little guy!
Olivia: It was! He even said, "If Google was good before, now it’s perfect." But here’s the paradox. The case made him so famous, so newsworthy, that he became a figure of "public interest." And because of that, the Spanish Data Protection Authority later determined that he himself was no longer eligible for the very right he had created. The information about his case was now too important to be forgotten.
Jackson: You have got to be kidding me. That is the ultimate Catch-22. He fought for the right to be forgotten and in the process made himself unforgettable.
Olivia: Exactly! It perfectly illustrates Wong's argument. Data isn't a simple piece of property you can own or delete. It’s co-created and it's social. Your data is tangled up with other people, with public records, with corporate interests.
Jackson: So if I can't really own my data, who does? Is it just floating out there in the ether for anyone to grab?
Olivia: That’s the billion-dollar question. And Wong uses another powerful analogy to explain why "ownership" is the wrong way to think about it: DNA. Think about it. Your DNA is uniquely yours, right?
Jackson: I would think so.
Olivia: But it also contains information about your parents, your siblings, your distant cousins. When you send a spit sample to a genealogy website, you’re not just sharing your own data. You’re sharing a map of your entire family tree. This is exactly how investigators caught the Golden State Killer. They used DNA from a crime scene and uploaded it to a public genealogy site. They didn't find him; they found his third or fourth cousins, who had voluntarily uploaded their own DNA. By building out the family tree from those relatives, they zeroed in on him.
Jackson: Wow. So his relatives, without even knowing it, led the police right to his door. Their data wasn't just their own.
Olivia: It was collective. It was co-created. And that’s Wong’s point. Just like with DNA, your digital data—your posts, your location history, your friend network—implicates and reveals things about other people. You can't neatly draw a line around it and say "this part is mine." It’s fundamentally shared.

We, the Stakeholders: The Power of Data Literacy and Collective Action

Jackson: Okay, I’ll be honest, this is starting to feel a bit hopeless. We’re all caught in this giant web, our data is sticky digital glitter, and we can't even control it. Are we just supposed to accept being cogs in this massive data machine? What does Wong suggest we actually do?
Olivia: This is where the book becomes incredibly empowering. Wong argues that the solution isn't to log off and go live in a cabin. The path forward is to change our entire mindset. She makes a bold claim: data literacy is a fundamental human right for the 21st century.
Jackson: Data literacy? You mean like, learning to code?
Olivia: Not at all. It’s much broader. It’s the ability to "read, work with, analyze, and argue with data." It’s understanding how these systems work, who is collecting your data, and why. It’s being able to spot the manipulation in that phishing email or question the fairness of a tenant-screening algorithm. It’s critical thinking for the digital age.
Jackson: That makes sense. But how do you teach that to an entire population?
Olivia: Here’s her most surprising, and I think brilliant, proposal. She says we need to look to one of our oldest public institutions: the library.
Jackson: Libraries? With all due respect to librarians, that sounds a little... quaint. Aren't they just quiet buildings with books?
Olivia: That’s the common misconception! Wong argues libraries are the perfect place. They are trusted, non-commercial community hubs. Librarians are literally professional experts in organizing information, vetting sources, and protecting intellectual freedom. They are already on the front lines, helping people apply for jobs online, use digital tools, and navigate a complex information world. They are the natural teachers for data literacy.
Jackson: I’ve never thought of it that way. A librarian as a digital self-defense instructor. I kind of love that.
Olivia: It reframes everything. And it leads to her final point. Data literacy isn't just for individual protection; it's the foundation for collective action. Once we understand the system, we can start to change it. She floats ideas like "data unions."
Jackson: Like a labor union, but for our data?
Olivia: Exactly. Instead of each of us clicking "agree" on a thousand pages of terms and conditions we'll never read, a data union could bargain on behalf of thousands or millions of members. They could negotiate for better privacy terms, for a share of the profits, or even to prevent certain types of data from being collected in the first place.

Synthesis & Takeaways

Jackson: So the message isn't "delete your apps and go live in a cabin." It's that we've been thinking about this all wrong. This isn't an individual tech problem we can solve with better privacy settings. It's a collective human rights challenge.
Olivia: That's the core of it. Wong’s ultimate call to action is for us to shift our identity. We are not just "users" or "consumers" or "data subjects." We are, as the title says, We, the Data. We are stakeholders, and we need to start acting like it. We have a right to a seat at the table where the rules of our digital world are being written.
Jackson: It’s a powerful shift in perspective. From passive product to active participant.
Olivia: It is. And it leaves you with a really practical question to ponder. Wong’s work makes you wonder, what's one small way you can start acting like a stakeholder, not just a subject, in your own digital life this week? Maybe it's questioning the permissions an app is asking for, or starting a conversation with your family about what you share online.
Jackson: It’s about taking that first step from being data to being a person who has data. A person with rights.
Olivia: Exactly. It's about remembering the human in human rights.
Jackson: This is Aibrary, signing off.
