
The Face AI Couldn't See
15 min
My Mission to Protect What Is Human in a World of Machines
Golden Hook & Introduction
Olivia: Jackson, I have a number for you: 34.4 percent.
Jackson: Okay... a very specific number. Is this my chance of winning the lottery? Or maybe the percentage of my brain I use on a Monday morning?
Olivia: Not quite. It's the accuracy gap a major tech company's AI had between identifying a white man's face and a Black woman's face. One system was nearly perfect for one group, and failed a third of the time for another. That's the world Joy Buolamwini unmasks.
Jackson: Wow. Okay, so this isn't some theoretical problem. This is happening in products we might be using. And the book is Unmasking AI: My Mission to Protect What Is Human in a World of Machines by Joy Buolamwini, right?
Olivia: Exactly. And Joy isn't just anyone. She's a Rhodes Scholar, an MIT researcher, and the founder of the Algorithmic Justice League. Her journey, which we'll get into, is so compelling it became the basis for the Emmy-nominated documentary Coded Bias.
Jackson: That’s some serious credibility. It feels like this is one of those books that doesn't just comment on the world, it actively changes it.
Olivia: It really does. And it all starts not with a grand plan, but with a simple, frustrating experience. Have you ever had a piece of tech just... refuse to work for you? Like an automatic sink that won't turn on or a voice assistant that never understands your name?
Jackson: Oh, constantly. I'm convinced my smart speaker thinks my name is 'Jaxon' and that I have a deep, abiding interest in 18th-century basket weaving, no matter how many times I correct it.
Olivia: Well, for Joy Buolamwini, that frustration became a life-changing discovery.
The Accidental Activist: Discovering the 'Coded Gaze'
Olivia: Her story as an activist really begins when she's an undergraduate at Georgia Tech. She's working on a social robot named Simon, and her project is to get Simon to play peekaboo. For that to work, the robot's camera needs to detect a human face.
Jackson: Seems simple enough. Face detection has been around for ages.
Olivia: You'd think so. But Simon consistently struggled to detect her face. She'd turn on all the lights, tilt her head, nothing. The little green box that was supposed to pop up around her face just wouldn't appear. Then, in desperation, she asks her roommate to try.
Jackson: And let me guess, it worked perfectly for the roommate?
Olivia: Instantly. Her roommate was a fair-skinned woman with red hair. The software had no problem seeing her at all. At the time, Joy just brushed it off, thinking it was a fluke, maybe like those old cameras that would only capture her eyes and teeth in photos.
Jackson: I can see how you'd rationalize that. You'd assume it's a bug or bad lighting before you'd assume the technology itself is biased.
Olivia: Exactly. But then it happened again. A few years later, she's in Hong Kong and sees another social robot, this one called Autom. She volunteers for the demo, and again, the robot can't detect her face. She talks to the creator and finds out it's using the exact same underlying software.
Jackson: Whoa. Okay, so now it's a pattern. Once is an incident, twice is a coincidence, but a third time...
Olivia: The third time is the real kicker. She's now a graduate student at her dream school, MIT. She's working on a creative project called 'Upbeat Walls,' where you can 'paint' on a digital screen with your smile. And again, the face-tracking software works great for her lighter-skinned classmates but struggles with her. This is where she decides she can't ignore it anymore.
Jackson: This is the moment where she has to figure out what's really going on. What did she do?
Olivia: She sets up a simple experiment. She's sitting in front of her computer, camera on. The software can't see her. Then, she does something that has become iconic. She holds up a plain, white, featureless mask to her face.
Jackson: And...
Olivia: The little green box appears instantly. The software saw the mask as a face, but it couldn't see the actual human face right behind it.
Jackson: That is absolutely chilling. It's not just that the machine is broken; it's broken in a very specific, very telling way. It's like it was taught that 'face' means 'white face.'
Olivia: You've just hit on the core of her discovery. She coined a term for it: the "coded gaze." It’s the idea that the preferences, priorities, and prejudices of the people who build technology are encoded into the systems themselves. She found that many of the 'gold standard' datasets used to train these AIs were what she calls "pale male datasets"—overwhelmingly made up of images of white men.
Jackson: It's like the old Shirley Cards in photography, right? Where film was calibrated for white skin, so it did a terrible job capturing darker skin tones.
Olivia: That's the perfect analogy, and she uses it in the book. The default is not neutral. The default reflects who is in power at the time of creation. For film, it was white skin. For these early AI datasets, it was white, male faces. The machine isn't 'racist' in a human sense; it's just reflecting the biased world it was taught from.
Jackson: So her personal frustration, this thing that kept happening to her, was actually a clue to a massive, systemic problem baked into the foundations of modern AI.
Olivia: Precisely. And once she had this visceral proof—this image of herself holding a white mask to be seen by a machine—she knew she couldn't just write a technical paper about it. She had to show the world. But that meant taking on the biggest companies on the planet.
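To ground the image of that little green box: a face-detection demo like Simon's peekaboo is a few lines of off-the-shelf code sitting on top of a pretrained model, and the model only sees what its training data taught it to see. Here is a minimal sketch, assuming OpenCV's bundled Haar-cascade detector; it is a generic classical detector, not the actual software behind Simon, Autom, or Buolamwini's MIT projects.

```python
# Minimal face-detection sketch using OpenCV's stock Haar-cascade model.
# Illustrative only: a generic classical detector, not the software
# Buolamwini's projects used. Requires: pip install opencv-python
import cv2

# Load the pretrained frontal-face model that ships with opencv-python.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

# Grab a single frame from the default webcam.
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
cap.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Whether any box comes back at all depends on the faces the model
    # was trained on -- the heart of the "coded gaze" argument.
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # The familiar green rectangle around each detected face.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    print(f"{len(faces)} face(s) detected")
    cv2.imwrite("annotated_frame.jpg", frame)
```

The demo code is trivial; the decisive ingredient is the training data baked into the model file, which is why the same few lines can work for one person and fail for another.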
Speaking Truth to Power: The 'Gender Shades' Audit and Corporate Confrontation
Jackson: Okay, so she has this incredible, visceral proof with the white mask. But how do you go from a personal discovery to proving this is a global problem? A single anecdote, even a powerful one, can be dismissed by a big company.
Olivia: You're right. She needed data. Hard, undeniable numbers. This led to her groundbreaking master's thesis, the "Gender Shades" study. She and her research intern, a brilliant student named Deborah Raji, decided to audit the gender classification services of major tech companies.
Jackson: How do you even audit a black-box AI from a company like Microsoft or IBM?
Olivia: It was ingenious. She couldn't look at their code, but she could test their publicly available products. She built her own dataset, the Pilot Parliaments Benchmark. Instead of scraping the internet, she used photos of parliament members from countries with a good gender balance and a range of skin tones. It was a much more balanced and representative dataset than the industry standards.
Jackson: Smart. So she's building a better yardstick to measure their systems against. What did she find?
Olivia: The results were staggering. She tested systems from IBM, Microsoft, and a Chinese company, Face++. On the surface, their overall accuracy looked pretty good, mostly in the 90% range. But when she broke it down by intersectional groups, the story changed dramatically. All the systems were better at identifying men than women. All of them were better at identifying lighter-skinned faces than darker-skinned faces.
Jackson: And the worst-case scenario was...
Olivia: Dark-skinned women. For the darkest-skinned group of women, the error rates were as high as 35% for Microsoft and IBM, and nearly 47% for Face++. Remember that 34.4% number I started with? That was the gap between IBM’s accuracy for light-skinned men, which was nearly 100%, and its accuracy for dark-skinned women, which was just 65%.
Jackson: A 34-point gap. That's not a glitch; that's a fundamentally broken product for an entire demographic. This is where it gets dramatic, right? I've read that Amazon's reaction to her follow-up research was... not great.
Olivia: 'Not great' is an understatement. In a follow-up study called "Actionable Auditing," they included Amazon's Rekognition system, which was being sold to police departments. The study found Amazon's system had the worst performance of the bunch. Before the New York Times could even publish the story, Amazon went on the offensive.
Jackson: What did they do?
Olivia: Their VP of AI, a Dr. Matt Wood, published a blog post calling the research "misleading" and "deceptive." They tried to discredit Joy and Deborah, claiming they used the tool incorrectly and that their findings didn't apply to the real-world scenarios police would use it for. It was a classic corporate playbook: deny, deflect, and attack the messenger.
Jackson: That must have been terrifying. You're a graduate student, and one of the most powerful companies in the world is publicly trying to ruin your reputation.
Olivia: It was immense pressure. But this is where the story becomes about collective action. Joy wrote a counter-article on Medium, anticipating and dismantling every one of Amazon's arguments. And then the community rallied. The ACLU and Georgetown Law's Center on Privacy and Technology issued statements of support. And, most incredibly, 75 leading AI researchers—including a Turing Award winner, which is like the Nobel Prize of computing—signed a letter defending the research and calling on Amazon to stop selling facial recognition to police.
Jackson: Wow. So the academic community put their careers on the line to back her up.
Olivia: They did. And it worked. The public pressure was immense. And the ultimate vindication came later when a major study by the National Institute of Standards and Technology, a U.S. government agency, confirmed what Joy had been saying all along: facial recognition systems across the board showed high rates of false positives for Asian and African American faces. The science was settled.
Jackson: It’s a true David vs. Goliath story. She didn't just win the argument; she forced an entire industry to confront its own failures.
Olivia: And that victory showed that data is powerful. But Joy also realized that data and academic papers have their limits. To truly change the world, you need to change hearts and minds. You need to tell a story.
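The mechanics of the audit itself are almost embarrassingly simple compared to the systems it exposes. Below is a minimal sketch of an intersectional accuracy breakdown in the spirit of Gender Shades; the file name, column names, and figures in the comments are hypothetical stand-ins, not the Pilot Parliaments Benchmark data or the authors' actual code.

```python
# Hypothetical intersectional audit sketch (illustrative file and column names).
# Each row: one benchmark image with its ground-truth labels plus the gender
# label returned by the commercial API under test.
import pandas as pd

df = pd.read_csv("predictions.csv")  # columns: skin_type, gender, predicted_gender
df["correct"] = df["gender"] == df["predicted_gender"]

# Overall accuracy can look respectable...
print("Overall accuracy:", round(df["correct"].mean() * 100, 1), "%")

# ...while the intersectional breakdown tells a very different story.
by_group = (
    df.groupby(["skin_type", "gender"])["correct"]
      .mean()
      .mul(100)
      .round(1)
      .rename("accuracy_%")
)
print(by_group)

# The headline disparity is the spread between the best- and worst-served
# subgroups: roughly 99.7% for lighter-skinned men versus 65.3% for
# darker-skinned women reproduces the 34.4-point gap quoted earlier.
print("Accuracy gap:", round(by_group.max() - by_group.min(), 1), "points")
```

The argument the audit makes is visible in the code's shape: a single overall mean hides exactly the subgroup disparities that the per-group breakdown surfaces.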
The Poet of Code: Wielding Art and Policy as Weapons for Justice
Jackson: That's a fascinating pivot. She wins this huge scientific battle, and her response is to become an artist?
Olivia: In a way, yes. She's always seen herself as a blend of art and science, calling herself the "Poet of Code." She knew a research paper, no matter how groundbreaking, wouldn't be read by the policymakers and everyday people whose lives were being affected. She needed to humanize the harm.
Jackson: So how do you do that? How do you make a dataset feel personal?
Olivia: She created a project called "AI, Ain't I a Woman?"—a direct nod to the abolitionist Sojourner Truth. She and Deborah Raji tested commercial AI systems on the faces of iconic Black women: Michelle Obama, Oprah Winfrey, Serena Williams, Ida B. Wells. The results were, frankly, insulting.
Jackson: What kind of results?
Olivia: The AI labeled Serena Williams as male. It couldn't detect a face on a young Oprah. It labeled a photo of Sojourner Truth as a 'gentleman.' It even put a 'toupee' label on a picture of Michelle Obama.
Jackson: That's not just inaccurate; it's deeply offensive. It's layering technological bias on top of centuries of social and racial bias.
Olivia: Exactly. And instead of just publishing the results, Joy wrote and performed a spoken-word poem. She filmed herself reciting it, intercutting the footage with these shocking, offensive AI labels appearing over the faces of these revered women. It takes the abstract idea of 'bias' and makes it visceral. It makes you feel the indignity of being mislabeled and unseen.
Jackson: That's such a brilliant move. It takes it from a spreadsheet to something that makes you feel the injustice. It's what she calls an 'evocative audit,' right? A demonstration that evokes an emotional response.
Olivia: Precisely. She drew inspiration from figures like Frederick Douglass, who was the most photographed man of the 19th century. He used the new technology of photography as a 'counter-demo' to combat the racist caricatures of Black people. Joy was doing the same thing with AI—using the technology against itself to reveal a deeper truth.
Jackson: And this artistic approach is what really blew the issue into the mainstream, isn't it? This is what leads to the Coded Bias documentary and her work in policy.
Olivia: It was the catalyst. The "AI, Ain't I a Woman?" video went viral. It led to the documentary, which reached over 100 million people on Netflix. It got her an invitation to testify before the U.S. Congress, where she explained the 'pale male dataset' problem to lawmakers. She shared the story of Robert Williams, a Black man in Detroit who was falsely arrested in front of his children because of a faulty facial recognition match.
Jackson: So she connects the dots all the way from a glitch in a college lab to a father being wrongfully taken from his family.
Olivia: Yes. She shows that this isn't about a machine misgendering a celebrity. It's about whether we will allow flawed, biased systems to make life-altering decisions about our freedom, our jobs, and our futures. Her work was a key part of the movement that led to the White House releasing a Blueprint for an AI Bill of Rights.
Jackson: It's an incredible arc. From a single person's frustration to a global movement and real policy change. It really shows the power of one person asking a simple, persistent question: "Why doesn't this work for me?"
Synthesis & Takeaways
Olivia: It really is. The entire story of Unmasking AI is a testament to that. It's a journey that starts with a very personal, almost private, moment of technological failure and blossoms into a global fight for justice.
Jackson: So this whole journey starts with a camera not seeing a face, and ends in the White House. It's a powerful reminder that our personal frustrations can sometimes be clues to massive, systemic problems. What feels like an individual inconvenience can be a thread that, when pulled, unravels an entire tapestry of inequality.
Olivia: Exactly. And it leaves us with a profound question Buolamwini poses throughout the book: As we build these powerful, world-changing systems, who are we forgetting to see? Who is being left out of the datasets, out of the design rooms, out of the conversation? And what is the cost of that invisibility?
Jackson: It's a cost measured in missed job opportunities, wrongful arrests, and the simple, human indignity of not being seen. It makes you look at the 'smart' tech in your own life differently. We'd love to hear from our listeners—have you ever had an experience where technology just didn't seem to be designed for you? Find us on our socials and share your story.
Olivia: Because as Joy Buolamwini has shown us, those stories matter. They are the first step toward building a more just and human future. This is Aibrary, signing off.