
Wielding Data for Justice
Introduction
Narrator: In 1967, a brilliant young Black woman named Christine Darden started her career at NASA. With a master's degree in applied math, she was a "human computer," performing complex calculations crucial for the space race. Yet, she noticed a disturbing pattern: men with the exact same credentials were hired as engineers, with clear paths for promotion, while she and other women were stuck in dead-end computing pools. When she finally confronted her boss, his response was dismissive: "Well, nobody’s ever complained." Her experience, her lived reality of discrimination, was invisible to the system. It wasn't until years later, when an ally visualized the stark gender disparity in promotion rates with a simple bar chart, that management was shocked into action. Darden was promoted, eventually becoming the first Black woman at Langley to reach the top rank in the federal civil service.
This struggle—the need for data to make injustice visible and challenge entrenched power—is the central theme of the groundbreaking book Data Feminism by Catherine D’Ignazio and Lauren F. Klein. It argues that data is never neutral. Instead, it is a tool that can either reinforce existing inequalities or be wielded to dismantle them.
Data Is Shaped by Power
Key Insight 1
Narrator: The first principle of data feminism is to "Examine Power." The authors argue that data science is not an objective field floating above society; it's a product of it. And our society is a "matrix of domination," a term from sociologist Patricia Hill Collins, where power and privilege are unequally distributed based on race, gender, class, and ability. This inequality creates a "privilege hazard" in the tech world. Because data science is overwhelmingly dominated by white, able-bodied, cisgender men, the systems they build often reflect their own experiences and fail to account for the realities of marginalized groups.
A stark example is the work of MIT researcher Joy Buolamwini. When working with facial recognition software, she discovered it couldn't detect her dark-skinned face. The system only recognized her when she put on a white mask. Her subsequent research with Timnit Gebru revealed why: the training datasets were up to 84% white and 78% male. As a result, the software was up to 44 times more likely to misclassify a dark-skinned woman than a light-skinned man. This isn't just a technical glitch; it's a reflection of who holds the power to design technology and whose reality is considered the default.
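Buolamwini and Gebru's study points to a general auditing practice: disaggregate a model's error rate by subgroup instead of reporting one overall accuracy. Here is a minimal sketch of that idea in Python, using made-up numbers rather than the actual Gender Shades measurements:

```python
# Hypothetical audit results for a face classifier, broken out by subgroup.
# (Illustrative numbers only, not Buolamwini and Gebru's data.)
results = {
    "lighter-skinned men":  {"correct": 992, "total": 1000},
    "darker-skinned women": {"correct": 653, "total": 1000},
}

# Per-subgroup error rates reveal the disparity...
for group, r in results.items():
    error = 1 - r["correct"] / r["total"]
    print(f"{group}: {error:.1%} error")

# ...while a single aggregate figure hides it.
overall_error = 1 - sum(r["correct"] for r in results.values()) / sum(
    r["total"] for r in results.values()
)
print(f"overall: {overall_error:.1%} error")
```

The 44-fold disparity the study reported is exactly the kind of gap that only becomes visible once accuracy is broken out this way.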
Data Can Be Used to Challenge Power
Key Insight 2
Narrator: While data can reinforce oppression, it can also be a powerful tool for resistance. The second principle, "Challenge Power," advocates for using data science to hold institutions accountable and work toward justice. This often involves collecting "counterdata" when official sources fail.
A historical example is the work of the Detroit Geographic Expedition and Institute (DGEI) in the late 1960s. Community organizer Gwendolyn Warren and local Black youth knew that white commuters were killing Black children in traffic accidents, but there was no official data. The DGEI collaborated with geographers to access police records and create their own map, provocatively titled "Where Commuters Run Over Black Children." The map used sharp black dots to visualize the deaths, making the structural problem undeniable. This stands in stark contrast to the infamous "redlining" maps of the same era, where powerful institutions used cartography to deny loans to Black neighborhoods, institutionalizing racial inequality. The DGEI's map shows how the same tool—a map—can be used by a community to fight for justice.
The Myth of Objectivity Must Be Dismantled
Key Insight 3
Narrator: Data science has long been obsessed with objectivity, a "view from nowhere" that feminist scholar Donna Haraway calls the "god trick." The third principle, "Elevate Emotion and Embodiment," challenges this idea. It argues that all data visualizations are rhetorical—they make choices that frame a story. Rather than shunning emotion, data feminism embraces it as a valid and powerful way to communicate truth.
In 2013, the design firm Periscopic created a visualization of gun deaths in the United States. Instead of a sterile bar chart, it showed an animated arc for each life lost, displaying the victim's age and the "stolen years" they could have lived. The visualization was framed around the emotion of loss, allowing viewers to feel the scale of the tragedy. While some critics argued it wasn't "neutral," the project powerfully communicated the human cost of gun violence in a way a simple chart never could. It proves that data communication can be both factually rigorous and emotionally resonant, leading to a deeper, more human understanding.
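The "stolen years" framing rests on a simple calculation: the life a victim could statistically have expected beyond their age at death. Periscopic's actual piece estimated each victim's likely lifespan probabilistically; the sketch below simplifies that to a single flat life-expectancy figure, chosen purely for illustration:

```python
ASSUMED_LIFE_EXPECTANCY = 78.7  # illustrative U.S. figure, not Periscopic's model

def stolen_years(age_at_death: float) -> float:
    """Expected years of life lost, under a flat life-expectancy assumption."""
    return max(ASSUMED_LIFE_EXPECTANCY - age_at_death, 0.0)

# Summing over victims gives the aggregate figure the visualization animates.
ages_at_death = [19, 34, 52]
total_stolen = sum(stolen_years(age) for age in ages_at_death)
print(f"{total_stolen:.1f} stolen years across {len(ages_at_death)} deaths")
```

The same number could sit in a table cell; what made the visualization land was pairing this arithmetic with the emotional frame of loss.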
Hierarchies and Binaries Must Be Re-thought
Key Insight 4
Narrator: The fourth principle, "Rethink Binaries and Hierarchies," scrutinizes how classification systems themselves can be tools of oppression. As geographer Joni Seager states, "what gets counted counts." When systems force complex identities into rigid boxes—like the gender binary—they erase and harm those who don't fit.
Narrator: Sasha Costanza-Chock, a nonbinary design professor, describes experiencing what legal scholar Dean Spade calls "administrative violence." When they go through an airport scanner, a TSA agent must press either a blue "male" or pink "female" button, loading a normative statistical profile of a gendered body against which the scan is compared. Because Costanza-Chock's body doesn't conform to either profile, the machine flags them for a pat-down. The system isn't just flawed; it's designed around a binary that pathologizes and punishes non-normative bodies. This principle extends to everyday design, like the notoriously small pockets in women's jeans, a subtle but pervasive design choice that reinforces gendered limitations. Data feminism insists on questioning these categories and designing more inclusive systems.
Context Is Everything
Key Insight 5
Narrator: Data never exists in a vacuum. The book's sixth principle, "Consider Context," argues that numbers do not speak for themselves. Data is always "cooked" by the social, historical, and institutional environment in which it was created. Ignoring this context can lead to disastrous misinterpretations.
In 2014, the news site FiveThirtyEight published a story claiming that kidnappings in Nigeria were skyrocketing, using data from a massive database called GDELT. The report was quickly debunked. GDELT doesn't count unique events; it counts media reports about events. The recent kidnapping of 276 schoolgirls by Boko Haram had generated immense global media coverage, and GDELT had logged each news story as a separate "pseudo-event." FiveThirtyEight had mistaken media chatter for a crime wave. This failure to understand the data's context—its "knowledge infrastructure"—led to a completely false conclusion, illustrating the dangers of what the authors call "Big Dick Data": projects that fetishize size and scale while ignoring crucial context.
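The GDELT mistake can be made concrete: counting rows in a database of media reports inflates one widely covered incident into many. A small sketch of the distinction (the field names here are hypothetical, not GDELT's actual schema):

```python
# One real kidnapping (E1) covered by three outlets, plus two other incidents.
reports = [
    {"event_id": "E1", "outlet": "Outlet A"},
    {"event_id": "E1", "outlet": "Outlet B"},
    {"event_id": "E1", "outlet": "Outlet C"},
    {"event_id": "E2", "outlet": "Outlet A"},
    {"event_id": "E3", "outlet": "Outlet B"},
]

# Naive count: every media report looks like a separate incident.
report_count = len(reports)  # 5 "pseudo-events"

# Context-aware count: deduplicate by the underlying event.
unique_events = len({r["event_id"] for r in reports})  # 3 actual incidents

print(report_count, unique_events)
```

FiveThirtyEight's error was, in effect, publishing the first number as if it were the second.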
The Invisible Labor of Data Must Be Made Visible
Key Insight 6
Narrator: The seventh and final principle, "Make Labor Visible," challenges the myth of the solitary data genius. Data science is the product of many hands, but much of that work is rendered invisible. This "ghost work" includes everything from data entry and content moderation to the extraction of minerals needed for our devices.
In their project "Anatomy of an AI System," researchers Kate Crawford and Vladan Joler created a massive diagram tracing every resource required to produce a single Amazon Echo. Their work exposed the "fractal chains of production and exploitation," from child labor in Congolese cobalt mines to underpaid workers in Chinese factories and precarious "microworkers" on platforms like Amazon Mechanical Turk labeling data to train the AI. This invisible labor is disproportionately performed by women and people of color in the Global South. Making this labor visible is a feminist act that forces us to confront the true human and environmental cost of our data-driven world.
Conclusion
Narrator: The single most important takeaway from Data Feminism is that data is inseparable from power. It is not an abstract, objective truth, but a product of human choices, biases, and social structures. Because of this, it can be—and has been—used to surveil, control, and oppress. However, the seven principles of data feminism offer a powerful alternative. They provide a framework for using data to challenge those same power structures, to expose injustice, and to build a more equitable world.
The book leaves us with a profound challenge: to stop seeing data as a neutral force and start seeing it as a reflection of our values. The next time you encounter a statistic, a chart, or an algorithm, don't just ask what it shows. Ask who it empowers, who it harms, and whose labor made it possible. By asking these questions, we can begin to multiply the work of data feminism in our own lives.