
The Great Multiplier
Bold Solutions for a Broken System
Golden Hook & Introduction
Olivia: Most people think diversity is the answer to fixing broken systems. But what if I told you that for Black experts, being the 'diverse' person in the room is often a form of erasure, a way of being silenced in plain sight?

Jackson: Wait, erasure? That sounds like the exact opposite of what we're told. How can being more visible lead to being erased? That feels like a contradiction.

Olivia: It's a powerful and counterintuitive idea, and it's at the very heart of the book we're diving into today: The Black Agenda: Bold Solutions for a Broken System, edited by Anna Gifty Opoku-Agyeman.

Jackson: I've heard this book is a powerhouse. What's the story behind it?

Olivia: What's amazing is that Opoku-Agyeman, a young Ghanaian-American researcher and entrepreneur, curated this collection of essays from more than 30 Black scholars and experts. She did this in the wake of 2020, channeling the urgency of the pandemic's disproportionate impact and the global protests for racial justice into a blueprint for real, systemic change. It's been highly praised for that very reason.

Jackson: That context is everything. It wasn't just an academic exercise; it was a response to a crisis. So where does this idea of 'erasure' come from? Let's start there.
The Erasure and Necessity of Black Expertise
Olivia: It comes from the foreword, written by the brilliant sociologist Dr. Tressie McMillan Cottom. She lays out this foundational argument that institutions love to increase representation, to have more Black faces in the room, but they often fail to provide the actual power, status, and resources that would come with that position if the person were white.

Jackson: Okay, so it's tokenism, but with a sharper edge. You're visible, but you're powerless.

Olivia: Exactly. Cottom has this killer line: "Having more Black people in the room without extending them the commensurate status and power and resources... is erasure." You become a symbol of progress, which prevents any real progress from happening. You're there to perform diversity, not to wield expertise.

Jackson: That's heavy. Can you give me an example of what that actually looks like?

Olivia: The book describes a common experience for Black experts. They're invited to a meeting or a panel, but they're confined to what's called 'race talk.' A Black economist might be asked about racial inequality in the economy, but never about monetary policy in general. A Black climate scientist is asked about environmental justice, but not the core atmospheric science.

Jackson: Wow, so their expertise is siloed. They're only allowed to be an expert on their own identity.

Olivia: Precisely. Or their ideas are dismissed, only to be celebrated when a white colleague presents the very same concept later. It's a constant battle to have your earned status, your credentials and knowledge, override your inherited status in a society that still operates on racial hierarchies.

Jackson: I'm thinking about the editor's introduction now. She tells a personal story about being afraid to ask questions in school, because she feared it would reinforce negative stereotypes about Black women. It sounds like that same fear of not being seen as a true expert carries into the professional world.

Olivia: It's a perfect connection. That vulnerability is weaponized. The book is essentially built around the central question the editor poses after seeing Black experts ignored during the COVID-19 crisis, even as it ravaged their communities. She asks, "Do Black experts matter?"

Jackson: And the answer the book seems to give is, "Yes, but not in the way you think." It's not just about having them in the room.

Olivia: It's about recognizing that their lived experience is an inseparable part of their expertise. The book quotes James Baldwin: "Not everything that is faced can be changed, but nothing can be changed until it is faced." Black experts have faced the broken systems firsthand. Their knowledge isn't just theoretical.

Jackson: That makes so much sense. You can't fix a problem you don't truly understand from the inside out.

Olivia: And that's the launchpad for the entire book. It shows how centering this kind of expertise completely reframes our biggest problems.
Redefining the Problem: From Climate Change to Climate Justice
Jackson: Okay, so let's see it in action. You mentioned climate change. That's an issue that's usually framed as a universal threat to all of humanity. How does this book change that conversation?

Olivia: It demolishes that very idea. One of the essayists, the climate writer Mary Annaïse Heglar, has this incredibly sharp quote: "Climate change is not the Great Equalizer. It is the Great Multiplier."

Jackson: The Great Multiplier. What does she mean by that?

Olivia: She means it takes all the existing inequalities in our society, in housing, health, and wealth, and makes them exponentially worse. The book uses the work of Dr. Marshall Shepherd, a meteorologist, to explain something called the "weather-climate gap."

Jackson: A gap? Tell me more.

Olivia: Shepherd points out that due to historical injustices like redlining and discriminatory housing policies, Black communities are often located in environmentally vulnerable areas. They're in low-lying regions prone to flooding, or in urban "heat islands" with less green space and more pavement, making heatwaves more lethal.

Jackson: So it's like a pre-existing condition for a whole community. A hurricane or a heatwave is the virus that hits everyone, but it's going to be far more devastating for those who are already vulnerable.

Olivia: That's a perfect analogy. They have less insurance, fewer resources to evacuate or rebuild, and often receive less government aid after the fact. So when a climate-fueled disaster strikes, it doesn't equalize everyone; it multiplies the disadvantage that was already there. Your zip code becomes a more accurate predictor of your climate risk than your personal carbon footprint.

Jackson: That completely changes the focus. Suddenly, talking about metal straws and recycling bins feels inadequate.

Olivia: Deeply inadequate. The book argues that this makes climate change a fundamental civil rights issue. The solutions can't just be about green technology. They have to be about dismantling the systems that create these vulnerable communities in the first place.

Jackson: So what is the 'bold solution' here? Is it just about building better sea walls in those neighborhoods?

Olivia: That's part of it, but the vision is much bigger. It's about policy. It's about reparations for historical injustices. It's about ensuring that the transition to a green economy creates wealth and opportunity for these communities, not just for tech billionaires. It's about recognizing that you can't have climate justice without racial justice. They are the same fight.

Jackson: Wow. Okay, so that's a massive, present-day problem being reframed. But the book also looks to the future, right? I'm really curious about this 'algorithmic assault' idea you mentioned.
The New Frontier of Bias: Algorithmic Assault
Olivia: Yes, this is where the agenda gets really forward-looking and, frankly, a bit terrifying. The technology section argues that we are building the next generation of systemic bias right into the code that runs our world.

Jackson: I think a lot of people see technology as objective. Code is just math, right? How can it be biased?

Olivia: Because it's created by biased humans and trained on biased data from a biased world. Professor Brandeis Marshall coins the term "algorithmic assault" to describe what happens next. She calls it "codified attacks on Black bodies through digital mechanisms."

Jackson: That is a powerful phrase. Break that down for me. What does an 'algorithmic assault' actually look like for a regular person?

Olivia: The book gives a chilling example from the work of computer scientists Joy Buolamwini and Deborah Raji. They audited Amazon's facial recognition software, Rekognition. They found that while it was nearly perfect at identifying lighter-skinned men, its error rate for darker-skinned women was a staggering 35%.

Jackson: Thirty-five percent? That's not a small glitch. That's a fundamentally broken system. You could be misidentified as a criminal or denied access to a building, all because the algorithm literally wasn't designed for you.

Olivia: Exactly. And it gets worse. Think about AI systems used for loan applications, which are trained on historical lending data. If that data reflects decades of discriminatory lending practices, the AI learns to replicate that discrimination. It denies loans to qualified Black applicants because it sees a pattern of historical bias and mistakes it for a pattern of risk.

Jackson: So it's laundering old-school redlining through new-school technology and calling it an objective decision.

Olivia: You've got it. And the book points out the human cost of fighting this. It highlights the story of Dr. Timnit Gebru, a leading Black woman in AI ethics, who was famously ousted from Google after co-authoring a paper that raised concerns about the dangers of large-scale AI models, including their environmental impact and built-in biases.

Jackson: So the very people pointing out the problem are being silenced by the institutions creating it. It's the same pattern of erasure we talked about at the beginning.

Olivia: It's the exact same pattern, just happening on a new frontier. This is why the book calls for "digital smarts." It's not just about being tech-savvy; it's about being critically aware of how these systems can be used against you.

Jackson: This is so urgent. With AI becoming part of everything, from job applications to medical diagnoses, this isn't a future problem, is it? This is happening right now. So what's the solution? Do we just unplug everything?

Olivia: The solutions proposed are, again, systemic. They call for things like mandatory external audits of algorithms before they're deployed, strong legal protections and accountability for tech companies, and, most importantly, using precise language. Stop calling it "AI bias" as if it's a neutral, technical glitch. Call it what it is: automated discrimination.
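For readers who want to see the lending mechanism Olivia describes in concrete terms, here is a minimal, hypothetical sketch using synthetic data and a plain logistic regression. It is not drawn from the book or from any real lender's system; the group split, the redlined-zip proxy, and the denial rates are all invented for illustration. It shows how a model that never sees race can still reproduce historical discrimination, because a correlated proxy carries the bias in the training labels forward.

```python
# Hypothetical illustration with synthetic data: a model that never sees race
# can still reproduce historical lending discrimination through a proxy feature.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Two groups with identical creditworthiness on average.
group = rng.integers(0, 2, n)
credit_score = rng.normal(650, 50, n)

# Assumed proxy: group 1 was historically steered into redlined zip codes.
redlined_zip = (group == 1) & (rng.random(n) < 0.8)

# Historical labels: past lenders denied many redlined applicants
# regardless of creditworthiness.
approved = (credit_score > 620) & ~(redlined_zip & (rng.random(n) < 0.6))

# Train on "race-blind" features only: a scaled credit score and the zip proxy.
X = np.column_stack([(credit_score - 650) / 50, redlined_zip])
model = LogisticRegression().fit(X, approved)

# The learned model approves the two groups at very different rates,
# because the zip-code proxy carries the historical bias forward.
pred = model.predict(X)
for g in (0, 1):
    print(f"Predicted approval rate, group {g}: {pred[group == g].mean():.2f}")
```

Run as written, the model approves the non-redlined group at a far higher rate despite equal average credit scores, which is the laundering effect Jackson describes.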
Synthesis & Takeaways
Olivia: And that really ties everything together. Whether it's a corporate boardroom, a climate policy summit, or a tech lab in Silicon Valley, the book's message is the same.

Jackson: The people who design the system without the deep, lived-in expertise of those it harms will always, even unintentionally, replicate injustice.

Olivia: Exactly. The system will always default to serving those who built it. The entire Black Agenda is a call to fundamentally redesign those systems by redefining who gets to be the architect.

Jackson: So if there's one thing we should really take away from this incredibly dense and powerful book, what is it?

Olivia: I think it's this: The book argues that expertise isn't just about what you know; it's about what you've lived. And ignoring that lived experience isn't just an oversight; it's a deliberate choice that leads to broken systems. The solution isn't just about adding more voices or checking a diversity box. It's about fundamentally changing who gets to write the code, who gets to draft the policy, and who gets to define the problem in the first place.

Jackson: That's a profound shift in thinking. It really makes you think: in your own world, at your job or in your community, whose expertise is being centered, and whose is being erased?

Olivia: A question worth asking everywhere.

Jackson: A powerful and necessary book. Thank you, Olivia.

Olivia: This is Aibrary, signing off.