The Progress Illusion

12 min

Our Thousand-Year Struggle Over Technology and Prosperity

Golden Hook & Introduction

Joe: The Industrial Revolution. We think of steam engines, progress, prosperity. But what if, for the first hundred years, it was a complete disaster for almost everyone? What if real wages didn't rise, and people worked longer, harder, and died younger? That's the bomb this book drops.

Lewis: Wait, seriously? That's the opposite of every history class I've ever taken. We were taught that this was the great leap forward for humanity. Are you saying that's wrong?

Joe: For a huge portion of the population, for about a century, yes. And that's the provocative heart of the book we're diving into today: Power and Progress: Our Thousand-Year Struggle Over Technology and Prosperity by Daron Acemoglu and Simon Johnson.

Lewis: And these aren't just any authors. Daron Acemoglu is a giant in economics—he just won the Nobel Prize for his work on how institutions shape prosperity. So when these guys talk about power and progress, people listen. Though I've heard the book has definitely stirred up some debate.

Joe: Absolutely. It's been longlisted for major awards but also criticized by some as too pessimistic. It challenges a very comfortable story we tell ourselves. And it all comes down to one powerful and, frankly, dangerous idea they call the 'productivity bandwagon.'

Lewis: The productivity bandwagon... that sounds like a good thing, right? Like a parade of progress?

Joe: That's the myth! The idea is that any technology that makes us more productive—that lets us make more stuff with less effort—will automatically make everyone better off. A rising tide lifts all boats.

Lewis: Yeah, that makes intuitive sense. More efficient factories mean cheaper goods, which means everyone can afford more. That has to be good for everyone, doesn't it?

Joe: You'd think so. But the authors take us on a thousand-year tour of history to show that this bandwagon is often broken. In fact, for most of history, it's been more of a hearse for the working class.

The Great Progress Illusion: Why New Tech Often Made Life Worse

Lewis: A hearse? That's a strong word. Give me an example. How can a great invention make life worse?

Joe: Let's start with the dawn of the Industrial Revolution. Picture the late 18th century. The philosopher Jeremy Bentham comes up with this brilliant, efficient design for a prison called the Panopticon. It's a circular building where a single guard can see every prisoner without them knowing if they're being watched.

Lewis: Right, the ultimate surveillance machine. I know Foucault wrote about it. Creepy, but what does that have to do with factories?

Joe: Everything. The book argues the Panopticon's real legacy wasn't in prisons—it was never even built as intended. Its true application was in the new textile factories. Owners saw this model of constant surveillance and control as the perfect way to manage their workforce. They took skilled weavers, broke their craft into tiny, repetitive tasks, and put them in these massive, noisy buildings.

Lewis: So the factory wasn't just about using a steam engine, it was about controlling people.

Joe: Precisely. It was a technology of control. And the human cost was immense. The book quotes a weaver from 1834 who said, "No man would like to work in a power-loom, there is such a clattering and noise it would almost make some men mad; and next, he would have to be subject to a discipline that a hand-loom weaver can never submit to." For nearly a hundred years, from about 1750 to 1850, real wages for most workers were flat. They worked more hours, in more dangerous conditions, for the same or less pay. The productivity bandwagon had crashed.

Lewis: That's just brutal. It sounds less like progress and more like a nightmare. So all the wealth from these new factories just went to the owners?

Joe: Exactly. It went to a tiny elite. And this pattern repeats itself over and over. Take another classic invention story: the cotton gin.

Lewis: Oh, I know this one, a classic American invention story! Eli Whitney, 1793, revolutionizes cotton production.

Joe: A story of productivity, absolutely. But the book asks a devastating question: productivity for whom? They call it 'The Savage Gin.' Because that single invention, which made it vastly more profitable to grow cotton, didn't just fail to help workers. It actively intensified the brutality and scale of American slavery. The demand for enslaved labor exploded. The book documents how about a million enslaved people were forcibly moved to the new cotton plantations, where the work was more regimented and relentless than ever.

Lewis: Wow. So the technology itself wasn't the problem, it was the power structure it was dropped into—slavery—that dictated its horrific use. The tech just amplified the existing injustice.

Joe: You've nailed it. Technology is an amplifier. It amplifies the vision and the interests of whoever holds the power. If the powerful want to automate jobs to cut costs and control workers, technology will do that. If they want to intensify exploitation, it will do that too. Progress is never automatic. It's a choice. And for most of history, the choice was made by a very small group of people for their own benefit.

The Counter-Punch: How We Fought for and Won Shared Prosperity

Lewis: Okay, this is pretty bleak. If progress is an illusion and new tech just makes elites richer, how did we ever get the 40-hour work week, weekends, or a middle class? How did we escape this trap?

Joe: That's the hopeful part of the book. We didn't escape because of a new invention. We escaped because people fought back. The authors introduce this idea of 'countervailing powers.' It's the force that pushes back against the elite's control.

Lewis: So, a counter-punch to the people in power.

Joe: Exactly. And the book points to this fascinating 18th-century writer, John Thelwall. He looked at the horrible new factories and saw something nobody else did. He said that while they created misery, the very act of 'pressing men together' in these workshops created 'a sort of political society, which no act of parliament can silence, and no magistrate disperse.'

Lewis: Ah, so the factory was like a forced group chat for revolution! The bosses created the very conditions for their own opposition by putting all these disgruntled people in the same room.

Joe: That's a perfect analogy. And that's what happened. Over the 19th and early 20th centuries, workers organized. They formed unions. They fought for the right to vote. Movements like the Chartists in Britain demanded political representation. And slowly, painstakingly, they built enough power to challenge the elite's vision.

Lewis: So they got a seat at the table.

Joe: They forced their way to the table. And that's when the direction of technology itself started to change. The book highlights the 'glorious years' after World War II, from roughly 1945 to the 1970s. This was a period of unprecedented shared prosperity. And it was built on two pillars.

Lewis: Okay, what were they?

Joe: First, the technology itself was different. Think of the 'American System of Manufacturing.' Instead of just automating old jobs away, it focused on creating new tasks. The assembly line, for all its faults, created new roles for welders, for mechanics, for managers, for salespeople. It complemented human labor. Second, workers had real power. The New Deal in the US had strengthened unions, and they could now bargain for their share of the productivity gains.

Lewis: I think I see. So it's a two-part formula: you need technology that complements humans, creating new things for us to do, AND you need workers to have enough power to demand their fair share of the profits. You need both.

Joe: You need both. The book gives this amazing example from the 1960s. General Motors installed a new, numerically controlled drill—a piece of automation. They wanted to pay the operator the same as a manual drill operator. But the United Auto Workers union, the UAW, fought back. They argued that this was a new, more complex task that required more skill and deserved higher pay. They took it to arbitration and they won.

Lewis: That's incredible. The union was literally redirecting the benefits of technology back to the worker on the factory floor.

Joe: They were making sure the productivity bandwagon actually had passengers. And for a few decades, it worked. Real wages grew in lockstep with productivity. Inequality fell. The middle class boomed. It was a contested path, but it led to real, shared progress.

The Digital Déjà Vu: Are We Repeating History with AI?

Joe: Exactly. And that's why the last 40 years have been so damaging. The authors call it 'Digital Damage' because both parts of that formula broke down.

Lewis: How so? We have the most advanced technology ever. Isn't AI supposed to be the ultimate human-complementary tool?

Joe: That's the 'AI Illusion,' as the book calls it. The authors argue that the current direction of AI is not about complementing humans. It's overwhelmingly focused on three things: automation to cut costs, surveillance to control workers, and maximizing engagement to sell ads.

Lewis: The modern Panopticon.

Joe: The modern Panopticon in every Amazon warehouse, where workers are tracked down to the second. The book argues that most of today's AI is 'so-so automation'—it's just good enough to replace a human, but not good enough to create spectacular productivity gains. So you get the social cost of job loss without the broad economic benefit.

Lewis: And at the same time, the other pillar—worker power—crumbled.

Joe: It was dismantled. The book points to the 1980s, with leaders like Ronald Reagan firing the striking PATCO air traffic controllers, sending a powerful message to the private sector that it was open season on unions. This was combined with the rise of the 'shareholder value' ideology, championed by economists like Milton Friedman, which said a company's only responsibility is to its shareholders. Not its workers, not its community.

Lewis: So we're back in the early Industrial Revolution, but this time the 'robber barons' are tech billionaires and the 'dark, satanic mills' are our social media feeds, designed to make us angry to keep us scrolling.

Joe: That's a fantastic way to put it. The book argues that the business model of platforms like Facebook and YouTube is a 'socially bankrupt web.' Their algorithms are designed to maximize engagement, and outrage is a powerful driver of engagement. So they amplify misinformation, hate speech, and polarization because it's profitable. They are undermining democracy for profit.

Lewis: And the authors are saying this isn't an accident of the technology, it's a choice about its direction.

Joe: That's the core argument of Power and Progress. The struggle is the same. Technology is not destiny. Its path is a choice. The book is a call to action to redirect it. They lay out a three-pronged formula for how to do this, drawing lessons from the Progressive Era that took on the original robber barons and the environmental movement that is successfully redirecting energy technology. It involves changing the narrative, building countervailing power in civil society and labor, and implementing smart, specific policies.

Synthesis & Takeaways

Lewis: This is a massive, sweeping argument. It's both terrifying and a little bit hopeful. So what's the one big takeaway for us, right now, living in the middle of this AI revolution?

Joe: I think it's that we have to stop being 'mesmerized by tech billionaires,' as the book says. We can't just passively accept their vision of the future. Progress is never automatic. The direction of AI isn't a force of nature like the wind; it's a choice. And the book's most powerful message, which comes from a review by Nobel laureates Banerjee and Duflo, is that 'Humans... remain in the driver's seat. It is still our job to determine whether the vehicles we build are heading toward justice or down the cliff.'

Lewis: That's a heavy responsibility. It makes you wonder, for every new app or AI tool we use, who is it really serving? Who has the power? And what are we, as users and citizens, implicitly agreeing to when we click 'accept'?

Joe: A question we should all be asking. This is Aibrary, signing off.
