
The God Upgrade
12 min · A Brief History of Tomorrow
Golden Hook & Introduction
Joe: Here's a wild thought: for the average person today, the Coca-Cola bottle in your fridge is a bigger threat to your life than a terrorist, a soldier, or a criminal.
Lewis: Hold on, what? You're saying my afternoon soda is more dangerous than gunpowder? That sounds completely insane.
Joe: It sounds insane, but statistically, it's true. More people now die from obesity-related illnesses than from all forms of human violence combined. That single, mind-bending fact is the starting point for our entire conversation today.
Lewis: Wow. Okay, my mind is officially bent. Where does an idea like that even come from?
Joe: It comes from Yuval Noah Harari's incredible book, Homo Deus: A Brief History of Tomorrow.
Lewis: Right, the follow-up to his massive bestseller, Sapiens. What's so fascinating about Harari is that he's a historian. He's not some tech evangelist from Silicon Valley; he's looking at our future by analyzing millennia of human history, which gives these predictions a completely different kind of weight.
Joe: Exactly. And the book was incredibly influential but also polarizing. Critics and readers were torn: some found it brilliant and provocative, while others felt his predictions were too deterministic, almost dystopian. Today, we're diving into why it sparked such a huge debate. Harari's core point is that because we've solved our old problems, we're forced to ask a terrifying new question: what's next?
The New Human Agenda: Conquering the Old Gods of Famine, Plague, and War
Joe: For basically all of human history, life was defined by a trilogy of horrors: famine, plague, and war. They were seen as epic, uncontrollable forces of God or nature. You prayed, you endured, and you probably died.
Lewis: That sounds bleak. It's hard for us to even imagine that reality.
Joe: It is. Harari paints a brutal picture to make his point. Take the French famine between 1692 and 1694. It wasn't just a bad harvest; it was a complete societal collapse. An official in the town of Beauvais wrote about people eating cats, horse flesh pulled from dung heaps, and even the blood that flowed from slaughtered animals, mixed with filth.
Lewis: Oh man, that's grim. Just pure desperation.
Joe: Pure desperation. Fifteen percent of the French population, nearly 3 million people, starved to death. That was the norm. But today, Harari argues, natural famines don't really exist anymore. There are only political famines. We have the technology and global trade to feed everyone. If people starve, it's because a government or a warlord wants them to.
Lewis: That's a powerful distinction. So the problem isn't a lack of food; it's a failure of politics. But what about plagues? We just lived through a global pandemic. It felt pretty uncontrollable.
Joe: It did, but think about the difference. When smallpox arrived in Mexico in 1520, carried by a single African slave on a Spanish ship, it was like an alien invasion. The native population had zero immunity. Within nine months, the population of Mexico dropped from 22 million to 14 million. They had no idea what was happening; they blamed evil gods or black magic.
Lewis: And we had vaccines developed in under a year.
Joe: Exactly. For us, a pandemic is a solvable scientific challenge. For them, it was a metaphysical apocalypse. The same goes for war. For most of history, peace was just a temporary lull between conflicts. Now, in many parts of the world, war is almost unthinkable.
Lewis: I have to push back on that, Joe. Look at the world right now. War is definitely not unthinkable.
Joe: You're right, but Harari's point is about the economics of war. In the past, wealth was material: gold mines, wheat fields, slaves. You could profit from conquest. Think of Rwanda looting coltan mines in the Congo in the 90s to the tune of $240 million a year. That's old-school war.
Lewis: So what's different now?
Joe: Today, wealth is knowledge. Silicon Valley's wealth isn't in silicon deposits; it's in the minds of engineers and entrepreneurs at Google and Apple. You can't conquer that with an army. What are you going to do, invade California and force engineers to code for you at gunpoint? It doesn't work.
Lewis: That's a great point. The nature of wealth has changed, so the logic of war has changed. And terrorism?
Joe: Harari has this brilliant analogy. He says terrorism is like a fly trying to destroy a china shop. The fly is too weak to move a single teacup. So what does it do? It finds a bull, gets in its ear, and starts buzzing. The bull goes into a rage and smashes the entire shop.
Lewis: The bull being the state that overreacts.
Joe: Precisely. Terrorists kill a few people, but the state's overreaction, invading countries, persecuting minorities, causes far more damage. The real threat isn't the fly; it's the bull inside our own heads.
The Modern Covenant and the Rise of a New God: Dataism
Lewis: Okay, so if we've tamed famine, plague, and war, what do we do with all this newfound power and safety? What's the new human agenda?
Joe: This is where the book gets really wild. Harari argues we're aiming for nothing less than godhood. The new goals are immortality, bliss, and divinity.
Lewis: Immortality? That sounds like pure science fiction.
Joe: Is it? Think about it. Modern culture's most sacred value is human life. We see death not as a metaphysical fate but as a technical problem. A glitch. And what do we do with technical problems? We try to solve them. Billionaires like Peter Thiel and companies like Google are pouring billions into life-extension projects. Bill Maris, who ran Google Ventures, famously said they weren't just trying to gain a few yards in the game against death; they were trying to win the whole game.
Lewis: That's both inspiring and terrifying. But even if we solve death, what about happiness? We have more comfort and security than ever, yet suicide rates in wealthy countries are far higher than in many traditional societies.
Joe: And that brings us to the core of the book: the 'Modern Covenant.' Harari says that for thousands of years, we lived in a meaningful universe. There was a cosmic plan, written by gods or nature. Your life had a role, a purpose. You might be a suffering peasant, but your suffering had meaning within that grand drama.
Lewis: It's like being an actor in a play. You have a script, and you stick to it.
Joe: A perfect analogy. But modernity tore up the script. It declared that the universe is a blind, purposeless process. There is no grand cosmic plan. No inherent meaning. In exchange for giving up meaning, we got power. We are no longer actors on a stage; we are the scriptwriters, the directors, and the main characters.
Lewis: That sounds incredibly lonely. A universe with no meaning? That's the source of a lot of modern anxiety.
Joe: It is. But it's also the source of our power. Without a script, we can do anything. We can eradicate diseases, travel to the moon, and reshape the planet. But it leaves a huge void. And Harari argues that we've filled that void with a new religion.
Lewis: Let me guess. This is where 'Dataism' comes in? It sounds like a cult from a cyberpunk novel.
Joe: It kind of is! Dataism is the emerging belief that the universe consists of data flows, and organisms are just biochemical algorithms. The supreme value is no longer God or even Man, but the flow of information.
Lewis: Okay, break that down for me. How is capitalism a data-processing system?
Joe: Think about the Soviet Union. It was a centralized data-processing system. A few old men in the Kremlin tried to gather all the information and make all the decisions. What's the price of bread? How many shoes should we make? It was a disaster.
Lewis: Because they couldn't possibly process all that data effectively.
Joe: Exactly. Now think of capitalism. It's a distributed data-processing system. Millions of producers and consumers, each processing their own little bit of data, make decisions independently. The "invisible hand" of the market is just the flow of information. When Gorbachev's aide visited London in the 80s, he was baffled that there was no one 'in charge' of bread supply. The system just worked, because information flowed freely. Dataism says capitalism won the Cold War not because it was more ethical, but because distributed data processing is simply more efficient.
The Great Decoupling: The Future of a 'Useless' Class
Lewis: This Dataism thing sounds like it still puts humans at the center. We're the ones creating and processing all this data, right? We're the best algorithms around.
Joe: That's the humanist assumption we've been living with. But this is where Harari drops his biggest, most controversial bombshell: the 'Great Decoupling.'
Lewis: Decoupling of what from what?
Joe: Intelligence from consciousness. For all of history, high intelligence was always accompanied by consciousness. To play chess or drive a car, you needed to be a conscious human. But AI is changing that.
Lewis: You're talking about things like IBM's Deep Blue beating Garry Kasparov in chess.
Joe: That was the first step. But even then, humans programmed Deep Blue's strategies. The real game-changer was Google's AlphaGo. It taught itself to play the ancient game of Go by analyzing millions of games. In 2016, it defeated Lee Sedol, one of the greatest Go players in the world, with moves that human experts described as 'creative' and 'original,' yet it felt nothing. No joy in victory, no fear of defeat. It was pure, non-conscious intelligence.
Lewis: Okay, but that's a board game. What about jobs that require human feeling? Empathy, creativity, customer service? An algorithm can't do that.
Joe: Are you sure? There's a company called Mattersight that uses an algorithm to route customer service calls. It analyzes your tone of voice and word choice in the first few seconds to determine your personality type, whether you're introverted, emotional, and so on, and matches you with the human agent best suited to handle you. It's creating an emotional connection using pure data.
Lewis: That is deeply unsettling. So what does this mean for the job market? For us?
Joe: This leads to Harari's most chilling concept: the rise of the 'useless class.' Not useless in a moral sense, but economically and politically. As algorithms outperform us in more and more tasks, driving trucks, diagnosing diseases, composing music, even writing code, what is the economic value of most humans?
Lewis: That's a terrifying question. It's not just about losing a job; it's about losing your relevance in the world.
Joe: Exactly. And this undermines the entire liberal order. Liberalism values every individual because, in a capitalist system, every individual is a valuable producer and consumer. But if algorithms are the producers and consumers of the future, why should the system care about protecting the rights and feelings of economically useless humans?
Lewis: So the fear is a future with unprecedented inequality, where a tiny elite who own the algorithms have all the power and wealth, and the rest of us... are just there.
Joe: It's a future where humanity splits into two distinct species: a small class of upgraded superhumans, or 'Homo Deus,' and a massive, irrelevant class of Homo sapiens.
Synthesis & Takeaways
Lewis: So, let me see if I've got this right. We conquered our old problems of survival, which led us to aim for godlike powers. To get that power, we made a deal to live in a meaningless universe, and we filled that void with a new god called Data. And now, that very god we created might just program us into obsolescence. That's... a lot.
Joe: It is a lot, but Harari's ultimate point isn't a prophecy; it's a choice. He's a historian, and he's showing us the logical conclusion of our current trajectory. The question he leaves us with is: are these the goals we want? We are pursuing immortality, bliss, and divinity with immense fervor, but we've barely spent any time thinking about the consequences. We've gained the power of gods without the wisdom or the responsibility.
Lewis: It feels like the real challenge of the 21st century isn't technological at all. It's philosophical.
Joe: That's the core of it. The book forces you to confront the deepest questions. As algorithms get better at making decisions for us, what to study, who to marry, even what to feel, what happens to the value of human experience itself?
Lewis: That's a heavy question to end on. It makes you wonder: does this future excite you or terrify you? I think for me, it's a bit of both. We'd love to hear what you all think.
Joe: This is Aibrary, signing off.