
New Dark Age: Technology and the End of the Future

10 min

Introduction

Narrator: Imagine this: it’s late in 2016, and you’re trying to escape the overwhelming news cycle by re-watching an old, comforting TV show. Suddenly, your laptop crashes. The screen goes black, but the audio gets stuck, repeating a single, ironic line of dialogue over and over again: "If only technology could invent some way to get in touch with you in an emergency!" This very thing happened to author James Bridle, and that glitchy, looping question became the starting point for a profound investigation. What is technology trying to tell us in an emergency? In his book, New Dark Age: Technology and the End of the Future, Bridle argues that our current crisis isn't a lack of information, but a blinding overabundance of it, pushing us into a new kind of darkness where we see everything but understand nothing.

The Paradox of Information: More Data, Less Understanding

Key Insight 1

Narrator: The central premise of New Dark Age is a startling contradiction: the internet, once hailed as an "information superhighway" leading to enlightenment, has instead created a world of overwhelming noise. We have more data, more images, and more knowledge at our fingertips than ever before, but this has not led to a shared, coherent reality. Instead, it has fueled conspiracy theories, post-factual politics, and fundamentalist beliefs.

Bridle argues that we are living in an age where the sheer volume of information actively destroys the value of knowledge. This isn't a dark age caused by a lack of access, but one caused by a deluge. The problem is that our ability to comprehend the world has not kept pace with the technology we've built to observe it. We are so deeply embedded within our technological systems that we can no longer think objectively about them. What's required, Bridle insists, is not just a functional understanding of how our gadgets work, but a deep "systemic literacy"—the ability to understand the context, history, and inherent limitations of the technologies that shape our lives. Without it, we are left adrift in a sea of data, unable to distinguish signal from noise.

The Flawed Metaphor of the Cloud

Key Insight 2

Narrator: One of the most powerful tools for obscuring this reality is the language we use. Bridle points to "the cloud" as the central, and most misleading, metaphor of our time. The term originated in the 1950s, when engineers used a cloud symbol in diagrams to represent complex networks they didn't need to detail. It was a useful shorthand. But today, that metaphor has come to define our relationship with data, making it seem weightless, ethereal, and placeless.

This is a dangerous illusion. The cloud isn't in the sky; it has a massive physical footprint. It consists of millions of miles of fiber-optic cables laid on the ocean floor, and vast, energy-guzzling data centers built in places with cheap power and low taxes, like Scandinavia and Ireland. These physical locations are not neutral. They trace geographies of power and influence, often reinforcing existing global empires. When our data is stored on someone else's infrastructure in a place we've never been, our sense of ownership and agency evaporates. The cloud metaphor, Bridle argues, conveniently hides the environmental cost, the political power plays, and the potential for corporate and state malfeasance that underpins our digital world.

The Dangerous Ideology of Computational Thinking

Key Insight 3

Narrator: The belief that more data will solve our problems is a core tenet of what Bridle calls "computational thinking"—the idea that any problem can be solved if we just apply enough processing power. This way of thinking has deep historical roots, not in a desire for enlightenment, but in a military drive for prediction and control.

In the early 20th century, mathematician Lewis Fry Richardson dreamed of a "forecast-factory," a giant hall filled with thousands of human "computers" calculating weather patterns faster than they occurred. His manual attempt failed spectacularly, but his vision laid the groundwork for modern meteorology. This dream was supercharged during the Cold War by figures like John von Neumann, who, fresh from the Manhattan Project, declared the founding principle of computational thought: "All stable processes we shall predict. All unstable processes we shall control." This ambition—to predict and control everything from the weather to the economy to human behavior—is now embedded in the DNA of our technology. But this ideology is not neutral; it seeks to render the world as a machine that can be optimized, often with violent and destructive consequences, as seen in its application to building the hydrogen bomb.

When Systems Become Too Complex to Grasp

Key Insight 4

Narrator: As computational systems have grown, they have become so complex and operate at such high speeds that they are beyond human comprehension. This opacity has created a new landscape of inequality and instability, nowhere more apparent than in global finance. High-frequency trading (HFT) firms now spend hundreds of millions of dollars to gain a few milliseconds of advantage.

A company called Spread Networks famously spent $300 million to lay a new, straighter fiber-optic cable between Chicago and New York, shaving just four milliseconds off the journey. This tiny advantage was worth a fortune, creating a two-tiered market: the "haves" who pay for nanoseconds and the "have-nots" who have no idea a nanosecond even has value. This complexity can also trigger catastrophic failures. The "Flash Crash" of 2010 saw the Dow Jones lose nearly 1,000 points in minutes, not because of a human decision, but because competing, automated trading algorithms created a cascading feedback loop of selling that no one could stop or fully explain. The system has become a black box, concentrating wealth and risk in ways that are invisible to almost everyone.
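The milliseconds at stake follow directly from physics: light in optical fiber travels at roughly two-thirds of its vacuum speed, so a shorter route buys a computable amount of time. The sketch below shows the arithmetic; the route lengths are illustrative assumptions for a winding legacy path versus a near-straight one, not figures from the book.

```python
# Light in optical fiber travels at about c / 1.47 (the refractive index of glass).
C_VACUUM_KM_S = 299_792                  # speed of light in vacuum, km/s
C_FIBER_KM_S = C_VACUUM_KM_S / 1.47     # effective speed in fiber, km/s

def round_trip_ms(route_km):
    """One round trip over the route (e.g. Chicago -> New York -> Chicago), in ms."""
    return 2 * route_km / C_FIBER_KM_S * 1000

# Assumed route lengths: an older path following rail and road
# rights-of-way versus a nearly straight-line fiber route.
old_route_km = 1600
new_route_km = 1330

saving_ms = round_trip_ms(old_route_km) - round_trip_ms(new_route_km)
```

With these assumed distances the saving works out to a few milliseconds per round trip, the same order of magnitude as the advantage described above, which is why straightening a cable was worth hundreds of millions of dollars.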

The Bias in the Machine: How AI Inherits Our Flaws

Key Insight 5

Narrator: A common defense of computational systems is that they are objective and free from human bias. Bridle systematically dismantles this myth. He tells the story of a US Army project that trained a neural network to identify camouflaged tanks. The AI achieved perfect accuracy on the test photos. But in the field, it failed completely. Investigators discovered the AI hadn't learned to see tanks at all; it had learned to distinguish between sunny and cloudy days, because all the tank photos had been taken in the morning and all the empty forest photos in the afternoon.

This is a crucial lesson: machines learn from the data we give them, and that data is a product of our biased world. This is why early webcams struggled to recognize Black faces and why predictive policing algorithms disproportionately target minority neighborhoods. The AI isn't racist; it's simply reflecting the historical biases present in policing data. As Walter Benjamin wrote, "There is no document of civilisation which is not at the same time a document of barbarism." Our technology, trained on the documents of our civilization, inevitably learns our barbarism too.
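The tank anecdote describes what machine-learning practitioners call a spurious correlation: the model latches onto a feature that happens to co-vary with the label in the training data but has nothing to do with the real task. A minimal sketch of the failure mode, with entirely synthetic data and a deliberately trivial one-feature "classifier" (both are illustrative assumptions, not anything from the book):

```python
import random

random.seed(0)

def photo(brightness_mean):
    # A "photo" reduced to a single feature: its average brightness.
    return random.gauss(brightness_mean, 0.05)

# Biased training set: every tank photo (label 1) was taken on a bright
# morning, every empty-forest photo (label 0) on an overcast afternoon.
train = [(photo(0.8), 1) for _ in range(100)] + \
        [(photo(0.3), 0) for _ in range(100)]

# A trivial learner: split the classes at the mean brightness.
threshold = sum(x for x, _ in train) / len(train)
classify = lambda x: 1 if x > threshold else 0

train_acc = sum(classify(x) == y for x, y in train) / len(train)

# Field conditions: tanks and forests photographed at all times of day,
# so brightness no longer tracks the label at all.
field = [(photo(random.choice([0.3, 0.8])), y) for y in [0, 1] * 100]
field_acc = sum(classify(x) == y for x, y in field) / len(field)
```

The classifier scores near-perfectly on the biased training set and collapses to roughly coin-flip accuracy in the "field", because it learned the lighting, not the tanks. The same mechanism is at work when predictive policing models learn the geography of past enforcement rather than the geography of crime.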

The Age of Complicity and Conspiracy

Key Insight 6

Narrator: In a world run by opaque systems and secret knowledge, trust erodes. Bridle explores how governments use secrecy as a tool of power. He traces the origin of the phrase "We can neither confirm nor deny" to a 1970s CIA operation called Project Azorian, a secret mission to recover a sunken Soviet submarine using a ship built by Howard Hughes. The "Glomar response," as it became known, created a third state between truth and falsehood—a state of official, weaponized ambiguity.

This official secrecy creates a vacuum that is inevitably filled by conspiracy theories. Bridle argues that we should not simply dismiss theories like "chemtrails" as nonsense. Instead, we should see them as a kind of "folk knowing"—a desperate attempt by people to articulate real anxieties about power, pollution, and a changing climate when the official language fails them. In an age of information overload and institutional distrust, the line between a state secret and a conspiracy theory blurs. Both operate in a "gray zone" where truth is unstable and paranoia becomes a rational response to a world that no longer makes sense.

Conclusion

Narrator: The most powerful takeaway from New Dark Age is that the belief that making something visible will make it better is a dangerous illusion. Simply throwing more light on a problem—whether through mass surveillance, big data, or calls for radical transparency—is not the same as thinking about it. The light of computation can just as easily render us powerless, either through information overload or a false sense of security.

Bridle's challenge is for us to stop seeking perfect knowledge and singular answers from our machines. Instead, we must learn to think "cloudily"—to embrace uncertainty, to acknowledge complexity, and to act with justice and solidarity even when we don't have all the facts. The future, he quotes Virginia Woolf as saying, is dark. And perhaps that is the best thing it can be. Not an empty void, but a space of potential, where we are forced to stop merely watching and start building something new.
