
Facebook's Lethal Carelessness
Introduction
Narrator: Imagine being tasked with managing the CEO of one of the world’s most powerful companies at a state dinner for over thirty world leaders. Now imagine that CEO is Mark Zuckerberg, and he’s whispering in your ear, "Why are there naked people at a state dinner?" as you walk past performers depicting ancient rituals. This is the chaotic reality Sarah Wynn-Williams, Facebook’s head of international policy, found herself in. The night only got worse, culminating in a blunt rejection from the Prime Minister of Canada, a series of ignored seating arrangements, and a desperate escape through a tunnel filled with galloping horses. This bizarre incident, as Wynn-Williams recounts, was a perfect metaphor for her seven years inside the tech giant: a hopeful comedy that spiraled into darkness and regret.
In her unvarnished memoir, Careless People: A Cautionary Tale of Power, Greed, and Lost Idealism, Sarah Wynn-Williams provides a stunning insider account of Facebook's rise. She reveals how a company that promised to connect the world instead became an engine of chaos, driven by leaders who were given superpowers but lacked the maturity or moral compass to wield them.
The Culture of Careless Power
Key Insight 1
Narrator: When Sarah Wynn-Williams, a former diplomat, first joined Facebook, she was struck by a profound culture clash. Her initial attempts to bring a sense of global statecraft to the company were immediately dismissed. A proposal to create a global advisory council of experts was rejected with the curt reply, "We make the decisions." This insular, engineer-centric culture was personified by Mark Zuckerberg himself.
Narrator: This disconnect was vividly illustrated during a visit from the Prime Minister of New Zealand. Wynn-Williams assumed Zuckerberg would host the head of state, but her suggestion was met with laughter. Mark, she was told, had "no interest in policy or politics" and was not to be bothered. When Zuckerberg did emerge, agitated by a security detail disrupting his engineers, he complained, "Some of my engineers are getting pushed around." He was then forced into an awkward handshake, his irritation plain. It was Sheryl Sandberg who saved the day, not with policy discussion but with personal charisma, advising the Prime Minister on his Facebook page and chatting about her own vacation plans. Wynn-Williams was left stunned by the "complete and utter lack of substance." The experience revealed the core of Facebook's culture: a place where engineering was king and the immense real-world consequences of its platform were treated as a secondary concern.
The 'Growth-at-All-Costs' Machine
Key Insight 2
Narrator: Facebook's core mission was relentless growth, an ethos embodied by Javier "Javi" Olivan and his team. They were the "beating heart" of the company, responsible for aggressive, rule-bending tactics like importing user contacts without explicit permission and creating the "creepy as hell" People You May Know feature. This "growth-at-all-costs" mentality eventually escalated into a dangerous new strategy.
In August 2015, facing regulatory headwinds in India for its Free Basics program, Mark Zuckerberg convened a meeting. He expressed admiration for Uber's "street fighter tactics"—weaponizing users for protests and compiling opposition research on journalists. He then mandated that Facebook adopt a similar approach, instructing his team to "go on the offensive" against governments. This included mobilizing users as activists, organizing protests, and creating "enemies lists" of anyone who opposed them. When asked who an "adversary" was, Mark’s reply was chillingly simple: "Anyone who opposes us." For Wynn-Williams, this was a horrifying turning point, a moment when the company she joined to connect the world decided to become an adversary to it.
The Hypocrisy of 'Lean In'
Key Insight 3
Narrator: While Sheryl Sandberg's book Lean In became a global phenomenon, the reality for women inside Facebook, especially working mothers, was starkly different. The author discovered a culture of "invisible" motherhood, where personal struggles were expected to be hidden. After returning from a difficult maternity leave, she received a negative performance review tied to her new baby. Sheryl's advice was not systemic support but a privileged solution: "Hire a nanny. Be smart and hire a Filipina nanny."
This pressure to hide personal life reached an extreme during childbirth. While in active labor, Wynn-Williams received an urgent request from Sheryl for talking points. Despite the protests of her husband and her doctor, she felt compelled to work between contractions, pleading, "Please let me push Send." She knew Sheryl wouldn't understand. The hypocrisy was further exposed in an incident on Sheryl's private jet. After a demanding week at Davos, Sheryl, in her pajamas, insisted the heavily pregnant author "come to bed" with her. When the author refused, Sheryl was furious, later confronting her with the words, "You should have got into bed." This incident, and the professional isolation that followed, revealed the dark side of Sheryl's inner circle: a place where personal boundaries were violated and obedience was paramount.
The China Compromise
Key Insight 4
Narrator: Mark Zuckerberg viewed entering China as the "last major problem" in his mission to connect the world, a territory to be conquered like a game of Risk. To achieve this, Facebook was willing to make profound ethical compromises. Internal documents the author discovered, part of an effort codenamed "Project Aldrin," revealed a pitch to the Chinese government that went far beyond economic benefits.
Facebook offered to help China "promote safe and secure social order," a commitment the author understood as an offer to facilitate state surveillance. The company was willing to store all Chinese user data within China, making it accessible to the government—a concession it refused to make for other countries. Most disturbingly, Facebook’s engineers built sophisticated censorship tools, including an "Extreme Emergency Content Switch" to suppress viral content during sensitive events like the Tiananmen Square anniversary. An internal document explicitly acknowledged that this collaboration could have dire consequences, stating that Facebook employees would be responsible for data responses that "could lead to death, torture and incarceration" for users. Despite this, leadership chose to prioritize direct control and communication with the Chinese government over the safety of its users.
The Autocracy of One
Key Insight 5
Narrator: For years, Facebook had a structured process for content moderation. That system began to crumble in late 2014, after the company blocked a rally page for Russian opposition leader Alexei Navalny. The decision, while consistent with policy, drew fierce criticism, and Mark Zuckerberg was furious that it hadn't been escalated to him personally.
This incident marked the beginning of a new era. Zuckerberg demanded that all sensitive content decisions from key countries be escalated directly to him. He began personally intervening, ordering the removal of content that did not violate Community Standards to appease governments in Mexico, Indonesia, and South Korea. This created "whiplash" among employees, especially when, in the wake of the Charlie Hebdo attack, Mark publicly declared, "I won't let [extremists] silence the voices and opinions of everyone else around the world." The new policy was formalized in February 2015: content would only be removed if there was a credible threat to block Facebook or arrest its employees. This dangerous edict effectively invited authoritarian governments to use coercion to control speech. The system of checks and balances was gone. Facebook had become, as the author concludes, an "autocracy of one."
Lethal Carelessness in Myanmar
Key Insight 6
Narrator: Nowhere were the consequences of Facebook's failures more catastrophic than in Myanmar. Because of deals that made Facebook the de facto internet for a newly connected population, the platform's flaws had an unparalleled and devastating impact. For years, the author and her team raised alarms about the systemic dysfunctions. Community Standards weren't translated into Burmese, reporting buttons were broken, and the platform didn't even support the local script, making moderation nearly impossible. Leadership consistently ignored pleas to prioritize fixes, instead focusing on projects like building censorship tools for China.
These "sins of omission" had fatal consequences. In 2017, the Myanmar military launched a campaign of atrocities against the Rohingya Muslim minority. A UN report later found that Facebook played a critical role, with over twenty pages detailing how the platform was used to spread hate speech and incite violence. The military had run a massive, organized operation to spread misinformation, which Facebook's algorithms then amplified. The UN investigators highlighted the very same systemic failures—inadequate moderation, language issues, and broken reporting tools—that the author's team had warned about for years. The author's final, damning conclusion is that the genocide was not a result of active malice from Facebook's leadership, but something far more insidious: they simply "didn’t give a fuck."
Conclusion
Narrator: The single most important takeaway from Careless People is that Facebook's global harm was not an accident or an unforeseen consequence of technology. It was the direct result of deliberate choices made by a leadership that consistently prioritized power, growth, and profit over ethical responsibility and human life. At every critical juncture, whether in China, India, or Myanmar, there was an opportunity to choose a different path, but the path of "lethal carelessness" was chosen instead.
The book leaves us with a chilling question as we enter the age of artificial intelligence. The same corporate culture and the same leaders who failed to responsibly manage a social network are now building the next generation of world-altering technology. If this level of carelessness was applied to social media, what will happen when it is applied to AI? The stakes, as the author warns, are now simply too high to ignore.