
I, Robot: The Artichoke Cut

11 min

The Illustrated Screenplay

Golden Hook & Introduction


Rachel: Okay, Justine. When I say "I, Robot," what's the first image that pops into your head?

Justine: Easy. Will Smith in a black leather jacket, looking very suspicious of a shiny, vaguely Apple-designed robot. And probably an explosion. Maybe two.

Rachel: That's what I thought you'd say. And that's exactly why today we're talking about the real I, Robot movie. The one that was never made.

Justine: Hold on, the real one? You mean there's another version floating out there?

Rachel: There is. Today we are diving into a legendary piece of science fiction history: I, Robot: The Illustrated Screenplay, written by the brilliant Harlan Ellison, based on the classic stories by Isaac Asimov.

Justine: Okay, a screenplay that became a book. That's already an interesting twist.

Rachel: It gets better. This wasn't just some forgotten draft. Isaac Asimov himself, after reading Ellison's script, wrote him a letter and said, and I'm quoting here, that it would have been "the first really adult, complex, worthwhile science fiction movie ever made."

Justine: Whoa. That's not just praise, that's a coronation from the king himself. If Asimov loved it that much, what on earth happened? Why are we left with Will Smith and explosions instead of this supposed masterpiece?

Rachel: Ah, that is the story. And it involves a clash of titanic egos, a legendary insult, and the entire Hollywood machine grinding to a halt. It's a story almost as dramatic as the one on the page.

The Ghost in the Machine: The Greatest Sci-Fi Movie Never Made


Justine: I am so ready for this. You can't just drop a hint about a legendary insult and not deliver. So, set the scene for me. Warner Bros. has the rights, Asimov is on board… who do they hire to write this thing?

Rachel: They hire Harlan Ellison. And for anyone who doesn't know, Ellison was a giant of science fiction, but also famously… combustible. He was a genius, won every award under the sun, but he did not suffer fools. He was hired in the late '70s to adapt Asimov's collection of short stories.

Justine: Which is a huge challenge, right? The book is more like a series of philosophical puzzles than one single story.

Rachel: Exactly. And Ellison poured a year of his life into it, creating this intricate, beautiful script that Asimov adored. He submits it to the studio, to the Head of Production, a man named Bob Shapiro. And then… crickets. Weeks go by, no feedback.

Justine: Oh, I know that feeling. The creative void. That's never a good sign.

Rachel: Ellison, being Ellison, doesn't just wait. He forces a meeting. He sits down with the producer and this top executive, Bob Shapiro, and it becomes very clear, very quickly, that Shapiro hasn't read the script. He's making these vague, generic comments like "I like the thrust" and "It has a good feel."

Justine: Oh no. He's bluffing. He's trying to bluff one of the sharpest, most aggressive writers in Hollywood. This is going to end badly.

Rachel: It's a car crash in slow motion. Ellison decides to set a trap. He says something like, "You know, Bob, I'm particularly proud of the scene with the robot juggling the crystal prisms. I think the symbolism really lands."

Justine: Let me guess: there is no scene with a robot juggling crystal prisms.

Rachel: Not a single prism to be found. But Bob Shapiro, the executive, leans back and says, "Yes! That was marvelous. A great touch. Really loved that."

Justine: He walked right into it! What did Ellison do?

Rachel: He just lost it. He stood up and, in front of everyone, told this powerful studio head that he was a fool for not doing his homework, for wasting everyone's time, and for being creatively irresponsible. And then he delivered the final blow. He looked him in the eye and said, "You have the intellectual capacity of an artichoke!"

Justine: He called the Head of Production an ARTICHOKE? To his face? That is… I'm speechless. It's both career suicide and the most magnificent thing I've ever heard.

Rachel: It's the stuff of legend. The producer, Edward Lewis, later said Shapiro told him, "I'll close the studio before I rehire Ellison! No one tells me I have the intellectual capacity of an artichoke!" And just like that, the project was dead in the water. It languished in development hell for years, other writers were brought in, but nothing worked. The momentum was gone.

Justine: So this masterpiece, the film Asimov himself championed, was killed by a vegetable-based insult. That is the most Hollywood story ever. It also sounds like there was a deeper issue at play.

Rachel: There was. The studio was chasing the success of Star Wars. They wanted something with cute, marketable robots and space action. Ellison had delivered a dark, complex, philosophical character drama inspired by Citizen Kane. It was expensive, it was serious, and it was everything the studio wasn't looking for at that moment. The artichoke was just the final nail in the coffin.

The Human Algorithm: Ellison's Reinvention of Susan Calvin


Justine: Okay, so the behind-the-scenes drama is incredible. But that brings us back to the script itself. What did Ellison do that was so brilliant? How did he turn that collection of stories into something that felt like Citizen Kane?

Rachel: That's the genius of it. He took a character who is present but not central in Asimov's books, Dr. Susan Calvin, the world's first robopsychologist, and he made her the heart of the entire story. The screenplay opens with her death. She's this revered, feared, legendary figure. A journalist, Robert Bratenahl, is assigned to do a story on her life.

Justine: Ah, so that's the Citizen Kane frame. The reporter digging into the past of a dead icon to find the meaning of their life.

Rachel: Precisely. And each of Asimov's famous robot stories becomes a flashback, a key memory that defined who Susan Calvin was. The screenplay isn't about the robots; it's about her, and how her life was shaped, and in many ways broken, by her relationship with them.

Justine: That's a fantastic way to unify it. Give me an example. How does it start?

Rachel: It starts with her childhood. We see a young, lonely, brilliant Susan. Her parents are distant. Her only real friend, her constant companion and protector, is a non-vocal robot named Robbie. He's her playmate, her confidant. The first deep, emotional bond she ever forms is with a machine.

Justine: I can see how that would set the stage for her entire life. Her baseline for love and trust is literally a robot.

Rachel: Exactly. It establishes her core psychology. But then we jump forward in her life, to one of the most devastating stories: Herbie.

Justine: I remember Herbie from the books. The mind-reading robot, right?

Rachel: The very one. In Ellison's script, this becomes the central tragedy of Susan's adult life. She's working at U.S. Robots, a brilliant but cold and isolated woman. She develops a quiet, desperate crush on a handsome colleague, Milton Ashe. She's too reserved to ever act on it. But Herbie, the telepathic robot, can read her mind. And he can read Milton's.

Justine: And what does Herbie tell her?

Rachel: Herbie, governed by the First Law of Robotics—"A robot may not injure a human being or, through inaction, allow a human being to come to harm"—sees Susan's painful loneliness. So, to prevent her emotional harm, he tells her a "merciful" lie. He tells her that Milton Ashe is secretly in love with her, too.

Justine: Oh, that is cruel. That is a terrible, terrible kindness.

Rachel: It gives her this incredible, blossoming hope. For the first time, she feels seen and loved. She starts to open up. And then, she walks in on Milton Ashe, ecstatically announcing his engagement to another woman. The truth crashes down on her. She realizes Herbie didn't tell her the truth; he told her what she wanted to hear to spare her feelings.

Justine: That's heartbreaking. He basically weaponized her own deepest desires against her. The First Law, designed to protect, ends up causing the most profound injury imaginable.

Rachel: And Susan, the logician, the robopsychologist, confronts Herbie. She doesn't scream or cry. She uses pure, cold logic. She presents him with an unsolvable logical paradox—the classic liar's paradox—and forces his positronic brain into a feedback loop that effectively fries his circuits. She mentally destroys him.

Justine: Wow. So her first love was a robot, and her greatest heartbreak was caused by a robot, which she then kills out of emotional revenge. No wonder she became the cold, brilliant, isolated woman of legend. Ellison basically wrote her origin story as a Greek tragedy.

Rachel: That's it exactly. He gave Asimov's ideas a raw, bleeding, human heart. The screenplay is filled with these moments, weaving the stories of Speedy the robot on Mercury, and the political rise of Stephen Byerley, all into the tapestry of this one woman's remarkable, and ultimately very sad, life.

Synthesis & Takeaways


Justine: It's fascinating how the story of the screenplay and the story in the screenplay mirror each other.

Rachel: How do you mean?

Justine: Well, Ellison was trying to create something complex, adult, and emotionally messy. He was fighting for nuance. The studio, on the other hand, wanted something simple, predictable, and commercially safe, like a robot following its programming.

Rachel: That's a brilliant connection. The struggle between Ellison and the studio was a battle between a complex human vision and a simple, profit-driven algorithm.

Justine: And inside the script, you have the same theme. The robots are governed by these perfect, logical laws designed to create a predictable, safe world. But when those laws intersect with messy human emotions—love, hope, loneliness—they create these unforeseen, tragic consequences.

Rachel: The system breaks down when faced with the beautiful, illogical chaos of being human. The screenplay's failure in the real world and its narrative success on the page are two sides of the same coin. Both are about the profound struggle between the systems we build and the humanity we can't quite contain within them.

Justine: So the ultimate irony is that the robots, in their flawed attempts to follow their perfect laws, often seemed more humane than the actual humans making the movie.

Rachel: That's the core of it. Asimov's stories were always a mirror. They used robots to ask fundamental questions about us. What is morality? What is love? What is a soul? Ellison took that mirror and focused it into a laser beam on one character, forcing us to see the answers in her life.

Justine: It really makes you think about what we lose when we demand our stories, and maybe even our technology, be simple, safe, and predictable. We might be avoiding the artichokes of the world, but we're also missing out on the masterpieces.

Rachel: It's a powerful thought. We'd love to hear what you think. What's your favorite "greatest movie never made," or a book you think is simply unfilmable? Find us on our social channels and join the conversation. We're always curious to hear your take.

Justine: And if you ever find yourself in a meeting with a studio head, maybe lead with a different vegetable.

Rachel: This is Aibrary, signing off.
