screen time

Soon You’ll Be Able to Make Your Own Movie With AI

Artificial intelligence isn’t about to change the movie industry. It already has.

Images generated by the AI platform Midjourney. Art: Johnny Weiss/Midjourney

There’s a new Knives Out movie on Netflix, and I still haven’t seen a few of this season’s awards contenders. But the film I most wish I could watch right now is Squid Invasion From the Deep. It’s a sci-fi thriller directed by John Carpenter about a team of scientists led by Sigourney Weaver who discover an extraterrestrial cephalopod and then die one by one at its tentacles. The production design was inspired by Alien and The Thing; there are handmade creature FX and lots of gore; Wilford Brimley has a cameo. Unfortunately, though, I can’t see this movie, and neither can you, because it doesn’t exist.

For now, Squid Invasion is just a portfolio of concept art conjured by a redditor using Midjourney, an artificial-intelligence tool that creates images from human-supplied text prompts. Midjourney was released into public beta over the summer and for months belched out mostly visual gibberish. “I was trying to make a picture of Joe Rogan fighting a chimp, and it just looked like nightmare fuel,” says the Reddit user, OverlyManlySnail, whose real name is Johnny Weiss. Then, in November, the software was upgraded to version four. It began effortlessly translating complicated suggestions (“DVD screengrab, ’80s John Carpenter horror film, an alien squid attacking a horrified Sigourney Weaver, blood everywhere, extra wide shot, outstanding cinematography, 16-mm.”) into imaginary film stills that look good enough to be real. Some of them look better than anything in Hollywood’s current product line: stranger, more vividly composed, seemingly less computer generated even though they’re completely computer generated.

Soon, Hollywood could be in direct competition with generative AI tools, which, unlike self-driving cars or other long-promised technologies that never quite arrive, are already here and getting better fast. Meta and Google have announced software that converts text prompts into short videos; another tool, Phenaki, can do whole scenes. None of these video generators has been released to the public yet, but the company D-ID offers an AI app that can make people in still photos blink and read from a script, and some have been using it to animate characters created by Midjourney. “In the next few years,” says Matthew Kershaw, D-ID’s VP of marketing and growth, “we could easily see a major movie made almost entirely using AI.” Someday, instead of browsing our Rokus for something to watch, we might green-light our own entertainment by pitching loglines to algorithms that can make feature-length films with sophisticated plots, blockbuster effects, and A-list human actors from any era.

One hurdle to this future is that whimsical user prompts are no substitute for good scripts. Somebody (or something) needs to tell the video generators what to generate for two hours. But progress is underway on that front, too, because it turns out that ChatGPT — the new AI chatbot that can write code, college essays, and instructional rap songs on how to change your motor oil — is also an aspiring screenwriter.

With Weiss’s permission, I asked ChatGPT to develop a plot for Squid Invasion. I described the concept images and told it to create an outline for the movie, which I’ll summarize: At a remote research lab in the ocean, scientists discover a species of alien squids, which are hyperintelligent and can regenerate their bodies after injury. The squids escape their containment tanks and kill several researchers. The humans fight back with guns and other weapons, but it only makes the squids angrier. The scientists destroy the lab with a reactor explosion that they hope will kill the squids too. The film ends with the survivors celebrating their narrow escape — and mourning their colleagues.

That may not pack much narrative surprise or subvert genre conventions, but it does imply that ChatGPT understands basic story logic in a way that eludes plenty of humans. It even, at my request, suggested a decent twist ending: Another alien race contacts the survivors and reveals the squids were a peaceful and misunderstood species.

What ChatGPT can’t do yet is write an actual screenplay. The software that powers most current AI language generators can process only about 1,500 words of text at a time, which makes it hard for them to produce coherent work at any greater length. But after many failed attempts, I got ChatGPT to draft some of Squid Invasion’s first scene.

Dr. Samantha Carter

These squids are incredible.

 

Dr. James Jones

Yeah, they’re definitely something. But we need to be careful. These deep sea creatures can be dangerous.

 

Dr. Mike Smith

I agree. We need to study them carefully and make sure they don’t pose a threat.

 

Dr. Carter

Oh no! The squids are attacking!

 

Dr. Jones

Grab the flamethrower.

Those lines are bad. But not so bad that I can’t imagine them being delivered in a perfectly enjoyable Gerard Butler movie. AI may never be Robert Towne, but with next-gen language bots expected next year, the writers of Black Adam should be nervous.

Some have argued that AI tools aren’t as clever as they seem, that they’re incapable of original thinking and can only parrot their training material. That may hinder them in some fields. But in Hollywood, shallow riffing on preexisting intellectual property is a cherished and lucrative skill. Some of the most acclaimed movies of 2022, including Top Gun: Maverick and Elvis, have the hermetically nostalgic tinge of AI creations.

A few filmmakers have already embraced the tech for certain applications. The director Scott Mann used machine learning in his 2022 thriller Fall, altering the actors’ mouths to eliminate swear words and avoid an R rating. Similar de-aging technology was used in next year’s Indiana Jones and the Dial of Destiny to make the 80-year-old Harrison Ford look 45. South Park creators Trey Parker and Matt Stone recently landed a $20 million investment for their new start-up, Deep Voodoo, an entertainment studio that will provide low-cost deep-fake visual effects. And for James Cameron’s Avatar: The Way of Water, the FX studio Weta deployed AI to give Na’vi characters realistic facial muscles that move in concert. “In previous systems, if we wanted to change a character’s smile, we had to go in and move all the pieces, and it was a lot of work to keep it from looking rubbery,” says Weta senior visual-effects supervisor Joe Letteri. “This got us to a natural place much sooner.” Letteri doesn’t expect AI to generate any Avatar movies by itself, though, at least not soon: “We had 1,600 VFX artists working on this movie and another 1,600 people in live action. We worked on it for five years. You’re not going to get that from a logline.”

But Hollywood agencies and law firms are preparing for a future in which clients like Weaver could be unwittingly cast in some redditor’s fever dream. “These tools are exciting, but what’s most important to us is that the companies behind them respect the talent and get consent for names, images, and likenesses,” says Joanna Popper, CAA’s chief metaverse officer. “We want to protect creators so that they have the opportunities to monetize their work and images and so others aren’t able to exploit them.”

The names of non-consenting artists could be banned as user prompts by AI generators. But that wouldn’t change the fact that many of the tools have already been taught by those artists’ work. Squid Invasion nails the aesthetics of sci-fi from the late ’70s and early ’80s because Midjourney’s training data likely includes stills from real movies of that era, among millions of other copyrighted images. “We’re talking about software that learns from content but doesn’t necessarily present the content that it learned from,” says Jeffrey Neuburger, an IP lawyer at Proskauer Rose LLP. “So who owns the copyright for the work it creates? This raises questions of fair use and also rights of publicity. This is one of those situations where the law is going to have to catch up” to new technology.

In other words, we need to study these tools carefully and make sure they don’t pose a threat. Grab the flamethrower.
