For horror fans, Late Night With the Devil marked one of the year’s most anticipated releases. Embracing an analog film filter, the found-footage flick starring David Dastmalchian drew praise for its top-notch production design, which leans into a ’70s-era grindhouse aesthetic reminiscent of Dawn of the Dead or Death Race 2000. Following a late-night talk show host airing a Halloween special in 1977, it had all the makings of a cult hit.
But the movie may be remembered more for the controversy surrounding its use of cutaway graphics created by generative artificial intelligence tools. One image of a dancing skeleton in particular incensed some theatergoers. Leading up to its theatrical debut in March, it faced the prospect of a boycott, though that never materialized.
The movie’s directors, Cameron and Colin Cairnes, defended the use of AI, explaining that the art was touched up by human hands. In a statement, they said, “We experimented with AI for three still images which we edited further and ultimately appear as very brief interstitials in the film.”
Less than a month later, five images suspected of being AI-generated, teasing postapocalyptic scenes in A24’s Civil War, sparked similar outrage among a segment of fans. There were a few telltale signs that the graphics were AI-created, chiefly blunders in landmark accuracy and consistency: The two Chicago Marina Towers buildings in one poster sit on opposite sides of the river; in another, a shot of wreckage shows a car with three doors.
In response, a reader on A24’s Instagram post wrote that the backlash to Late Night was “more than enough to make transparently clear to everyone: WE DO NOT WANT THIS.”
But in the entertainment industry, the Pandora’s box of AI has likely already been opened. Behind closed doors, most corners of production, from writers’ rooms to VFX departments, have embraced generative AI tools. For every project that has faced blowback for using AI in some part of the production pipeline, there are dozens more that have quietly adopted the technology.
“There are tons of people who are using AI, but they can’t admit it publicly because you still need artists for a lot of work and they’re going to turn against you,” says David Stripinis, a VFX industry veteran who has worked on Avatar, Man of Steel and Marvel titles. “Right now, it’s a PR problem more than a tech problem.”
“Producers, writers, everyone is using AI, but they are scared to admit it publicly,” agrees David Defendi, a French screenwriter and founder of Genario, a bespoke AI software system designed for film and television writers. “But it’s being used because it is a tool that gives an advantage. If you don’t use it, you’ll be at a disadvantage to those who are using AI.”
One of the reasons for the backlash to AI usage in Late Night and Civil War could be the precedent it appears to set. Hiring or commissioning a concept or graphic artist would’ve been a negligible cost for the productions involved. If companies are willing to use AI to replace such peripheral tasks — in the case of Late Night and Civil War, jobs that could have been accomplished by anyone on their production design teams — what positions are next? Writers? VFX artists?
“Most writers who have tried out AI have found it’s not a very good writer,” says David Kavanagh, executive officer of the Federation of Screenwriters in Europe (FSE), a group of writers guilds and unions representing more than 8,000 writers across 25 countries. “So I don’t see it replacing us yet, but the impact on other areas of the industry could be very damaging.” He points to areas like kids’ animation and soap operas, where there is a lot of “repetition of similar situations by the same set of characters,” as sectors that could be hard hit.
The displacement of lower-level workers in Hollywood likely plays a part in which AI uses are seen as acceptable and which are beyond the pale. Much of the discourse around the issue is filtered through the lens of Hollywood’s historic dual strikes last year. The use of AI tools in Civil War and Late Night meant artists missed out on work.
Some sectors of the industry are already threatened with extinction. “Dubbing and subtitling employment in Europe is finished,” says Kavanagh, pointing to AI technology that can produce lip-synced dubs in multiple languages, even using versions of the original actor’s performance. “It’s hard to see how they will survive this.”
In Cannes on Saturday, indie producer/distributor XYZ Films will present a sizzle reel of AI-translated trailers of international films, including Nordic sci-fi feature UFO Sweden, French comedy thriller Vincent Must Die and Korean action hit Smugglers, all showcasing TrueSync dubbing technology from L.A.-based company Flawless. Flawless and XYZ are pitching the tech as a chance for hit international films to cheaply produce a high-quality English-language dub that will make them more attractive for the global market. Flawless, XYZ Films, and Tea Shop Productions plan to roll out UFO Sweden worldwide in what they are calling the first large-scale theatrical release of a fully translated film using AI.
Meanwhile, Putin, a new political biopic from Polish director Besaleel, which is being shopped to international buyers in Cannes, uses AI tech to re-create the face of Vladimir Putin over the body of an actor with a similar build to the Russian leader. Besaleel says he plans to use the same technology, developed in-house by his postproduction company AIO, to create deepfake actors to play extras and supporting roles.
“I foresee that film and TV productions will eventually employ only leading and perhaps supporting actors, while the entire world of background and minor characters will be created digitally,” he says.
In Hollywood, the specter of AI casts a daunting shadow. A study issued in January surveying 300 leaders across the entertainment industry reported that three-fourths of respondents indicated AI tools had supported the elimination, reduction or consolidation of jobs at their companies. Over the next three years, nearly 204,000 positions are estimated to be adversely affected. Concept artists, sound engineers and voice actors stand at the forefront of that displacement. Visual effects and other postproduction work were also cited as particularly vulnerable.
There is also an imbalance in the resistance: AI usage drew fire in Civil War and Late Night but not, for example, in Robert Zemeckis’ upcoming Miramax movie Here, which will feature a de-aged Tom Hanks and Robin Wright. Their transformations were accomplished using a new generative AI-driven tool dubbed Metaphysic Live.
Deploying AI to allow actors to play younger or older versions of themselves could entrench A-list talent, who suddenly become eligible to play roles of all ages. Just as graphic artists may have lost out on work on Late Night, a young Tom Hanks look-alike could miss an opportunity to be cast in a major studio movie. Why hire Sophie Nélisse to play a younger version of Melanie Lynskey’s Shauna in Yellowjackets when the production can simply de-age the established star?
But where many see a threat, some see an opportunity. “We see AI as a tool and one we think will unlock creativity and opportunity, that will create jobs, not eliminate them,” says Motion Picture Association CEO Charles Rivkin, speaking to The Hollywood Reporter in Cannes, with the caveat that guardrails must be in place and copyright protected.
Rivkin, the former CEO of The Jim Henson Co., notes that the late, great Muppets creator was always on the cutting edge of technology. “If Jim were alive today,” says Rivkin, “he’d be using AI to do amazing things, using it to enhance his storytelling.”