While many people in the creative industries fear AI could take their jobs, Oscar-winning film director James Cameron is embracing the technology. Cameron, famous for the films "Avatar," "The Terminator" and "Titanic," has now joined the board of Stability AI, a leading player in the world of generative AI.
In Cameron's Terminator films, Skynet is an artificial general intelligence that has become self-aware and is determined to destroy the humans who attempt to disable it. Forty years after the first of these films, the director appears to be switching sides and allying himself with AI. So what's behind this?
Valued at around US$1 billion, Stability AI was, at least until recently, headquartered above a chicken shop in Notting Hill. It is best known for Stable Diffusion, a text-to-image tool that creates hyperreal images from users' text queries (or prompts). Now the company is moving into AI-generated video.
Cameron seems to see the company's work as a potential game changer for visual effects in film: "I was at the forefront of CGI over three decades ago and have stayed at the cutting edge ever since. Now the interface between generative AI and CGI imaging is the next wave," he said in a media release from Stability AI.
Filmmakers complement the live-action reality they shoot with two types of effects: special effects (SFX) and visual effects (VFX). They occur in two different phases of film production. During filming, SFX are any physical effects used to create a spectacle – explosions, blood spatters, vehicle crashes, prosthetics, mechanical movement of sets.
During post-production, VFX are the digital processes that add new elements to the live-action footage – computer-generated imagery (CGI), compositing, motion-capture rendering. They are also used to combine images shot separately.
A recent development in film technology, virtual production, has brought some VFX techniques forward into the shooting stage. It relies on so-called game engines – technology originally developed for creating video games. Actors are filmed in front of sophisticated LED walls that display dynamic, pre-produced virtual worlds around them.
The real-world physicality of SFX means that artificial intelligence will have very limited impact there. It is in VFX that AI could have a transformative effect. I will be speaking about deepfakes and AI in film at a public lecture on October 30, 2024: "Deepfakes and AI in film and media: Seeing is not believing".
We are also exploring the topic as part of the Synthetic Media Research Network, a group I co-lead that brings together filmmakers, academic researchers and AI developers. I spoke to a member of this collective, Christian Darkin, a VFX artist who now works as Head of Creative AI for Deep Fusion Films.
He sees the impact of generative AI on VFX in creating endless choices in post-production. In the future, filming the actors will only be the beginning. “You're going to add the background later, you're going to change the camera angles, you're going to change the facial expressions, you're going to increase the emotion in the acting, you're going to change the voices, the costumes, the people's faces, everything,” Christian told me.
A major motivation for the film industry to integrate AI into VFX is simple: the cost of traditional VFX. If you have ever sat through the end credits of a blockbuster, you will know how many VFX technicians are involved. Generative AI offers a more cost-effective way to achieve spectacular on-screen imagery, potentially with no loss of quality.
The likely result is that many VFX technicians will lose their jobs. However, from the conversations I've had with people working in these roles, I feel that their high level of skill and technical knowledge means they are likely to move on to new roles in emerging technology areas.
The Ethics of AI Technology
Media creators now have a huge selection of generative AI tools at their disposal, offering new ways to create images, text, voices and music. However, a key question surrounding the technology remains to be answered: were these AI tools developed ethically?
Every generative AI tool, from ChatGPT to Midjourney to Runway, is built on a foundation model that has been exposed to massive amounts of data, often scraped from the internet, to improve its performance. This process is called "training."
AI developers are building vast reservoirs of training data by deploying “crawlers,” bots that scour the Internet for useful material and download trillions of files for their own use. This may include books, music, images, spoken word and videos created by artists who retain copyright to their material.
Stability AI has been embroiled in a copyright dispute in the UK courts: Getty Images, owner of a huge collection of images and photographs, is currently suing the company.
A former Stability AI executive, Ed Newton-Rex, resigned in November 2023 over the company's position that using creative content to train its models without payment constitutes "fair use."
Perhaps Cameron believes that the AI developers will win the court cases against them and continue on their technological path. I asked Stability AI whether, before Cameron joined the company, it had scraped any of his creative material from the internet to use as training data for its foundation models – and whether it had asked his permission.
Their response was: “We cannot comment on the source of Stability AI’s training data.”
Cameron's Terminator films warned of the potentially catastrophic effects of rogue AI. But the director now appears convinced that he is backing the winning horse.
Dominic Lees, Associate Professor of Filmmaking, University of Reading
This article is republished from The Conversation under a Creative Commons license. Read the original article.