How SAG-AFTRA’s AI Road Map Works in Practice
Last year will likely go down as the moment that artificial intelligence came of age. In November 2022, OpenAI released the first iteration of its chatbot, ChatGPT. The world spent 2023 amazed by the potential of ChatGPT and grappling with its consequences.
But in the entertainment industry, the introduction of ChatGPT upended the 2023 collective bargaining negotiations between the major motion picture companies and the unions representing writers, directors and actors. At the beginning of 2023, AI was at best a peripheral bargaining topic, far behind compensation and residuals for high-budget streaming productions. By the end of 2023, some union members viewed AI as the beginning of the end, a doomsday scenario along the lines of Judgment Day in The Terminator franchise.
After completion of the 2023 film and TV bargaining cycle — punctuated by the writers and actors strikes last year — we have the first entertainment collective bargaining agreements (CBAs) addressing AI. The CBA with the most robust AI provisions is the 2023 SAG-AFTRA memorandum of agreement (MOA). While this agreement governs film and television production, SAG-AFTRA touches many different types of productions all around the world, from film and TV to commercials, video games and new media.
The AI provisions in the 2023 MOA will likely have a broad impact on content industries worldwide, thus warranting a closer read.
The Stakes
In conference rooms around the world (both real and virtual), the focus is not yet on what comes out of AI but what goes in. The currency of the moment is the AI dataset — what is fed into an AI model — and there are armies of lawyers formulating what goes into the model and what stays out. These lawyers are looking at potential AI inputs in light of a web of laws including contract, copyright, trademark and rights of publicity.
Some companies are also looking ahead to an AI backlash and resulting regulation. Some of these companies are considering AI datasets through both a legal and an ethical lens.
Many content companies have been quietly using some form of AI for years, primarily to generate visual effects. If you’ve seen a film where a body moves in a not-quite natural way, or where a character seems not quite human, that’s probably AI. But the popularization of AI has changed the stakes. And the technology is evolving quickly (see: OpenAI’s Sora, which can create hyperrealistic clips from a short text prompt).
Perhaps the most important point is the simplest: The use of AI in motion pictures is now a mandatory collective bargaining topic when dealing with unions in the entertainment industry. If you are working in unionized entertainment, you will need to either negotiate an AI provision with a union or follow the AI provisions already in place.
Going forward, media companies will almost certainly build out two categories of AI: models built on datasets that can be used without additional clearances or payments to third parties, and models requiring third-party consents and additional payments. To borrow Getty Images’ terms for licensing photos and video, there will be “Royalty Free” AI and “Rights-Managed” AI. Where an AI dataset has input governed by a CBA, its output will fall into the Rights-Managed category. What the entertainment unions are negotiating for in this area boils down to two words: consent and compensation.
Turning back to the 2023 MOA, its provisions divide AI into two categories: Generative Artificial Intelligence (GAI) and Digital Replicas.
How Datasets Factor in Under the Deal Terms
Let’s start at the end: The 2023 MOA does not require additional payments for including footage or voice recordings in a GAI dataset. But this doesn’t mean a producer will always be able to input footage into an AI model without consent. In the entertainment industry, CBAs such as the 2023 MOA only provide a contractual floor. Performers with sufficient clout can bargain for better terms, including additional compensation or to be excluded from datasets entirely.
With respect to datasets, Subsection B of the 2023 MOA’s new GAI provision reads: “The Producers agree to meet regularly with the Union during the term of the [CBA] to discuss appropriate remuneration, if any, with respect to photography and/or soundtrack recorded under these Agreements or any predecessor Agreement that is used to train a GAI system for the purpose of creating Synthetic Performers for use in new motion picture content.” As of now, payment for inclusion in a dataset is not required. The parties effectively punted on this issue.
The AI skeptic may ask: Why? Shouldn’t I be paid extra to have my performance included in an AI model? The producer’s response is twofold. First, your employer is already paying you for your work. In fact, the most recent CBA gave you a big raise. This is now just part of your work.
Second, the entire tech industry is creating datasets without paying anyone. These datasets are largely scraped from the internet, social media and other publicly available information. If you’ve been on social media, you’re already in one of those datasets. If tech companies are putting you in their datasets for free, this thinking goes, the people who pay you should be able to include you in their datasets without paying you more. (At least, that’s the line from employers so far.)
The debate over dataset compensation will continue as GAI evolves. As discussed below, while most actors won’t be paid for GAI inputs, they will be paid for certain GAI outputs.
How Synthetic Performers Will Be Governed
The 2023 MOA defines a “Synthetic Performer” as a digitally created asset that appears to be a “natural performer who is not recognizable as any identifiable natural performer,” is not voiced by an actor, and which is not a Digital Replica or the result of an actor employment agreement. In other words, a Synthetic Performer is a character created through GAI. Under the 2023 MOA, there are two types of synthetic performers: Generic and Recognizable.
If a producer uses any type of Synthetic Performer, the producer must give notice to SAG-AFTRA. SAG-AFTRA will then bargain over appropriate consideration “if any, if a Synthetic Performer is used in place of a performer.” That’s it with respect to a “Generic” synthetic performer. One can expect that there will be future negotiations, and perhaps labor arbitrations, over whether a producer owes SAG-AFTRA money, and how much, when it uses a Generic GAI character.
There are additional obligations for using “Recognizable Synthetic Performers.” A Synthetic Performer is recognizable if it includes an actor’s “principal facial feature (i.e., eyes, nose, mouth or ears)” that is requested through a “prompt to a GAI system.” For these recognizable AI characters, the producer is required to bargain with the performer and obtain their consent. So, if you ask an AI system to generate Will Smith’s facial features, you will need Will Smith’s permission and will have to negotiate with his agent. And if you ask for a character with a mix of Will Smith’s and Owen Wilson’s facial features, you’ll have to negotiate twice.
This distinction between recognizable and generic performers tracks back to prior SAG-AFTRA agreements. For many years, producers have been required to bargain before reusing footage in another project. You can’t just take footage from Iron Man and put it in The Avengers. But this “separate bargaining” obligation only applies “if the performer is recognizable and, as to stunts, only if the stunt is identifiable.” This principle of being “recognizable” has now been carried forward to AI.
In short, recognizable AI characters will require consent and bargained-for compensation. Generic AI characters remain up in the air, but the required bargaining on this topic is sure to be heated.
Actors vs. “Digital Replicas”
Another category of AI output contemplated by the 2023 MOA is a “Digital Replica.” There are two categories of digital replicas: an “Employment Based Digital Replica,” which is created by a producer, and an “Independently Created Digital Replica,” which is created by the actor. Both types of digital replica are meant to simulate “the voice and/or likeness of an identifiable natural performer, performing in the role of a character (and not as the natural performer himself/herself).” One example is creating a digital replica of a young Mark Hamill so you can put Luke Skywalker in The Mandalorian.
If a producer intends to use an Employment Based Digital Replica, the producer must provide advance notice and obtain the actor’s “clear and conspicuous” consent in a signed writing that is separate from the employment contract. There are rules governing how an actor is paid for that replica. But the most important question is: What happens if the actor won’t consent?
In most instances, actors will consent to what is already commonplace, using an AI replica to fix scenes that the actor shot, or for which the actor was paid to perform. The producer can go beyond this by seeking specific consent if the replica is used in the same project but “in new photography or soundtrack not previously recorded by a performer.” So, in most instances, if you want to use an AI replica to add new lines or a new scene for an actor you hired, the actor needs to approve. That consent must include “a reasonably specific description” of how the replica will be used. Importantly, once provided, these consents will continue after the performer’s death. A producer can also obtain consent from a deceased performer’s authorized representative. If the producer can’t identify the representative, it can contact SAG-AFTRA to seek consent.
If the producer wants to use an Employment Based Digital Replica for another project, including a sequel or prequel, the producer must once again obtain the actor’s consent and negotiate separate pay for that use. Again, consent and compensation.
For an Independently Created Digital Replica, or a replica created by the actor, a producer must negotiate with the performer or his or her authorized representative. Importantly, pension and health contributions must be remitted for these uses “as applicable.” This raises the question: If you are negotiating to use an AI model of a deceased actor, do you have to pay for health care even if the actor is no longer alive?
Digital Alteration
Digital alteration is used all the time in films, whether to erase acne or fix the lighting in a scene. This alteration may or may not involve AI. A producer does not need an actor’s consent to make such alterations “when the photography or soundtrack of the performer remains substantially as scripted, performed and/or recorded.” So you don’t need consent to erase crow’s feet, but you may need consent to put the actor in the background of a new scene. This consent can be included in the employment contract so long as the provision is separately initialed.
The Bottom Line
The phrase you will hear time and again is consent and compensation. Employment contracts for actors will now include consent provisions for AI use as a matter of course. Attorneys should draft and read these provisions carefully.
In some instances, you will need to obtain a separate contract or consent to insert an actor into a new scene or to change their dialogue. And if you recorded an actor for one project, you will need the actor’s consent and an additional payment to use an AI version of the actor in another project.
The AI provisions in the 2023 MOA are already impacting SAG-AFTRA’s bargaining in other areas, and are likely to be referenced in acting contracts around the world. And this is only the beginning. The entertainment industry’s collective bargaining over AI will continue for years to come.