[The following article and
two sidebars appeared in the Summer 2022 issue of SAG-AFTRA (11.3), the quarterly magazine of the
union that represents actors in television and film.
[Readers should note that I
have published past posts on the subject of the use of computer and advanced
technologies in TV, film, and theater. I
refer to “Theater and Computers,” posted on Rick On Theater on 5 December 2009, and “Computers and Actors,” 4 and
7 October 2021.
[For those who aren’t part
of the performance industry in the United States, SAG-AFTRA, the result of the
merger of two formerly independent unions, stands for Screen Actors
Guild-American Federation of Television and Radio Artists.]
Imagine waking up to
find you are the face of a new advertising campaign — and it’s a product you
don’t want to be associated with. That was played for laughs on an episode of Friends,
when Joey did some modeling and then, unbeknownst to him, ended up the face of
an STD campaign [“The One Where Underdog Gets Away”; Season 1, Episode 9; 17 November
1994].
As technology has evolved, artificial intelligence-powered software has made it possible to create realistic video and audio content known as “deepfakes.” This technology makes the above scenario not only possible, but a real threat to those who sign broadly written non-union contracts that allow for unfettered use of a performer’s image or voice.
In 2018, SAG-AFTRA magazine reported on the threats and opportunities posed by the growth of digital replicas.
At that time, it was still a relatively niche technology, but in the last few
years it has gone mainstream. Nonconsensual deepfakes remain a problem — one
the union remains vigilant about — but the underlying technology has many
legitimate uses that can provide exciting new opportunities for members.
AI-Generated Content Is Growing
In recent years,
there has been an explosion in the number of artificial intelligence, or “AI,”
content technologies, and the quality of AI-generated content has improved
exponentially.
AI tech has been
used in large- and small-budget entertainment projects to virtually age and
de-age characters in a way that is cleaner, cheaper and more believable than
traditional visual effects, and without countless hours in the makeup chair. AI
can simplify performance capture, potentially eliminating the need for capture
suits and head rigs altogether. It can even be used to enhance the work
performed by stunt performers, which can provide safety benefits.
In dubbing and ADR
[presumably automated dialogue replacement, a post-production process in
filmmaking], AI technologies can help match an actor’s mouth and facial
movements to the dialogue they are speaking. It can also be used to dub the
films themselves. In the context of projects originally produced under
SAG-AFTRA agreements, it could open new revenue streams by providing members an opportunity to negotiate for their voices to be used in the foreign-language release. On the other hand, distributors of foreign content
who would otherwise hire SAG-AFTRA members to do dubbing work might find it preferable
to use AI-generated audio of the original actors. Although this has not yet
happened on a widespread scale, there are companies proposing these business
models as the technology improves, so the union is monitoring this closely.
Outside entertainment,
AI-generated audio can be used in digital assistant devices, customer service,
speech assistance and countless other applications, opening new areas of work
for professional voice performers.
AI technologies have
brought historic figures back to “life” in education and museum settings,
typically with the help of an actor who provides the performance capture that
animates the digital person.
This field is full of exciting innovations, and many performers are eager to work in these new areas and potentially generate new income through their AI voice or avatar. But it’s important to understand both the technology and the pros and cons of working with it.
“Technological
innovations have historically provided incredible new opportunities for our
members,” said SAG-AFTRA Executive Vice President Ben Whitehair. “But we must,
as we always have, be deeply mindful of the associated risks, and ensure that
our digital performances and likeness are protected.”
Know What You Are Agreeing To
Anytime you grant
rights to use your name, image, likeness and voice, you should have a clear
contract in place governing the use. This is even more important in the digital context. But even on traditional entertainment projects, performers are often
asked to grant rights to use their voice, likeness, and performance well beyond
what is necessary for the specific project.
You might have heard
about a lawsuit filed by a Canadian voice actor against the company behind
TikTok. The performer had done voice work for a Scotland-based company, but the
voice files were allegedly used without her consent in the popular app. The
case illustrates the risks for actors, particularly when working without the
protection of the union behind you. Being branded as an app’s voice and being
involuntarily associated with content that you cannot control can impact your
image and ability to attract other voiceover work.
The nonunion AI
contracts SAG-AFTRA has seen have very one-sided terms and are often with
companies based in foreign countries. Many of these contracts give broad rights
to use your likeness or voice irrevocably and in perpetuity — this means they
have those rights forever and you cannot cancel the permission. There typically
is no case-by-case approval over how your digital self is used, and no form of
residuals or use-based payments no matter how long or widespread the use is.
“There are contract
templates floating around that AI companies and industry players claim are
performer-friendly, but a close look at the terms reveals a lack of crucial
protections,” said Senior Assistant General Counsel, Compliance and Contracts
Danielle Van Lier [see a related article by Van Lier below].
Rights of publicity
— the laws that protect your name, voice, image and likeness — can potentially
help against unauthorized uses of your digital self. However, neither these laws nor others provide a remedy once you have signed a contract granting away those rights. Without
a union contract covering your work, your only potential recourse is costly
litigation.
The entertainment
and media industry is always evolving, and as technology advances, it seems
that the pace of change is ever-increasing. SAG-AFTRA members and staff are
continually working with tech firms, attending conferences and staying up to
date on all the latest information in this emerging field to ensure members are
protected.
“Artificial
intelligence is opening new frontiers in digital manipulation, and while it is
new territory, it doesn’t have to be scary, as long as we stay informed about
the potential hazards,” said SAG-AFTRA President Fran Drescher. “As we move
into this bold new future together, your union will be standing by your side.”
* * * *
SAG-AFTRA and AI
SAG-AFTRA has been
working on issues relating to digital avatars and voices since long before AI
was being used to create them.
The union has several staff members with considerable experience and expertise
on name, image and likeness rights, as well as on AI technology and its
applications, deepfakes and other related topics. They have presented to a
global audience on these subjects, hosted panels and discussions with experts
in the field, and have written and been interviewed for numerous articles
related to AI. SAG-AFTRA also participates in a multi-union workgroup on AI
with British Equity and ACTRA [Alliance of Canadian Cinema, Television and
Radio Artists].
Anticipating the
rise of digital replicas, SAG-AFTRA has added or negotiated language into many
of its promulgated and collectively bargained agreements prohibiting the
creation or use of digital replicas without both the union’s and the
performer’s consent, including those covering audiobooks, video games,
podcasts, commercials and corporate/educational content.
“SAG-AFTRA has been advocating for rights of publicity and
name, image and likeness rights for decades. This includes supporting critical
legislation as well as writing amicus briefs in cases that could impact how
laws relating to these rights are interpreted,” said SAG-AFTRA General Counsel
Jeff Bennett. SAG-AFTRA drafted, and was instrumental in the passage of, California’s and New York’s civil laws against unauthorized digital nudity, as well as New York’s new right of publicity law, which includes prohibitions on using digital avatars of deceased performers.
For several years, SAG-AFTRA has been in conversation with AI technology companies
about the ethical use of this technology and the fair compensation and
protection of performers who allow their voice, image or performance to be used
in the development and use of an AI voice or avatar.
“SAG-AFTRA is
committed to ensuring that our members’ rights are safeguarded and that they
are paid what they deserve whenever their work is exploited, regardless of the
technology employed or the nature of the exhibition platform,” said Senior
Director, Strategic Initiatives Sue-Anne Morrow.
All SAG-AFTRA
contracts for work in the AI space include these critical terms:
• Safe storage of the performer’s voice,
likeness and performance, and the products and content created from them.
• The right to consent — or not consent — to
uses.
• Explicit limitation on use of the content.
• Appropriate payment for use of the content.
• Any exclusivity must be clearly noted and
fairly compensated.
• The right for a performer to control or opt
out of continued use and production.
* * * *
What You Can Do
SAG-AFTRA is
actively engaging in discussions with companies creating AI content, and has
crafted contracts that are relevant to this new work area, are easy to use and
provide protections for both the performer and the employer. But your help is
critical in establishing a strong foothold in these emerging spaces.
Don’t work off the card: Many AI companies have websites through
which anyone can submit their voice or likeness and sign up to be an AI
“spokesperson.” Working in this space without a SAG-AFTRA contract is not only
a dangerous move for a professional performer, it also impedes the union’s
efforts to set fair terms and protections. [“Off the card” refers to the union
membership card.]
Let your union help: If you are approached to do this kind of
work, ask your employer to consider hiring you under a union contract.
SAG-AFTRA staff is happy to talk to them and make the process of becoming a
signatory simple and easy.
Talk to your peers and students: Let your peers know that their best
protection, when working with AI technology, is a union contract. If you teach
classes to or mentor actors who are early in their careers, warn them of the
risks discussed in this article.
Communicate with your representatives: Ask your professional representatives if
they are current on the technology and understand the risks associated with it,
and let them know that you aren’t interested in venturing into this space
without your union behind you.
* * * *
[The article above from SAG-AFTRA magazine included a note at the end that SAG-AFTRA
Senior Assistant General Counsel, Contracts and Compliance Danielle Van Lier
provided guidance for lawyers who are representing performers in connection
with AI-generated content in the May 2022 issue of Los Angeles Lawyer
magazine. I’ve decided to append that
article to this Rick On Theater post.]
Practice Tips:
PROTECTING ARTISTS’ RIGHTS IN THE AGE OF AI
by Danielle S. Van Lier
[Van Lier’s article
appeared in the May 2022 issue of Los
Angeles Lawyer (45.3), the magazine of the Los Angeles County Bar
Association.]
Two years ago, in
this magazine, another article was published about the rise of deepfakes and
their potential for abuse. At that time, California had recently enacted Civil
Code Section 1708.86, which created a civil cause of action for individuals
who, without consent, are digitally depicted as “giving a performance they did
not actually perform” in “any portion of an audiovisual work that shows the
depicted individual performing in the nude [as defined] or appearing to engage
in, or being subjected to, sexual conduct.” The article discussed the potential
harm to, among others, performers when they are involuntarily depicted in the
nude or as engaging in sexual conduct without their consent. A lot has changed
in that short time.
“Deepfake,” a
portmanteau [a word formed by combining parts of existing words] of “deep learning” and “fake,” has become the most prevalent
term used to describe audiovisual and audio content created using artificial
intelligence (AI). These videos typically depict someone doing something they
did not do, or saying something they did not say. The term originated in connection
with nonconsensual pornography and, from its onset, some have questioned
whether the term is overused and its definition too amorphous. For purposes of this article, “deepfake” refers only to nonconsensual content or content created with the intent to deceive, while “AI-generated” refers to content created consensually.
Deepfakes remain a
problem and a threat. No matter how seemingly benign, nonconsensual deepfakes
can harm the individual whose voice and/or likeness is used. These concerns go
beyond nonconsensual sexual content to uses in a commercial or even creative setting.
Such uses can harm the depicted individual’s reputation, mislead viewers and consumers, or foreclose job opportunities.
For all the
potential harm deepfakes cause, the underlying technology has many legitimate
use cases. However, even authorized use can go too far if one is not careful in
the contract process. SAG-AFTRA has been closely watching the development of
these new technologies and the agreements being used in the space. A number of
issues have surfaced in this review, resulting in a set of questions to ask if a client is approached to work on an AI project.
Growth of AI-generated Content
There has been an
explosion in the number of AI technologies, like the ones used to create
deepfakes, and the quality of AI-generated content has improved exponentially.
Companies offering AI-generated people and audio have proliferated in recent
years.
The technology has
potential for positive applications. In the entertainment industry, for
example, it can give independent producers with lower budgets some of the same
capabilities as the major motion picture studios. It can help match an actor’s
mouth and facial movements to dialogue in foreign-dubbed films, or even to dub
the films themselves. In a recent episode of The Book of Boba Fett [streaming
Star Wars series on Disney+, 2021-22], the effects team reportedly
used AI to de-age a character quite effectively. This came on the heels of
criticism over poor de-aging effects using more traditional techniques in The
Mandalorian [source series of Boba Fett, Disney+, 2019-present].
Outside the
entertainment industry, AI-generated audio can be used in digital assistant
devices or to allow those who have lost their ability to speak to communicate
in their own voice. It can even bring historic figures “back to life.”
In the last few
years, the technology has advanced from requiring hours of processing time,
hundreds of photographs or video samples, and a computer with reasonably
advanced graphics capabilities, to something that can be created on your cell
phone with a single selfie. Last August, reports surfaced about a tool that can
even create deepfakes in real-time for streaming video. It’s likely that in the
short time between the writing and publication of this article, additional
technologies will be released.
Consent, consent, consent
Consent is
particularly important in the context of digital humans, both in the original
creation and any subsequent uses. The use cases described above require the
consent of the person depicted, particularly in the case of professional
performers, whose likenesses and voices are key to their livelihood. Civil Code
Section 1708.86 provides a civil cause of action for non-consensual use of
AI-generated sexual content. However, what about non-sexual content?
Last May, a Canadian
voice actor sued ByteDance, Inc., the company behind TikTok, alleging that the
company had used her voice in its text-to-speech tool. According to the
complaint, Beverly Standing had performed voice work for a Scotland-based
company “purportedly for Chinese translations,” and apparently without a
written contract. The voice files were allegedly obtained by ByteDance and used
for TikTok’s text-to-speech feature. Standing expressed concern that being
branded as TikTok’s female voice, and being involuntarily associated with
content she felt could reflect poorly on her image, would impact her ability to
attract voiceover work, particularly for commercials, because she already would
be associated with TikTok.
Standing sued on
multiple theories relating to the unauthorized use of her voice, including
violation of her right of publicity, false endorsement under the Lanham Act,
and multiple other claims relating to unfair competition. There is precedent
holding that the unauthorized use of an individual’s voice and/or digital
likeness gives rise to at least a right of publicity claim, although the false
endorsement and unfair competition claims might be more difficult in
noncommercial uses.
Although this case
ultimately settled, it illustrates some of the risks for both actors and
content creators that are inherent in this space. More importantly, it illustrates the significance of clear contracts: why creators must obtain consent and why performers need to carefully consider the scope of that consent.
Beware Granting Rights
SAG-AFTRA has received an increasing number of reports and inquiries from performers, agents, attorneys, and even other unions regarding terms they are seeing in contracts for digital scanning and audio recording for AI content. Even on
traditional entertainment projects, performers are being asked to sign blanket
releases granting rights to their voice, likeness, and performance well beyond
the scope of what is necessary for the specific project.
Attorneys and others
representing performers, models, or any other person for audio or performance
capture should pay particularly close attention to these contract terms. This
is as true if the contract is for traditional entertainment projects utilizing
digital scanning techniques as it is for work specifically in the AI space.
Many of the
companies currently working in the AI voice and video space are offering
standard form contracts with very one-sided terms, often governed under the law
of the foreign country in which the company is based. The contracts tend to
have blanket grants of rights to use the performer or model’s likeness and
voice irrevocably and in perpetuity. There is no approval right and no form of
residuals or use-based payments. Some ask that the performer or model indemnify
the company. Furthermore, some include vaguely worded morals clauses.
Following are some
questions to consider when reviewing clients’ contracts (both old and new) in
light of these new innovations. Many of these questions seem innocuous or
obvious, but they take on new meaning in this evolving space.
• What rights has the client granted, or is the client being asked to grant, with regard to the use of the client’s voice or likeness? Does the grant of rights allow use beyond the
current project? Do the intellectual property rights in the character allow use
of voice or likeness in subsequent works?
It has always been
important to have clear rights grants, including appropriate fences around use,
in contracts that grant rights in a performer’s voice or likeness. The Standing
case illustrates how much more important it is in the context of work using AI.
It was largely the lack of a contract covering these terms that gave rise to
multiple claims; an over-broad rights grant might have precluded them.
Actor contracts
typically have a provision allowing the use of the performer’s likeness in
character for purposes of merchandising. As expected, the producer retains all
rights in the character. However, the increasing use of technologies that allow
the creation of characters that are wholly digital, or even digitally enhanced,
means attorneys need to pay closer attention to how likeness and voice rights
grants are drafted, both in connection with the character and in merchandising.
The scope of any
provision granting rights in a client’s voice or likeness, especially those
relating to merchandising, is all the more important with the rapid growth of
the metaverse and associated technologies. As the entertainment industry starts
to experiment with nonfungible tokens—or, in common parlance, NFTs—attorneys
should be sure their clients are protected from exploitation and have the right
to control their likenesses in these new spaces.
These questions are
of particular importance if the performer is being digitally scanned or
providing likeness or voice for AI use, when digital assets are being created
and will be owned and controlled by the producer. To the extent possible,
counsel should seek to limit usage of the assets only to the current project,
so they cannot simply be reused and repurposed without appropriate
compensation.
Of note, if the work
is done under a SAG-AFTRA agreement, SAG-AFTRA takes the position that any
reuse of scanned content created for one project and used in a subsequent
production falls within the reuse provision of the applicable agreement.
SAG-AFTRA also aggressively objects to scanning contracts and other likeness
and voice grants that extend beyond the individual project for which they are
intended; with limited exceptions, the union’s agreements do not allow the producer to obtain reuse consent at the time the work is done, and reuse must be separately bargained. These are important protections for performers, ensuring
they are fairly compensated for their work and can control how their voice and
likeness are used in the future.
• If a client is being scanned for motion or facial reference, such as for animation or video games, does the contract allow use beyond reference? For example, can the
client’s likeness be used in the end product?
Actors, particularly
those doing voiceover, are sometimes asked to do facial or other performance
capture, ostensibly so that animators working on the project can capture their
movements for character reference. Nevertheless, actors have reported that the
use goes beyond simple reference to use of their likeness in connection with
their characters. Performance capture technologies that are AI-driven can now
do this by simply recording a client’s performance as it is delivered, without
complicated rigging. Even if the employment contract is silent as to this
point, counsel should look to ensure that a client is protected from this
practice.
• Does the
contract allow digital manipulation of a client, whether through AI or
otherwise? Can the footage itself be manipulated even if the client cannot be?
Digital manipulation
goes beyond aging or de-aging a character, to things like depicting the person
in a scene in which he or she did not perform, possibly even in a way the
person would have objected to had he or she been present. California Civil Code
Section 1708.86 provides a cause of action if that depiction involves nudity or sexual conduct, but it would not apply to other manipulations. In the
context of commercials and ads, in particular, if the producer has the right to
alter a client or the footage, it opens up the possibility of creating completely
new ads, potentially for a different product, without the client’s consent.
• Is there
security around the audio and video recordings and the associated data? Does the company have provisions in place to prevent unauthorized use or access? How is the
content protected? What are the steps if there is a data breach?
Both the recordings
and digital files of a client should be protected from unauthorized use and
access, in the same manner as any personal information. Not only can an unauthorized use of a client’s voice and likeness harm the client’s brand or earning potential, but in the hands of a malicious actor, the high-quality digital
content can be used to create sexually explicit content, commit fraud, or
spread disinformation. It is also worth determining if the content will contain
any embedded technology that the producer can use to track content and if the
producer will similarly assist the client in the event of unauthorized
exploitation.
• If the
project is an AI project, what control does a client have to approve or deny
subsequent uses? Is the client comfortable with the fact that his or her voice
could be used in an advertisement or other content for something the client
might oppose?
These questions are
particularly important for actors who work on commercials or who have other
brand affiliations. Exclusivity is a critical component of many brand deals—an
actor, spokesperson, influencer, or any other recognizable person cannot
simultaneously be associated with, let alone be the face and/or voice of, two
potentially conflicting brands. If a client lends his or her likeness or voice
to a company that provides AI spokespeople or characters, without any approval
rights or constraints on usage, the client risks foreclosing entire segments of
future work.
Many of these platforms may engage talent to do this work early in their careers, often before they have professional representation. They might be excited to work in this new space and not yet understand how this could
impact their future job opportunities and earning capabilities, particularly if
they are hoping to do commercials or have brand deals. The perpetual grant of
rights, coupled with lack of control, is a significant risk for any performer
but particularly for the performer just getting started.
• If the
company does not allow project-by-project consent, does it at least have an
ethics policy relating to how a client’s likeness or voice will be used? What
rights does the client have if that policy changes?
Many of the
companies developing these AI technologies come from the technology sector,
rather than the entertainment sector, and their contracts lack the types of protections or approval rights actors have come to expect; many offer no approval rights at all. If the company does not offer approval rights or will not negotiate them, it is
important to determine if it at least has an ethics policy regarding how the
content might be used and what rights a client might have to remove content
from the company’s site should that policy change. While an ethics policy will
not give a client control over his or her likeness or voice, it at least will
give some reassurance as to how the client will be depicted.
• What
indemnification has the client agreed to? What is the client receiving? Are
there content carve-outs?
Unlike in most acting work, a client likely will not have a copy of the script that will be used for
a digital counterpart, and therefore will not be able to object to troubling,
offensive, or even legally problematic content. One way to address this is to
include carve-outs or consent requirements for certain types of content, such
as profanity, sexual content, religious content, or the endorsement or advocacy
for political positions or candidates. This can be coupled with indemnification
for, at a minimum, content that is defamatory, casts the client in a false
light, or is otherwise unlawful. At the same time, the indemnity the client is
granting should be narrowly tailored and not extend to the content.
• Is the
client a SAG-AFTRA member? If so, is the project signatory to a SAG-AFTRA
agreement?
Voiceover and
recorded performances in many of these new and evolving areas are within
SAG-AFTRA’s jurisdiction, and SAG-AFTRA has been actively working to ensure its
contracts keep up with technology. For the media professionals SAG-AFTRA
represents, it is important to understand the implications of working on a
non-union project in these areas. It not only risks running afoul of
SAG-AFTRA’s Global Rule One but also means not having the union’s protections
and support in the event of a dispute. SAG-AFTRA’s contracts in these areas
recognize that these are evolving technologies and business models and have a
degree of flexibility while still providing the types of minimum protections
that have been discussed.
There is a rush to
work in this exciting new space, but performers and models need to stop and
think about the short- and long-term implications and risks that go with it. Are the upfront payment and excitement worth the potential long-term risk of
overexposure or being associated with a product, company, or cause they do not
support? Attorneys representing talent similarly need to be aware and keep apprised
of the changing technology to properly advise their clients. As illustrated by
Beverly Standing’s case, if a client does not maintain control over how his or
her voice and likeness will be used, there is a risk they may be used in ways the client least expects.
[Danielle S. Van Lier is
assistant general counsel for intellectual property and contracts at SAG-AFTRA
in Los Angeles.]