The Challenges Hiding in Hollywood’s Landmark AI Deal

Actors have the right of publicity, also known as likeness rights, which protects them when a studio misuses their image. But when a synthetic performer embodies, say, the charisma of Denzel Washington without actually being him, does that constitute a “digital replica” under the contract, which would require consent for its use? How readily can an actor defend such abstract characteristics? With some legal basis, a studio might argue that its AI performer is simply trained on the performances of great actors, much as a large language model “digests” influential literary works to shape its own output. (Whether LLMs should be permitted to do this is itself an ongoing debate.)

“Where does that line lie between a digital replica and a derived look-alike that’s close, but not exactly a replica?” asks David Gunkel, a professor in the Department of Communications at Northern Illinois University specializing in AI in media and entertainment. “This is something that’s going to be litigated in the future, as we see lawsuits brought by various groups, as people start testing that boundary, because it’s not well defined within the terms of the contract.”

The vagueness of certain language in the contract raises further concerns. Take, for instance, the stipulation that studios are not obligated to seek consent “if they would be protected by the First Amendment (e.g., comment, criticism, scholarship, satire or parody, use in a docudrama, or historical or biographical work).” A studio could conceivably bypass consent by categorizing a use as satire and claiming the protection of the US Constitution.

Then there’s the matter of digital alterations, particularly the provision that no consent is necessary for a digital replica if “the photography or sound track remains substantially as scripted, performed and/or recorded.” This could cover changes to hair and wardrobe, as Glick notes, or, more significantly, to a gesture or facial expression. That in turn raises the question of AI’s impact on the craft of acting: Might artists and actors begin labeling performances AI-free, or mount anti-AI movements, Dogme 95-style? (These concerns echo past industry disputes over CGI.)

Performers’ precarious position leaves them open to exploitation. For an actor struggling to make ends meet, AI consent, and with it potential replication, may one day become a condition of employment. Inequality among actors is also likely to intensify: those who can afford to turn down AI projects may enjoy greater protection, while prominent actors who consent to digital recreation can “appear” in multiple projects simultaneously.

There are limits to what negotiations between guilds and studios can accomplish, as actor and director Alex Winter highlighted in a recent article for WIRED. Echoing his observations about the WGA agreement, he argues that the deal “puts a lot of trust in studios to do the right thing.” Its primary achievement, he contends, is keeping the dialogue between labor and capital alive. “It’s a step in the right direction regarding worker protection; it does shift some of the control out of the hands of the studio and into the hands of the workers who are unionized under SAG-AFTRA,” says Gunkel. “I do think, though, because it is limited to one contract for a very precise period of time, that it isn’t something we should just celebrate and be done with.”
