Actors Seek Control Over Body Scan Data Amid AI Concerns
Actors should have as much control over the data harvested from scans of their body as they do over nudity scenes, the actor Olivia Williams has said, amid heightened concern over artificial intelligence’s impact on performers.
The star of Dune: Prophecy and The Crown said she and other actors were regularly pressed to have their bodies scanned by banks of cameras while on set, with few guarantees about how the data would be used or where it would end up.
“A reasonable request would be to follow the precedent of the ‘nudity rider’,” she said. “This footage can only be used in the action of that scene. It cannot be used in any other context at all, and when the scene has been edited it must be deleted on all formats.”
Williams pointed to vague clauses in contracts that appeared to give studios wide-ranging rights over a performer’s likeness “on all platforms now existing or yet to be devised throughout the universe in perpetuity”.
A renewed debate over the impact of artificial intelligence on actors has been prompted by widespread condemnation of the creation of an AI actor known as Tilly Norwood. Actors fear the data could be used to train AI models on their likenesses or poses, paving the way for the technology to eventually take away work.
Lead and supporting actors, as well as stunt performers and dancers, have told the Guardian they have been “ambushed” into undertaking the body scans while on set. Several said they had no time to agree how the data produced would be treated, or whether it could be used to train AI models.
Williams said she had tried and failed to have wide-ranging clauses removed from her contracts. She also investigated how to own her own body scan data so she could license it for limited use, but lawyers advised her the law was too unclear, and the legal fees for her attempts to reclaim her data proved prohibitively high.
“I don’t necessarily want to be paid any more money for the use of my likeness,” she said. “I just don’t want my likeness to appear in places where I haven’t been, doing things I haven’t done, saying things I haven’t said.
“They make up the law as they go along and no one is stopping them – creating a precedent, reinforcing the precedent. I sign it, because if I don’t, I lose the job.”
Williams said she was speaking out for the sake of young actors who faced little choice but to go through the scans, with few guarantees about what would happen to the data. “I have known a 17-year-old who was persuaded into a scanner – and like the child-catcher scene in Chitty Chitty Bang Bang, she obliged,” she said. “She was a minor, so her chaperone had to give consent. Her chaperone was her grandmother, unaware of the law.”
The issue is the subject of talks between Equity, the UK performing arts union, and Pact, the UK screen sector’s trade body. “We’re demanding that AI protections are mainstreamed in the major film and TV agreements to put consent and transparency at the heart of scanning on set,” said Paul W Fleming, Equity’s general secretary.
“It is within the industry’s reach to implement basic minimum standards which would be a gamechanger for performers and artists working in UK TV and film.”
Pact said in a statement: “Producers are well aware of their obligations under data protection law and these issues are being considered as part of the collective negotiations between Pact and Equity. As the negotiations are ongoing, we cannot comment in any detail.”