UK video game voice actors' union proposes new minimum rates and protections against AI and abusive NDAs


"Pay for UK performers has stagnated despite games being a multibillion-dollar industry"

A logo for voice actor union Equity, showing a composite image of a hand, a flag and a microphone. Image credit: Equity

UK arts and entertainment union Equity have unveiled a raft of "best practice guidelines" for video game developers hiring voice actors, including suggested minimum rates designed to address "systemic low pay" for performers. Other measures aim to improve voice actors' working conditions and to stop companies using their voices and likenesses as fuel for generative AI tools without their consent. It's both a praiseworthy endeavour and an interesting breakdown of the voice actor's trade.

"The videogames industry is a dynamic place to work but unethical practices undermine the profession," Equity explain on their official site. "Pay for UK performers has stagnated despite games being a multibillion-dollar industry with almost £200 million in tax breaks. Performers don't have the protections they need in the unregulated world of AI, the misuse of NDAs is common and health and safety is often lacking."

The best practice guidelines span fair pay, the need for voice actor consent when creating digital replicas or "training" AI software, provisions for safer workplaces (including guidance about intimacy coordinators and vocal cord stress), and stipulations about "clear, legal, enforceable, fair, proportionate and targeted" non-disclosure agreements.

As regards NDAs, Equity argue: "At the moment, many of the NDA agreements we see reflect poorly upon the industry, often intimidating and isolating Performers from those that support them. Performers are forced to sign documents which they have no hope of understanding or amending.

"The result is that Performers often assume that they will be sued if they tell anyone anything about the production, even where they have been victims of or witnesses to a criminal offence," the page continues. "This is all the more shocking five years since the Harvey Weinstein scandal highlighted the appalling misuse of NDAs, shook the film industry, and ignited the #MeToo movement. By now, we would have expected Engagers to be following best practice and not repeat past mistakes."

The suggested minimum rates, meanwhile, offer some snippets of info about how specific types of voice-acting and motion capture are valued within video game projects. For example, they distinguish between voice lines that form part of the main plot and "atmospheric" or "world-building" voices - "no more than 300 scripted/recorded lines that do not further the story".

There are also "Walla voices", defined as "unscripted voices not assigned to a specific character, that do not further the story". Examples of walla voicework include crowd chatter and creature noises - the term apparently dates back to the early days of radio broadcasting, when certain shows found that having groups repeat "walla walla" produced an effective impression of organic background hubbub.

I like hearing about production niceties such as these. Now that I've learned the term, I find myself surprisingly laden with memories of walla voicework. There's a backing track of anomalous battlefield mitherings you hear during loading breaks for Total War: Warhammer 3, for example. I had that stuck in my head for a while, albeit partly because my review PC at the time was a sclerotic antique, and it sometimes took five minutes to load a battle.

As Eurogamer's Ed Nightingale reminds us, Equity's best practice guidelines come amid strike action from US actors' union SAG-AFTRA. In July, SAG-AFTRA spokespeople commented that "AI protections remain the sticking point" for the strike, following disagreements between the union and members about deals with individual companies that use the tech. "Eighteen months of negotiations have shown us that our employers are not interested in fair, reasonable AI protections, but rather flagrant exploitation," Interactive Media Agreement Negotiating Committee chair Sarah Elmaleh commented at the time.

"Atmospheric" and "walla" voice-acting, Equity observe in their new guidelines, are "one of the segments of our industry most threatened by AI", presumably because the uncanniness of AI-generated speech is harder to detect when it's buried in a crowd. While not participating in the SAG-AFTRA strike for legal reasons, Equity have plenty of advice and requests for developers who are thinking about making use of generative AI.

Amongst other things, they suggest that companies confirm in advance whether "the data recorded within the stipulated performance sessions will be used for the stated project only and not re-used in future titles" and that "a pre-purchase / integration fee should be paid when a developer, studio or publisher wishes to hold performance data in their 'library' for potential re-use on future projects".

If you found all this intriguing, I encourage you to read the full text over on Equity's site - this is just a scraping of its surface. You might also be interested in Ken Levine's thoughts on "turn-based dialogue". If you'd like to read more about the broad implications of generative AI technology, Mike Cook wrote a whole essay series on it earlier this year.
