Applying synthetic voices to books also brings new business and cultural challenges. “Most of the companies developing these AI technologies come from the technology sector, rather than the entertainment sector,” says SAG-AFTRA’s Love. “They lack the relationships, history of protections, and reliance on approval rights voice actors have come to expect.”
Several authors told WIRED that Findaway has emerged as a reliable distributor, offering lucrative deals to list audiobooks across several platforms. But they also say that Findaway frequently prompts people to agree to updated agreements, usually with minor changes, when they log in to their accounts. The company added the machine learning clause to its distribution agreements in 2019.
Many suspect they signed off on the machine learning clause without realizing it. “It’s on me for not initially noticing the addition and what it fully meant,” says Laura VanArendonk Baugh, an author based in Indianapolis, Indiana. “But the placement was kinda sneaky, too.”
Matthew Sag, a law and AI professor at the Emory University Law School, in Atlanta, says Spotify and Apple are probably legally in the clear unless a narrator explicitly prohibited such use of their audio in their contract with an author, or unless Apple produced a like-for-like AI clone of their voice. “In terms of copyright law,” he says, “the voice actors have almost inevitably assigned all their copyright to the studio or publisher that made the initial recording.”
Morally, authors and narrators feel it’s a different story. Jon Stine, executive director of the Open Voice Network (OVON), a nonprofit from the Linux Foundation developing ethical guidelines for conversational AI, says Findaway has breached several ethical principles by not seeking narrators’ consent or enabling proper compensation for the owner of a voice.
Some use of synthetic voices is inevitable, he says, and the only way narrators can protect their art is with contracts that clearly spell out “usage rights and compensation.” To help actors navigate relationships with synthetic voice firms, Stine added, OVON has developed a standard contract along those lines.
While NAVA’s Friedlander agrees that use of synthetic voices is, in general, “not an inherently bad thing” and can help lesser-known authors get their work narrated, he warns that it could damage the livelihoods of lesser-known voice actors. For this technology to advance ethically, Friedlander says, legislation is needed to prevent “unauthorized sharing and synthesization of voices.”
Until that happens, Lillian Rachel, a voice actor who deleted her Findaway Voices profile after learning about the machine learning clause, is placing her hope in listeners’ faith in the human-to-human connection.
“A good actor does more than just read the story. They imbue it with core emotions and bring out the subtext, elevating the written words with empathy and nuance,” Rachel says. “We bring the human lived experience to each story in a way that cannot be replicated.”
Updated 02/14/2023, 1:10 pm EST: The headline and subheadline have been updated to better reflect the nature of the dispute.