What would a case against OpenAI look like in court?
An OpenAI spokesperson denies that the company was forced to reverse course on the ChatGPT voice because of Scarlett Johansson
On Monday, Johansson issued a statement claiming to have forced that reversal, after her lawyers demanded OpenAI clarify how the new voice was created.
The statement, relayed to WIRED by her publicist, claims that OpenAI CEO Sam Altman asked her last September to provide ChatGPT’s new voice but that she declined. She describes being astounded to see the company demo a new voice for ChatGPT last week that sounded like her anyway.
“When I heard the release demo I was shocked, angered, and in disbelief that Mr. Altman would pursue a voice that sounded so eerily similar to mine that my closest friends and news outlets could not tell the difference,” the statement reads. It notes that Altman appeared to encourage the world to connect the demo with Johansson’s performance by tweeting the single word “her” on May 13, a reference to the 2013 film in which she voiced an AI assistant.
“The voice of Sky is not Scarlett Johansson’s, and it was never intended to resemble hers,” Sam Altman said in a statement provided by OpenAI. He said the actor behind Sky’s voice was hired before the company contacted Johansson. “Out of respect for Ms. Johansson, we have paused using Sky’s voice in our products. We are sorry to Ms. Johansson that we didn’t communicate better.”
The dispute drew notice in part because OpenAI is already facing a number of lawsuits accusing it of infringing copyrights by using creative work to train its artificial intelligence models. A voice cannot be copyrighted, however, so copyright law is unlikely to play a part for Johansson. “It would be right of publicity,” says Brian L. Frye, a professor at the University of Kentucky’s College of Law who focuses on intellectual property. She wouldn’t have any other claims, he says.
Generative AI has made it much easier to create realistic synthetic voices, opening up new opportunities as well as threats. In January, voters in New Hampshire were bombarded with robocalls featuring a deepfaked voice message imitating Joe Biden. In March, OpenAI said it had developed a technology that could clone a voice from a 15-second clip but that it would not be released because of how it might be misused.
Replicating voice, name, and likeness: Purvi Patel Albers on right-of-publicity laws
Purvi Patel Albers says there are a few courses of action Johansson could take, and that case law supports her position.
Albers says that Johansson and other celebrities can invoke right-of-publicity laws, which protect a person’s identifying features from being used without their permission. “If you misappropriate someone’s name, likeness, or voice, you could be violating their right to publicity,” Albers says.
“The Ninth Circuit held that a celebrity with a distinctive voice could recover against someone who used a voice impersonator to create the impression that the celebrity had endorsed the product or was speaking in the advertisement,” Mammen says.
It’s complicated because there is no federal right-of-publicity law, and not all states have one on the books. New York has a likeness law that lets individuals control the commercial use of their signature, picture, voice, and even their name. That right extends to deceased people, whose estates must give prior consent for the use of a computer-generated replica. California, where OpenAI is headquartered, does not mention digital replicas like AI-generated voices in its law, but it does protect a living person’s voice from being used in commercial activities without consent, and it says a person’s “identity” includes their voice, face, or name.