AI might attract hundreds of millions of users and churn out answers, images, and videos with just a few words, but the technology isn’t all rose petals — it also has a few legal thorns.
Scarlett Johansson recently hired legal counsel after ChatGPT-maker OpenAI used a voice she called “eerily similar” to her own in its latest AI chatbot. Johansson said she turned down the company’s offer to voice the same chatbot more than a year before the release and stated that she was “shocked, angered and in disbelief” when she heard OpenAI’s public demo.
After Johansson’s legal counsel sent letters to OpenAI and its CEO Sam Altman, the company paused the voice “out of respect for Ms. Johansson.”
Related: Scarlett Johansson ‘Shocked’ That OpenAI Used a Voice ‘So Eerily Similar’ to Hers After Already Telling the Company ‘No’
But does Johansson have a case? Neil Elan, a business litigation attorney and senior counsel at the Los Angeles-based law firm Stubbs Alderton & Markiles, LLP, told Entrepreneur that it would come down to several factors, including how similar the voice is and whether any authorization, even an implied one, took place.
“It would seem like there was no authorization, but potentially there may be a case of implied authorization,” Elan said.
Elan, whose areas of expertise include copyright, trademark, and publicity cases, noted that the back-and-forth communication between the parties isn't publicly known.
“Ultimately it comes down to how similar is the work and what was the process that went into it,” Elan said of intellectual property cases related to AI.
“If I can’t plagiarize a famous speech and take credit for it, AI can’t either,” he said.
How OpenAI created the AI voice could also help determine whether or not there is a legal case.
OpenAI has already stated that the voice belongs to a different professional voice actress, not Johansson — but that might not make a difference.
“Even if someone else’s voice is used, the output is a voice like Scarlett Johansson’s,” Elan said. “Why does it sound so similar?”
Related: Emory University Funds, Suspends Student Over AI Tool: Lawsuit
Johansson's dispute with OpenAI isn't the first legal challenge the company has faced. Authors including Paul Tremblay and Sarah Silverman allege that their books were part of datasets used to train AI without their consent.
The New York Times sued OpenAI in December over copyright infringement, and other news organizations like The Intercept have followed suit.
More than 200 musicians signed a letter last month about AI's "predatory" and "catastrophic" use in the music industry. Over 15,000 authors signed a statement last year asking the CEOs of major AI companies, including OpenAI, Google, Microsoft, Meta, and IBM, to credit and compensate writers before training AI with their work.
The question of where big AI firms get their training data has also been at the forefront of the AI conversation, with an April report revealing that cutting-edge text-to-video AI models may have been trained on YouTube videos without creators being aware of it.
Related: OpenAI Reportedly Used More Than a Million Hours of YouTube Videos to Train Its Latest AI Model
So does that mean non-famous creators are out of luck when it comes to unauthorized use of their voice or likeness? Not exactly, but the commercial image or voice of a non-celebrity doesn't carry the same value as that of a public figure, Elan said.
While companies can't use someone's voice without consent, there probably wouldn't be a strong case for monetary damages if unauthorized use did happen.
“The monetary award might not justify a case like that,” Elan said, adding that people still “have the right to protect their likeness.”