Will Scarlett Johansson sue OpenAI for creating a voice assistant that sounds like the actor’s performance in the 2013 film “Her,” about a man who falls in love with an artificial intelligence?
That’s how things could go after Johansson said OpenAI tried to hire her to voice an AI assistant for ChatGPT and, when she refused, forged ahead with a sound-alike voice. OpenAI’s co-founder and CEO, Sam Altman, could be directly in the crosshairs of such a lawsuit.
Now, legal experts say Johansson may have a powerful and credible claim in court if she does decide to sue, pointing to a long string of past cases suggesting a lawsuit could lead to significant damages for one of the world’s leading AI companies and raise questions about the industry’s readiness to deal with AI’s many messy complications.
That OpenAI was apparently unaware of that legal history, or at worst willfully ignorant of it, highlights what some critics say is a lack of industry oversight in AI and a need for greater protections for creators.
OpenAI didn’t immediately respond to a request for comment.
OpenAI’s legal risk
There are two types of law that could potentially be involved here, according to legal experts, but only one is likely to come into play based on the currently known facts.
The first is copyright law. If OpenAI had directly sampled Johansson’s films or other published works to create Sky, the flirty voice assistant demoed in an update to ChatGPT, then OpenAI might face a copyright problem if the company didn’t obtain permission beforehand.
That doesn’t appear to be the case, at least based on OpenAI’s past statements. The company said in a blog post Sunday that it did not use Johansson’s actual voice, but rather “a different professional actress using her own natural speaking voice.”
While that might be enough to deflect a copyright claim, it almost certainly wouldn’t insulate OpenAI from the second type of law at issue, according to Tiffany Li, a law professor focused on intellectual property and technology at the University of San Francisco.
“It doesn’t matter if OpenAI used any of Scarlett Johansson’s actual voice samples,” Li posted on Threads. “She still has a viable right of publicity case here.”
How publicity rights laws work
Multiple states have right-of-publicity laws that protect individuals’ likenesses from being stolen or misused, and California’s — where Hollywood and OpenAI are based — is among the strongest.
The California law prohibits the unauthorized use of anyone’s “name, voice, signature, photograph, or likeness” for the purposes of “advertising or selling, or soliciting purchases of, products, merchandise, goods or services.”
Unlike a copyright claim, which is about intellectual property, a right-of-publicity claim is more about the unauthorized use of a person’s identity or public persona for profit. Here, Johansson could accuse OpenAI of illegally monetizing who she is by essentially fooling users into thinking she had voiced Sky.
One defense OpenAI could mount is that its now-viral videos showcasing Sky’s capabilities weren’t technically made as advertisements or meant to drive purchases, said John Bergmayer, legal director at Public Knowledge, a consumer advocacy group. But, he added, it could be a rather thin argument.
“I believe use in a highly hyped promo video or presentation easily meets that test,” he said.
In addition to saying it never used Johansson’s actual voice and that its videos weren’t advertisements, OpenAI could also say it never intended to precisely mimic Johansson. But there’s substantial case law — and one very inconvenient fact for OpenAI — undercutting that defense, legal experts say.
A Bette Midler precedent
There are roughly a half-dozen cases in this space that show how OpenAI may land in hot water. Here are two of the biggest ones.
In 1988, the singer Bette Midler won a lawsuit against Ford Motor Company over an advertisement featuring what sounded like her voice. In fact, the song in the ad had been recorded by one of Midler’s backup singers after Midler turned down the opportunity to record the ad. The similarities between the reproduction and the original were so striking that some people told Midler they believed she had performed in the commercial.
The US Court of Appeals for the 9th Circuit ruled in Midler’s favor.
“Why did the defendants ask Midler to sing if her voice was not of value to them?” the court wrote in its decision. “Why did they studiously acquire the services of a sound-alike and instruct her to imitate Midler if Midler’s voice was not of value to them? What they sought was an attribute of Midler’s identity. Its value was what the market would have paid for Midler to have sung the commercial in person.”
In a similar case decided by the 9th Circuit in 1992, the singer Tom Waits won $2.6 million in damages against the snack food maker Frito-Lay over a Doritos ad that featured an imitation of Waits’ signature raspy voice. The court in that case reaffirmed its reasoning in Midler, further enshrining the idea that California’s right-of-publicity law protects a person’s voice.
The situation involving Johansson and OpenAI bears a remarkable resemblance to these prior cases. According to Johansson, OpenAI approached her to perform as Sky; Johansson declined. Then, months later, OpenAI released a version of Sky that was widely compared to Johansson, to the point that Johansson said her “closest friends … could not tell the difference.”
Whether OpenAI can survive a potential publicity rights claim may hinge on intent — that is, whether the company can prove it did not set out to imitate Johansson’s voice, said James Grimmelmann, a law professor at Cornell University.
In its Sunday blog post, OpenAI said that Sky was “not an imitation of Scarlett Johansson” but that with each of its AI voices, the company’s goal was simply to create “an approachable voice that inspires trust,” one that contains a “rich tone” and is “natural and easy to listen to.”
On Monday evening, Altman responded to Johansson’s statement with one of his own, claiming that the company “cast the voice actor behind Sky’s voice before any outreach to Ms. Johansson” and apologizing for not communicating better.
But OpenAI may have already undermined itself.
“OpenAI might have had a plausible case if they hadn’t spent the last two weeks hinting to everyone that they had just created Samantha from ‘Her,’” Grimmelmann said, referring to Johansson’s character from the 2013 film. “There was widespread public recognition that Sky was Samantha, and intentionally so.”
The widespread parallels users were drawing with Johansson were reinforced when Altman posted to X on the day of the product’s announcement: “her.” Johansson’s statement said Altman used this post to insinuate “the similarity was intentional.” As recently as last fall, Altman was telling audiences that “Her” was not only “incredibly prophetic” but also his own favorite science-fiction film.
Taken together, those facts suggest OpenAI may have wanted users to implicitly associate Sky with Johansson in ways that California’s law is interpreted to prevent.
Altman’s post was “incredibly unwise,” Bergmayer said. “Given the facts here — the negotiations, the tweet — even if OpenAI was using an actress who just happens to sound like Johansson, I think there’s still a strong chance they’d be liable.”
Lost in deepfake translation
The situation involving Johansson is a high-profile example of what can go wrong in the age of deepfakes and AI. While California’s publicity law protects all individuals, some state statutes only protect famous people, and not all states have such legislation on the books.
What’s more, those existing laws may protect a person’s image or even voice but may not cover some of the things you can now do with AI, such as asking a model to recreate art “in the style” of a famous artist.
“This situation does show why we need a federal right of publicity law, since not every case will conveniently involve California,” Bergmayer said.
Some tech companies have gotten involved. Adobe, the maker of Photoshop, has pushed a proposal it is calling the FAIR Act to create a federal right against impersonation by AI. The company argues that while it is in the business of selling AI tools as part of its creative software, it has a vested interest in ensuring its customers can continue to reap the rewards of their own work.
“The worry you have as a creator is that AI is going to displace their economic livelihood because it’s training on their work,” said Dana Rao, Adobe’s general counsel and chief trust officer. “That’s the existential angst that you’re feeling out there in the community. And what we’re saying at Adobe is that we’re always going to provide the world’s greatest technology to our creators [but that] we do believe in responsible innovation.”
Some US lawmakers are working on proposals to address the issue. Last year, a bipartisan group of senators unveiled a discussion draft of the NO FAKES Act, a bill intended to protect creators. Another draft bill, in the House, is known as the No AI Fraud Act.
But digital rights groups and academics have warned that the legislation is far from perfect, leaving gaping loopholes in some areas while also creating potential unintended consequences in others.
Questions abound about protecting free expression, such as people’s ability to use others’ likenesses for educational or other noncommercial purposes, as well as about rights to a person’s image after death, which matter for recreating dead actors in movies or music and could ultimately harm living performers, according to Jennifer Rothman, an intellectual property expert and law professor at the University of Pennsylvania.
“This opens the door for record labels to cheaply create AI-generated performances, including by dead celebrities, and exploit this lucrative option over more costly performances by living humans,” Rothman wrote in a blog post in October on the NO FAKES Act.
The debate over publicity rights in Congress is part of a much broader effort by lawmakers to wrestle with AI, one that isn’t likely to be resolved anytime soon, reflecting the complexity of the issues at stake.