Scarlett Johansson Threatens Legal Action Over OpenAI Voice
Summary
Actress Scarlett Johansson publicly accused OpenAI of creating a ChatGPT voice ("Sky") that deliberately mimicked her voice after she had twice declined Sam Altman's personal requests to voice the system. The controversy highlighted issues of consent, likeness rights, and the power dynamics between AI companies and the individuals whose work and identity they seek to use.
What Happened
On May 20, 2024, Scarlett Johansson released a statement revealing that OpenAI CEO Sam Altman had personally approached her in September 2023, asking her to provide the voice for ChatGPT's spoken interface. Johansson declined. Altman reached out again just two days before the GPT-4o launch, asking her to reconsider; before she could respond, the launch proceeded with a voice called "Sky" that many observers noted sounded strikingly similar to her.
The connection was reinforced by Altman's post on X consisting of a single word — "her" — a reference to the 2013 Spike Jonze film in which Johansson voices an AI assistant named Samantha. The post was later deleted.
Johansson said she was "shocked, angered, and in disbelief" and retained legal counsel, demanding that OpenAI provide a detailed explanation of how the Sky voice was developed. OpenAI paused the use of the Sky voice and stated that it had been recorded by a different, unnamed professional actress and was "never intended to resemble" Johansson, a claim that Johansson and many observers found difficult to reconcile with the surrounding evidence.
Why It Matters
The Johansson incident crystallized a set of issues that extended far beyond one celebrity's voice. It demonstrated how AI companies could create outputs that effectively replicated someone's identity — their voice, their likeness, their brand — while maintaining plausible deniability about the intent to do so.
The episode also revealed the power asymmetry between AI companies and individuals: even a globally famous actress with substantial legal resources struggled to prevent or quickly remedy what she perceived as a violation of her rights. For less prominent individuals — voice actors, musicians, artists — the challenge of defending against AI replication was vastly greater.
Altman's "her" post was perhaps the most telling element: it suggested that the resemblance was not accidental but aspirational, that OpenAI wanted ChatGPT to evoke the AI companion from the film. This desire — to make AI feel personal, intimate, human-like — sits at the heart of the tension between AI capability and ethical boundaries around consent and identity.