Former NPR Host Sues Google for Allegedly Copying His Voice for AI

David Greene stood frozen, the span of his long career flashing before him. The distinctive timbre of his voice echoed through his mind as he learned of a disturbing development: Google was allegedly using his voice without consent. For a broadcaster who had dedicated decades to his craft, the realization hit hard: his signature style had been reduced to a mere file for corporate use.

The controversy ignited when Google introduced Audio Overviews for its research tool, NotebookLM, in late 2024. This feature allowed users to transform detailed notes and documents into concise podcasts. What caught Greene’s attention was the male co-host of these AI-generated episodes, a voice that appeared to echo his own. In response, Greene filed a lawsuit against the tech giant, alleging that it never sought his permission or offered any compensation.

“Without his consent, Google sought to replicate Mr. Greene’s distinctive voice—a voice made iconic after decades of celebrated radio and public commentary—to create synthetic audio products that mimic his delivery, cadence, and persona,” states the complaint filed in a Santa Clara County court.

For nearly ten years, Greene co-hosted NPR’s revered Morning Edition, and he currently helms KCRW’s Left, Right & Center podcast. Public reception of Google’s new audio feature was initially enthusiastic; critics remarked on how the AI voices sounded surprisingly human-like. Forbes described the output as “eerily human,” while WIRED noted the authenticity in the cadence and performance.

Despite Google touting NotebookLM as one of its “breakout AI successes,” Greene’s lawsuit argues that the company misappropriated his identity and career to boost its profits, with no compensation to him. He first became aware of the uncanny resemblance when colleagues alerted him to it, and he later enlisted an AI forensics firm, whose tests indicated a 53% to 60% likelihood that the AI voice was based on his own.
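
The complaint does not spell out how that likelihood was calculated, but voice-comparison analyses of this kind generally work by converting recordings into numerical feature vectors and measuring how close they are. The sketch below is purely illustrative and is not the forensic firm’s method: it uses time-averaged MFCC features and cosine similarity, and the audio file names are hypothetical.

```python
# Illustrative only: a crude voice-similarity check using time-averaged MFCCs
# and cosine similarity. Real forensic speaker comparison relies on trained
# speaker-embedding models and statistical calibration; this is an
# assumption-laden sketch, and the file names below are hypothetical.
import librosa
import numpy as np


def voice_features(path: str, n_mfcc: int = 20) -> np.ndarray:
    """Load an audio file and return its time-averaged MFCC feature vector."""
    audio, sample_rate = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=audio, sr=sample_rate, n_mfcc=n_mfcc)
    return mfcc.mean(axis=1)


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two feature vectors, in the range [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


if __name__ == "__main__":
    # Hypothetical recordings: a clip of the human host and a clip of the AI host.
    human = voice_features("greene_broadcast_clip.wav")
    synthetic = voice_features("notebooklm_male_host_clip.wav")
    print(f"Voice similarity score: {cosine_similarity(human, synthetic):.2f}")
```

A higher score suggests the two recordings share spectral characteristics, though a single number like this is far weaker evidence than the multi-model analysis a forensic firm would typically run.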

“These allegations are baseless,” asserted Google spokesperson José Castañeda in a response to Gizmodo. “The sound of the male voice in NotebookLM’s Audio Overviews is based on a paid professional actor Google hired.” But Greene’s concerns echo a growing anxiety about the use of artistic and intellectual property within the AI realm, where the boundaries of consent and compensation remain murky.

The trend becomes particularly unsettling when voices like Greene’s can be manipulated to say anything at all. Take the case of actress Scarlett Johansson, who found herself embroiled in a similar battle with OpenAI after the company allegedly replicated her voice without approval, despite her earlier refusal to collaborate.

How is AI affecting creative industries?

As AI technology expands into content creation, the ethical implications mount. Models require vast datasets for training, yet the lack of regulatory guidelines offers scant protection for those whose creations fuel these systems. Artists like Greene, who have invested years mastering their craft, risk becoming mere inputs to a technological assembly line.

As we navigate this crisis, one question arises: should creative professionals surrender their voices and likenesses to technology without a fight, or is it time to reconsider who truly owns the sound of our identities?