I was on a call that felt suddenly small and sharp. You heard the sentence, too: “If they don’t have the ability in their contract to remove their byline, we’re going to use their name.” I watched names — and trust — get repackaged in real time.
At a staff meeting, a manager read a policy and shrugged — McClatchy’s rollout, reported by TheWrap, looks like the start of a legal and ethical scramble
I’ve been tracking newsroom AI for years, and this felt different. McClatchy, the owner of papers like the Centre Daily Times, the Sacramento Bee and the Miami Herald, is piloting a Claude-based tool called the content scaling agent (CSA).
TheWrap’s Corbin Bolies reviewed internal materials and examples: short AI-produced pieces credited in a way that puts human names on work the system assembled. One sample read, “Reporting by [author redacted]. Produced with AI assistance.” Another at a union paper read “Edited by [editor redacted]. Story produced with AI assistance.”
The Centre Daily Times ran a two-paragraph AI summary with a link to a human story — what happened to the original reporting?
You open the short piece and it’s tidy: two paragraphs, five bullet-point highlights, and a gateway link to the full, 1,200-word report with six graphics. On paper that looks efficient. If you’re the reporter, it can feel like your work has been photocopied and trimmed without permission, a paper trail turned ghost story.
Editors describe the CSA as a “writing partner” that handles mechanical adaptation so journalists can focus on judgment and voice. Reporters hear something else: a system that scales content for different audiences and, in some cases, inherits their byline.
Can a newspaper put your name on AI-written articles without permission?
Short answer: it depends on contracts and union rules. At McClatchy, union grievances argue the CSA was introduced without the required notice for “major technological change.” Non-union outlets, like the Centre Daily Times, are more exposed. Union papers, like the Miami Herald and the Sacramento Bee, show varying byline language: one keeps the author’s name in the credit line with reworded attribution, another omits the byline entirely.
A union filed grievances after the rollout — the line between process and power is shrinking
I sat through union calls where attorneys flagged contract language that requires notice before big tech changes. When managers answer, “If they don’t have the ability in their contract to remove their byline, we’re going to use their name,” you can hear the legal posture shifting toward operational convenience.
That posture matters. Unions say McClatchy didn’t properly notify them. Company staffers say the tool automates adaptations and saves time. The tension is not just legal: it’s about trust, career credit, and accountability for factual accuracy when an AI composes or compiles text.
What rights do journalists have over AI-generated content?
There’s no single answer. Rights live in contracts, union agreements, and newsroom policies. Where unions are strong, reporters can demand negotiation and safeguards. Where they aren’t, companies can set byline rules that bend credit toward the publisher.
The technology itself shows up in the language — Claude, CSA, “produced with AI assistance” — but language masks labor
The CSA’s pitch: create summaries of varying lengths and versions for different audiences. In practice, that can mean a reporter’s longform work is reduced to a few AI-crafted paragraphs carrying the reporter’s name. It can feel like a factory line for bylines.
Anthropic’s Claude and tools from other firms are already in newsrooms as assistants. The difference between assistance and replacement comes down to policy, transparency, and who owns the editorial choices.
How is McClatchy using AI in newsrooms?
According to TheWrap, McClatchy tested the CSA across properties. Internal documents call it a “writing partner” that does the mechanical lifting. Examples show AI summaries linked back to original stories; byline language varies by title and union status. Grievances allege the rollout violated collective bargaining obligations.
Reporting still matters — and so does perception — outside attorneys and engineers, readers and sources react
You and I judge news by bylines. Names carry accountability. When an editor pastes a human name onto a piece the system created, readers assume a reporter did the reporting. Sources expect follow-up. Trust can fray when credits misrepresent labor.
Companies argue efficiency and scale. Reporters argue for credit and control. The question for publishers is whether speed and reach are worth trading the human signal that a byline provides.
I’ve linked to Corbin Bolies’ reporting at TheWrap because the documentation matters; his examples ground an abstract debate in newsroom reality. McClatchy did not respond to requests for comment before Bolies published, and the unions filed grievances at the Miami Herald, Sacramento Bee and Kansas City Star.
The technology will keep moving. The legal fights will reshape contracts. But what happens to the public’s trust when a human name appears on work assembled by an AI — and who answers when errors slip through?