Berklee AI Course Sparks Student Backlash Over ‘Bots and Beats’

I watched a freshman tape a petition QR code to a bulletin board and then stare at it like someone who'd just found a song they loved turned into static. A professor posted "Bots and Beats" in the course catalog and the room split between curiosity and dread. Within 48 hours the campus had a digital petition, a local news quote, and a trust fracture you could feel in the hallway.

I teach and report on culture-wrenching shifts, and you should know up front: this fight is as much about identity as it is about tools. You’ll see claims, counterclaims, and a messy legal drumbeat — but beneath the headlines there are students trying to protect what they value most: human authorship.

At the course catalog page, a new elective sits between songwriting and music tech

The class is called “Bots and Beats: AI and the Future of Songwriting.” The official blurb says students will “explore how music makers can use the latest AI tools to expand their craft, and how to avoid using those same tools in a way that hinders their craft,” and study “the impact of AI on the music industry (both helpful and harmful), and on the future careers of music makers.”

That description sounds reasonable until you note the assignment list: generate original lyrics, melodies, songs, and recordings in collaboration with AI. Rolling Stone and other outlets have already documented AI creeping into production workflows — often to create samples that sidestep licensing headaches — and Berklee is not inventing a conversation so much as formalizing it.

On a petition page, signatures multiply and frustration becomes a manifesto

Students posted an online petition that has more than 425 names and a short, sharp argument: AI models steal from tens of thousands of artists, rot the industry’s core, and harm the environment. They call for the course to be disbanded and declare, “There is no place for generative AI at art school.”

The anger lands on two fronts. One is ethical: students see generative systems like ChatGPT-style lyric models and generative-audio tools as machines that ingest human work and return approximations. The other is practical: many worry these systems will make certain jobs obsolete and cheapen the craft of songwriting.

Why are Berklee students protesting the AI course?

Because this feels like someone trying to teach you how to forge a signature instead of how to write one. The petition language is blunt: artists’ catalogs were used to train models without consent, and some students equate that to theft. That belief accelerated when news outlets reported lawsuits against AI music companies such as Suno and Udio — companies tied in the public mind to scraped catalogs and major-label complaints from organizations like the RIAA and Koda.

Can AI replace songwriters?

No algorithm has human history, lived pain, or a messy favorite record collection to riff off — yet. But AI can produce usable scaffolding: hooks, stems, demo melodies. For producers and studios it's another tool. For creators still learning the craft, it can be a shortcut that lets their own skills atrophy if they let it do the heavy lifting.

Is Berklee banning generative AI?

Not remotely. The administration told WBZ that as “an artist-first institution” it has a responsibility to prepare students for technologies that affect creative industries. That’s a line you’ll hear from many schools: teach the tool and the ethics, don’t bury your head in the sand.

In classrooms and comments, the course instructor’s ties sharpen the debate

Ben Camp is listed as the instructor. Their LinkedIn profile lists an advisory role with Suno, a generative-music platform that has faced legal challenges and accusations of mass scraping. Camp also teaches a class called "Stealing from the Masters," which some see as ironic and others view as a frank lesson in influence and appropriation.

There's obvious friction when an instructor advises a company that major record labels have sued. To students, that looks like a conflict of interest. To others, it looks like bringing industry practice into the classroom. I don't pretend to be neutral about that tension — you want mentors who have both craft and conscience.

AI at a music school can feel like handing a painter a photocopier instead of a new brush.

Outside the legal filings, the argument boils down to careers and craft

A former student commented on the petition: if Berklee wants graduates to succeed, focus on job placement rather than training with tools that “steal from artists and make producers irrelevant.” That’s a raw, career-minded worry. Students are right to ask how new workflows map to employment in an industry still governed by licensing, publishing, and performance income.

Labels and rights organizations have started turning to courts. Lawsuits against Suno and Udio, and complaints by entities like Koda, make the legal environment unpredictable. If you’re learning to enter that market, you want clarity about what tools are safe and what practices could land you in the middle of a rights dispute.

Right now the music industry looks like a cracked mirror: every song reflects back many hands, and machine-made facsimiles blur where one author ends and another begins.

A practical middle path, or a civil war on campus?

I think you should expect an ongoing struggle: students pushing for curricular restraint, faculty arguing for pedagogy that includes prevailing tools, and industry lawyers running parallel plays. You and I both know schools set the terms of professional preparation — Berklee chose to teach the tools and risk the backlash rather than pretend the tools don’t exist.

Where you land depends on whether you believe AI is a hammer that amplifies human intent or a magnet that will erode the value of authorship. The campus debate matters because it will shape how a generation of songwriters is trained, hired, and defended in court and culture.

So what side of the recording glass will you stand on when the next big track arrives: human-made, AI-assisted, or some uneasy marriage of both?