If you had sat in a middle-school classroom this winter and watched a kid swipe through an iPad, you might have missed the moment the class lost him. I didn't miss it: I saw a feed that never stopped, and you could feel the attention drain in real time.
Ben Warren logged 13,000 YouTube views on a school Google account.
I read the Wall Street Journal report and my first reaction was disbelief; then the numbers sank in. Between December 2024 and February 2025, a seventh grader in Wichita, Kansas, reportedly racked up roughly 13,000 YouTube views on a school-managed Google account during school hours. A binge on that scale is not about long videos from creators such as MrBeast; it's about YouTube Shorts, swiped like a stream of tiny notifications.
A student in Oregon managed 200 views in a single morning; another watched four hours in one day.
You're not seeing isolated mischief. The Journal found multiple examples: a tenth-grader hit 200 views before lunch, and another child ended up in a treatment program at Boston Children's Hospital after watching for hours every day. These anecdotes stack into a pattern: institutional access plus platform design equals sustained consumption.
How did a student watch 13,000 YouTube videos during school?
The short answer: device provisioning, permissive settings, and human behavior. Schools issue Chromebooks and iPads tied to Google accounts, and many rely on default content rules or soft filtering. Students learn workarounds fast; when the path of least resistance is a never-ending feed, you watch. I’d compare the setup to a Trojan horse: what starts as a tool for learning can carry an attention problem inside.
School-managed devices often carry the same recommendation engine kids use at home.
Google and YouTube are built to keep you watching. Administrators can apply policies, but settings and enforcement vary widely by district. YouTube Shorts are optimized for rapid consumption and algorithmic reinforcement; the feed works like Pac-Man, eating attention pellet by pellet.
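To make "algorithmic reinforcement" concrete, here is a deliberately toy sketch in Python of the loop critics describe: rank whatever resembles what a viewer just finished, and break ties with an engagement proxy. The field names, topics, and scoring are invented for illustration; none of this reflects YouTube's actual signals or code.

```python
# Toy feed ranker: favors topics the viewer just watched, then raw
# engagement. Illustrative only; not YouTube's actual algorithm.
from collections import Counter

def rank_feed(candidates, watch_history):
    """Order candidate clips by similarity to recent watches."""
    topic_counts = Counter(clip["topic"] for clip in watch_history)
    return sorted(
        candidates,
        key=lambda clip: (topic_counts[clip["topic"]], clip["avg_watch_seconds"]),
        reverse=True,
    )

history = [{"topic": "fortnite"}, {"topic": "fortnite"}, {"topic": "memes"}]
feed = [
    {"title": "Homework help", "topic": "math", "avg_watch_seconds": 40},
    {"title": "Clutch win", "topic": "fortnite", "avg_watch_seconds": 28},
    {"title": "Meme dump", "topic": "memes", "avg_watch_seconds": 22},
]
print([clip["title"] for clip in rank_feed(feed, history)])
# ['Clutch win', 'Meme dump', 'Homework help']
```

Even this crude loop buries the educational clip at the bottom; production recommenders are vastly better at the same job.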
Can schools block YouTube on student devices?
Yes, but it's messy. IT teams can use Google Workspace for Education controls, network filters, or third-party MDM and content-filtering tools to limit access. The trade-offs are real: block too much and you break legitimate educational content; block too little and the feed keeps spinning. Amy Warren, Ben's mother and now a Wichita Board of Education member, is wrestling publicly with that balance.
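To show what "messy" means in practice, here is a minimal Python sketch of hostname-based filtering of the kind a school proxy might apply. The blocked domains and the embed carve-out are my assumptions for illustration, not any district's real policy; the point is that the exception that keeps lesson pages working is also the gap a motivated student will probe.

```python
# Minimal hostname blocklist with an educational carve-out.
# Domains and the embed exception are illustrative assumptions.
from urllib.parse import urlparse

BLOCKED_HOSTS = {"youtube.com", "www.youtube.com", "m.youtube.com"}
ALLOWED_PATH_PREFIXES = ("/embed/",)  # e.g., players embedded in courseware

def is_blocked(url: str) -> bool:
    parts = urlparse(url)
    if parts.hostname not in BLOCKED_HOSTS:
        return False
    # Carve-out: let embedded players through so lesson pages still load.
    return not parts.path.startswith(ALLOWED_PATH_PREFIXES)

print(is_blocked("https://www.youtube.com/shorts/abc123"))    # True
print(is_blocked("https://www.youtube.com/embed/lecture42"))  # False
```

Tighten the carve-out and teachers lose embedded videos; loosen it and the Shorts feed is one rewritten URL away.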
Legal and cultural pressure is changing the conversation about platform responsibility.
Last month a 20-year-old, identified in court as Kaley G.M., won a $3 million (roughly €2.8 million) jury award against Google and Meta, claiming addictive harm from their recommendation systems. Google spokesperson José Castañeda said the company plans to appeal and maintained that YouTube is a “responsibly built streaming platform, not a social media site.” That framing matters: platforms are arguing design intent while plaintiffs point to outcomes.
Shorts, Fortnite clips, and influencer headshots are the bait; school time is the feeding ground.
The Wichita case shows what happens when you combine high-engagement content genres (gaming highlight reels, influencer selfies, meme fragments) with institutional access and minimal friction. Kids will follow whatever rewards the algorithm hands them, even if they can't play Fortnite at home. That behavior isn't accidental; it's product design meeting teenage curiosity.
Is YouTube addictive for kids?
Addiction is a clinical term, but research and court cases suggest platforms can produce compulsive patterns, especially in developing brains. Clinics like Boston Children's Hospital are already treating compulsive media use, and lawsuits are putting tech companies under public scrutiny. I'm not declaring every kid addicted, but the convergence of device access, algorithmic curation, and school policies creates a high-risk mix.
Districts have choices, and so do you as a parent or educator.
If you manage devices, audit account settings in Google Workspace for Education, enable SafeSearch and YouTube's Restricted Mode, and review MDM rules for app access. If you're a parent, ask about school policies and request transparency on filtering. You can also press for digital-literacy lessons that explain why a personalized feed is designed to keep you watching.
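If your district can export activity logs, even a crude audit can surface the Wichita pattern early. This is a hedged sketch: the CSV columns (account, date, hour, service, views) are hypothetical stand-ins, since real Google Workspace exports use different schemas.

```python
# Flag accounts with heavy YouTube viewing during class time.
# The CSV schema below is hypothetical; adapt to your actual export.
import csv
from collections import defaultdict

SCHOOL_HOURS = range(8, 15)   # 8:00-14:59; adjust per district
THRESHOLD = 100               # daily views worth a conversation

def flag_heavy_viewers(path):
    daily = defaultdict(int)  # (account, date) -> in-school view count
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["service"] == "youtube" and int(row["hour"]) in SCHOOL_HOURS:
                daily[(row["account"], row["date"])] += int(row["views"])
    return {key: views for key, views in daily.items() if views >= THRESHOLD}

for (account, date), views in flag_heavy_viewers("activity.csv").items():
    print(f"{account} on {date}: {views} views during class")
```

A report like this doesn't fix anything by itself, but it turns a vague worry into a number you can bring to a board meeting.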
We’re watching a tipping point where platform mechanics, school technology, and young attention collide — and the fixes will be technical, policy-driven, and cultural. Who gets to decide what a school-issued screen is for, and who will hold platforms accountable for what those screens become?