I watched a message land on a 13-year-old’s screen at 11:42 p.m. and felt my confidence in a newly passed law slip away. You and I both know a rule on paper doesn’t always stop behavior in practice. That moment made one thing obvious: the ban is porous.
I’ve read the YouthInsight survey for the Molly Rose Foundation and sat with the numbers so you don’t have to squint at them. The short version: Australian teens are telling researchers they’re still on the apps the law was supposed to shut them out of.
A 14-year-old told a pollster she still uses TikTok — what the survey shows
The online survey of 1,050 Australians aged 12–15 was run by YouthInsight in partnership with the Molly Rose Foundation about three months after the law landed. More than 60% of kids who were users before the ban said they still have access to at least one of the platforms now off-limits to under-16s.
Which platforms kept the most users? YouTube, TikTok and Instagram reportedly retained about half of their under-16 audience in Australia. The list the law initially targeted includes TikTok, Facebook, Instagram, Threads, X, Snapchat, YouTube, Reddit, Kick and Twitch.
How are Australian teens bypassing the social media ban?
The survey paints a mix of methods: shared accounts, parents setting up profiles, fudged ages at signup, and tech workarounds. Two-thirds of respondents said the platforms themselves have taken “no action” to remove them, which turns a legal restriction into a paper barrier—easy to ignore.
An investigator counted five million deactivated accounts — what regulators are doing
Australian authorities reported roughly 5 million accounts deactivated or removed, according to AP reporting, but said platform compliance still fell short. Regulators have floated enforcement actions against Meta, Snapchat, TikTok and YouTube and warned of fines if the companies don’t step up.
The law tasks the platforms with identifying and blocking under-16 users or facing penalties pegged at $33 million (about €30.7 million). That financial stick is supposed to prod action, but reporting suggests enforcement so far has been partial.
Is the social media ban enforceable?
Short answer: it’s enforceable on paper, but enforcement faces practical limits. Platforms must build age-detection, verification and removal systems at scale. For now, that machinery isn’t catching this age group, and compliance audits are underway.
A teenager shrugged at a regulator’s warning — how the platforms respond
On the ground, many teens say they haven’t been booted, and regulators describe the platforms’ responses as uneven. Ask the companies, and they point to technical and privacy hurdles. Ask the teens, and they point to ease: the platforms put up no meaningful block.
The law has become a sieve, letting accounts trickle through, and regulators’ under-16 enforcement notices land in what might as well be ghost towns.
I’m not arguing the law was a bad idea; I’m pointing out a gap between intent and effect. YouthInsight’s snapshot is small and skewed by its method: it surveyed kids online, so those who quit the apps entirely wouldn’t appear. Still, it’s an early warning flare.
Tools and figures matter here: YouthInsight’s youth panels, the Molly Rose Foundation’s advocacy, the Office of the eSafety Commissioner’s reviews, and pressure on companies led by Meta, ByteDance (TikTok) and Google/YouTube will shape the next moves. You should be watching both the tech fixes and the public penalties being discussed.
So, if the platforms aren’t policing under-16 users and the penalties aren’t yet a deterrent, what would actually stop a teenager from logging back on—are governments ready to follow through, or will families and schools be left to close the gap?