Meta Begins Removing Australian Users Under 16 in World-First Social Media Crackdown

SYDNEY – In a landmark move for online safety regulation, Meta has begun removing hundreds of thousands of Australian children from its social media platforms, enforcing a controversial new law one week before its official start date.
On December 4, the company began notifying users it identified as being between 13 and 15 years old and shutting down their accounts. An estimated 350,000 Instagram accounts and 150,000 Facebook accounts are affected. Because Threads requires an Instagram account, access to it is also blocked.
This action precedes the official commencement of Australia's world-first "Social Media Minimum Age" law on December 10. From that date, platforms that fail to take "reasonable steps" to prevent users under 16 from having accounts face fines of up to A$49.5 million (US$33 million). The list of affected platforms includes TikTok, Snapchat, X, YouTube, Reddit, Kick, and Twitch.
Communications Minister Anika Wells said the law aims to protect "Generation Alpha" from being targeted by harmful algorithms, which she described as a "dopamine drip". The government cites research showing high rates of exposure to harmful content and cyberbullying among Australian children on social media.
Key Facts
- Action Taken: Meta began deactivating accounts of users under 16 on December 4, 2025.
- Official Law Start: The Social Media Minimum Age law formally takes effect on December 10, 2025.
- Platforms Affected: Facebook, Instagram, Threads, TikTok, Snapchat, X, YouTube, Reddit, Kick, and Twitch must comply.
- Penalty for Non-Compliance: Companies face fines of up to A$49.5 million.
- User Recourse: Teens can download their data before deactivation. Those wrongly identified can submit a "video selfie" or ID for review.
- Legal Challenge: The Digital Freedom Project has filed a challenge in Australia's High Court.
Verified
A Meta spokesperson stated: "While Meta is committed to complying with the law, we believe a more effective, standardised, and privacy-preserving approach is needed". The company advocates for app stores, not individual platforms, to verify age at the point of download.
What We Know So Far
- Meta's compliance is described as an "ongoing and multi-layered process".
- YouTube, which was originally exempt but later included, has criticized the law as "rushed," arguing it could make the platform "less safe" by stripping away parental controls for account holders.
- Some apps like WhatsApp, Messenger, and Roblox are currently exempt from the ban.
- The eSafety Commissioner will require monthly reports from platforms on removed accounts for six months starting December 11.
- The law has sparked global interest, with countries like New Zealand and Malaysia considering similar measures.
