Australian parents are tracking where under-16s are migrating after the December 10 social media ban. Here’s the complete monitoring checklist based on what families are actually seeing.
What Happened on December 10
Australia became the world’s first country to ban social media for everyone under 16. Over one million teen accounts disappeared from Instagram, Facebook, TikTok, Snapchat, YouTube, X, Reddit, Threads, Twitch, and Kick.
Platforms face fines up to AU$49.5 million (£25 million) for failing to prevent under-16s from creating or maintaining accounts. But there are no penalties for teens or parents; it's the platforms that are accountable.
Within hours, VPN searches spiked. Age verification was being defeated. And alternative platforms started gaining thousands of Australian users.
Parents need to know: Where are teens actually going?
The Intelligence Parents Are Sharing
Following my LinkedIn post about Australia’s Day One experience, Geoff Birkbeck shared a monitoring framework he developed for child psychology professionals in Western Australia. Geoff has a background in law enforcement and compliance with a focus on community safety.
Below is an adapted version with added context about why each category matters.
Category 1: Banned Apps (Under-16s Cannot Have Accounts)
If you see any of these installed on your child's device, be aware that under-16 accounts on them are not allowed under Australian law:
- Threads
- TikTok
- Snapchat
- YouTube (full version—YouTube Kids is allowed)
- Twitch
- Kick
- X (formerly Twitter)
The LinkedIn grey area: LinkedIn is technically banned, but the situation is complex. Year 9-12 students (ages 14-18) undertaking VET (Vocational Education and Training) qualifications—including School-Based Apprenticeships & Traineeships (SBATs)—often need LinkedIn profiles or professional networking accounts for mandatory work placements. Schools haven’t resolved how to handle this compliance conflict.
Category 2: Allowed But High Risk—Likely “Workaround” Apps
These platforms aren’t banned, but Australian teens are commonly switching to them. Parents report these appearing on devices within days of the ban:
- Discord — Group chat, voice channels, difficult to monitor
- WhatsApp — Meta-owned but exempt as a messaging app
- Telegram — Encrypted messaging, public channels
- Signal — Private messaging, harder for parents to supervise
- Kik — Known for connecting with strangers
- Amino — Topic-based communities
- Yubo / Monkey / ChatSpin — Omegle-style random video chat apps
- BeReal clones — New photo-sharing apps appearing regularly
- Smaller streaming apps — Trovo and similar platforms
Why this matters: These apps offer similar features to banned platforms (public posting, connecting with strangers, group chats) but fly under the regulatory radar. The eSafety Commissioner says they’ll add platforms to the banned list as they gain popularity, but critics call this “whack-a-mole.”
Category 3: Allowed But Should Be Monitored
These platforms aren’t banned because they’re not classified as “social media,” but kids use them socially. Parents often don’t think of these as social apps—but teens do:
- Messenger / Messenger Kids — Direct messaging without Instagram
- Roblox — DMs, voice chat, and community features. Escaped the ban by agreeing to introduce age verification for chat features this month
- Steam + Steam Chat — Gaming platform with messaging
- Xbox Party Chat — Voice communication during gaming
- PlayStation Party Chat — Similar to Xbox
- Nintendo Switch Online — Voice chat capabilities
- Pinterest — Exempt from the ban
- Houseparty-style video chat apps — Group video calls
- Game-specific chat — Fortnite, Minecraft, and other games with built-in communication
The monitoring challenge: Parents who previously used Instagram’s parental controls to monitor their teens now have less visibility. Gaming platforms rarely offer the same supervision tools.
Category 4: Suspicious Or Less-Known Apps To Watch For
If you see small or unfamiliar apps with these features, investigate them:
- “Anonymous chat”
- “Random video chat”
- “Meet new friends”
- “Live rooms”
- “Community boards”
- “Open servers”
- Any app asking for age but not verifying it
Profile picture red flags:
Apps that allow profiles to be created with these as the main profile picture warrant extra scrutiny:
- No photograph required
- Cartoon avatars
- Animal/scenery/meme images
- AI-generated images
- Crowd scenes
- Pictures of a dubious or questionable nature
Why this matters: Apps that don’t require real photos make it easier for adults to pose as children or for children to hide their identity from parents.
New apps appear constantly. Photo-sharing app Yope gained 100,000 Australian users by word of mouth as the ban approached. ByteDance’s Lemon8 is being promoted as a TikTok backup; both are now on the eSafety Commissioner’s watch list.
What Parents Should Do
Australian parents are navigating this in real time. Here’s what families on the ground are doing:
1. Regular device checks
Look for unfamiliar apps, especially in the “Social” or “Entertainment” categories. Many workaround apps disguise themselves as utilities or games.
2. Ask about chat features in games
Gaming platforms are where much social interaction has migrated. Ask your child: “Who can message you in this game? Can strangers contact you?”
3. Monitor data usage
Sudden spikes in mobile data or Wi-Fi usage can indicate new apps being heavily used.
4. Have ongoing conversations
Some Australian teens report feeling relieved that "the pressure is gone," while others are frustrated. Keep talking about why the ban exists and what's happening with their peers.
5. Watch for VPN apps
VPN companies have been advertising directly to Australian teens. Common VPN apps to look for: NordVPN, ExpressVPN, Surfshark, CyberGhost, ProtonVPN. The government insists platforms can detect VPN usage, but enforcement remains to be tested at scale.
The Safety Paradox Parents Are Facing
The unintended consequence parents are discovering is that children can still watch YouTube, scroll through TikTok, and browse Instagram; they just can't log in.
This means they’re viewing everything without filters or age-appropriate protections. YouTube itself warned parents that its parental controls “only work when your teen is signed in.”
Parents have therefore lost their monitoring tools: the platforms' built-in parental controls, screen time limits, and content filters all require logged-in accounts.
What Happens Next
Malaysia’s under-16 ban takes effect January 1, 2026. Norway implements its age-15 ban mid-2026 and Denmark is finalising its own plans. The European Union is running age verification pilots across five countries.
They’re all watching Australia to see if this actually works.
The questions everyone’s asking:
- Does VPN usage become normalised or remain fringe?
- Do alternative platforms get added to the banned list fast enough?
- Can age verification technology keep pace with determined teens?
- Does removing social media actually improve wellbeing, or just push teens into harder-to-monitor spaces?
Stanford University researchers are tracking affected teens’ mental health, sleep patterns, and behaviour changes over at least two years. The findings will be published for other countries to review.
We’re Tracking This Story
This monitoring framework will evolve as parents see what’s actually happening on the ground. We’ll be updating this resource and tracking Australia’s experiment week by week in our newsletter, Plugged In.
January 8, 2026: We'll publish the first comprehensive four-week analysis: not Day One speculation, but evidence showing where teens actually migrated, which workarounds stuck, and what families are experiencing.
Subscribe at https://wired-parents.beehiiv.com/ to get weekly updates on Australia's ban, global policy developments, and what parents worldwide are deciding.
About this monitoring framework: Developed by Geoff Birkbeck (background in law enforcement and compliance) for child psychology professionals in Western Australia. Adapted here with permission.
About Wired Parents: We track what’s happening with children and technology so you can make informed decisions for your family. Every Thursday: safety updates, new research, and what’s happening worldwide.