Your teen uses Facebook and Messenger, and you’re wondering whether the safety features you set up are actually doing anything.
What happened: Meta announced on 25 September 2025 that Teen Accounts are now rolling out globally to all teens using Facebook and Messenger, after initially being available only in the US, UK, Australia, and Canada. The accounts, which first launched on Instagram last autumn, automatically place teens into an experience designed to limit inappropriate content and unwanted contact. Teens under 16 need their parents’ permission to change any safety settings. Meta says hundreds of millions of teens are now in Teen Accounts across Instagram, Facebook, and Messenger.
Read more: Teen Accounts Expand to Facebook and Messenger with New Protections – Meta
What’s changing:
Teens will automatically be placed into Teen Accounts with built-in restrictions, regardless of where they live. They can only receive messages from people they follow or have messaged before, significantly reducing random contact from strangers. Only friends can see and reply to their stories, and tags and mentions are limited to people they follow or have as friends. Teens receive reminders to leave the platform after an hour of use and are automatically placed into “Quiet Mode” at night, which mutes notifications and limits activity.
For teens under 16, any changes to these safety settings require parental permission first. Meta uses AI to detect teens who may be lying about their age, analysing factors like who follows them, who they follow, and what content they interact with. The company claims 97% of teens aged 13-15 have kept their default restrictions on Instagram, suggesting most aren’t trying to bypass the protections. Meta also announced a School Partnership Programme allowing US schools to report bullying and safety concerns directly to Instagram for priority review within 48 hours.
What parents are saying:
Some feel relieved that protections are being standardised across all Meta platforms, as managing different safety settings for each app was confusing. Others remain sceptical, given recent reports showing that despite Teen Accounts, young users can still come across self-harm posts and other harmful content on Instagram. Parents are questioning whether these automatic protections actually work or are mainly for show, especially after a report on Instagram’s safety tools found 64% of the protections ineffective.
What to consider:
If your teen uses Facebook or Messenger, they should automatically be moved into a Teen Account with these restrictions. Check with them whether they’ve noticed any changes, particularly in who can message them or see their content. Because teens under 16 need parental permission to change settings, you should receive a notification if your child tries to loosen protections – pay attention to those requests. Meta’s AI age detection is improving, but determined teens can still potentially bypass it by creating accounts with false birthdates before the systems catch them.
Related: Instagram parental controls don’t work | FTC AI chatbot inquiry