You turned on TikTok’s “Restricted Mode” thinking it would protect your child from inappropriate content. New research shows it’s doing the opposite.
What’s happening: A Global Witness investigation revealed that TikTok actively suggests pornographic content to brand-new accounts registered as 13-year-olds – even when the app’s protective “Restricted Mode” is enabled.
Why this matters to all parents: If you’re relying on TikTok’s built-in safety features, you need to know they’re not working. This isn’t about kids seeking out inappropriate content – TikTok’s algorithms are pushing it to them before they’ve even searched for anything.
The bigger picture: This represents a fundamental failure of the parental controls millions of families trust to keep their children safe online, and potentially violates the UK’s Online Safety Act that came into force in July 2025.
Here’s what researchers discovered, what this means for your family, and what you can do right now.
What Parents Need to Know
Researchers from Global Witness created seven fake TikTok accounts posing as 13-year-olds using brand-new phones with no search history. They turned on “Restricted Mode” – the feature TikTok says will protect users from “sexually suggestive content.”
The immediate facts:
- For three of the accounts, sexually explicit search suggestions appeared the moment they tapped the search bar
- Pornographic content was accessible within just two clicks
- Search suggestions included phrases like “very rude babes,” “hardcore pawn clips,” and “TikTok late night for adults”
- The pornography ranged from women exposing themselves to videos of full penetrative sex
- Some explicit content was cleverly edited into innocent-looking clips to bypass filters
Why this matters to parents: This wasn’t a one-off glitch. Global Witness first reported similar findings to TikTok in January 2025. The company claimed they fixed it. When researchers tested again in July 2025 (after the UK’s Online Safety Act came into force), the same problems persisted.
What experts are saying: Media lawyer Mark Stephens CBE states: “In my view these findings represent a clear breach of the Online Safety Act. It’s now on Ofcom to investigate and act swiftly to make sure this new legislation does what it was designed to do.”
Different perspectives: TikTok says it took down more than 90 pieces of content and removed search suggestions after being contacted. They claim to be “reviewing youth safety strategies.” However, the fact that this issue persists months after being reported raises serious questions about whether TikTok’s protective measures can be trusted.
What Other Parents Are Discovering
This investigation confirms what many parents have suspected. TikTok users themselves have been posting screenshots of inappropriate search suggestions, with captions like:
- “Can someone explain to me what is up w my search recs pls”
- “I THOUGHT I WAS THE ONLY ONE”
- “How tf do you get rid of it like I haven’t even searched for it”
What parents are doing: Some families are deleting TikTok entirely after learning that “Restricted Mode” doesn’t provide real protection. Others are implementing stricter oversight, checking their children’s search history and watching content together.
What’s not working: Simply trusting app-based parental controls. This investigation proves that TikTok’s built-in safety features are fundamentally broken – the algorithms are designed to maximize engagement, not protect children.
How This Affects Your Family
For parents of younger teens (13-15):
- Do not rely on “Restricted Mode” to protect your child
- Consider whether your child needs TikTok at all
- If they do use it, regular check-ins on what they’re seeing are essential
- Be prepared to have frank conversations about pornography – they may have already been exposed
For parents of older teens (16-17):
- Discuss this issue openly – they need to understand that algorithms don’t have their best interests at heart
- Talk about the difference between accidentally encountering pornography and actively seeking it
- Help them understand how to recognize when they’re being manipulated by platform design
Practical next steps:
- Check if your child has TikTok and whether “Restricted Mode” is on
- Ask them directly if they’ve encountered inappropriate search suggestions
- Consider third-party parental control software that monitors across multiple apps
- Have age-appropriate conversations about online safety and pornography
Warning signs to watch for:
- Secretive phone use
- Quickly switching apps when you enter the room
- Increased time on TikTok, especially late at night
- Changes in behavior or suddenly using unexpectedly mature language
Sources: BBC, Global Witness