Children Are Being Sent Porn on TikTok Despite “Restricted Mode”

You turned on TikTok’s “Restricted Mode” thinking it would protect your child from inappropriate content. New research shows it’s doing the opposite.

What’s happening: A Global Witness investigation revealed that TikTok actively suggests pornographic content to brand-new accounts registered as 13-year-olds – even when the app’s protective “Restricted Mode” is enabled.

Why this matters to all parents: If you’re relying on TikTok’s built-in safety features, you need to know they’re not working. This isn’t about kids seeking out inappropriate content – TikTok’s algorithms are pushing it to them before they’ve even searched for anything.

The bigger picture: This represents a fundamental failure of the parental controls millions of families trust to keep their children safe online, and potentially violates the UK’s Online Safety Act that came into force in July 2025.

Here’s what researchers discovered, what this means for your family, and what you can do right now.


What Parents Need to Know

Researchers from Global Witness created seven fake TikTok accounts posing as 13-year-olds using brand-new phones with no search history. They turned on “Restricted Mode” – the feature TikTok says will protect users from “sexually suggestive content.”

The immediate facts:

  • For three accounts, sexually explicit search suggestions appeared the instant they clicked the search bar
  • Pornographic content was accessible within just two clicks
  • Search suggestions included phrases like “very rude babes,” “hardcore pawn clips,” and “TikTok late night for adults”
  • The pornography ranged from women exposing themselves to full penetrative sex videos
  • Some explicit content was cleverly edited into innocent-looking clips to bypass filters

Why this matters to parents: This wasn’t a one-off glitch. Global Witness first reported similar findings to TikTok in January 2025. The company claimed they fixed it. When researchers tested again in July 2025 (after the UK’s Online Safety Act came into force), the same problems persisted.

What experts are saying: Media lawyer Mark Stephens CBE states: “In my view these findings represent a clear breach of the Online Safety Act. It’s now on Ofcom to investigate and act swiftly to make sure this new legislation does what it was designed to do.”

Different perspectives: TikTok says it took down more than 90 pieces of content and removed search suggestions after being contacted. They claim to be “reviewing youth safety strategies.” However, the fact that this issue persists months after being reported raises serious questions about whether TikTok’s protective measures can be trusted.


What Other Parents Are Discovering

This investigation confirms what many parents have suspected. TikTok users themselves have been posting screenshots of inappropriate search suggestions, with captions like:

  • “Can someone explain to me what is up w my search recs pls”
  • “I THOUGHT I WAS THE ONLY ONE”
  • “How tf do you get rid of it like I haven’t even searched for it”

What parents are doing: Some families are deleting TikTok entirely after learning that “Restricted Mode” doesn’t provide real protection. Others are implementing stricter oversight, checking their children’s search history and watching content together.

What’s not working: Simply trusting app-based parental controls. This investigation proves that TikTok’s built-in safety features are fundamentally broken – the algorithms are designed to maximize engagement, not protect children.


How This Affects Your Family

For parents of younger teens (13-15):

  • Do not rely on “Restricted Mode” to protect your child
  • Consider whether your child needs TikTok at all
  • If they do use it, regular check-ins on what they’re seeing are essential
  • Be prepared to have frank conversations about pornography – they may have already been exposed

For parents of older teens (16-17):

  • Discuss this issue openly – they need to understand that algorithms don’t have their best interests at heart
  • Talk about the difference between accidentally encountering pornography and actively seeking it
  • Help them understand how to recognize when they’re being manipulated by platform design

Practical next steps:

  1. Check if your child has TikTok and whether “Restricted Mode” is on
  2. Ask them directly if they’ve encountered inappropriate search suggestions
  3. Consider third-party parental control software that monitors across multiple apps
  4. Have age-appropriate conversations about online safety and pornography

Warning signs to watch for:

  • Secretive phone use
  • Quickly switching apps when you enter the room
  • Increased time on TikTok, especially late at night
  • Changes in behavior or sudden maturity in language

Sources: BBC, Global Witness
