UK Regulators Tell Social Media Platforms ‘Prove You’re Protecting Children’

If your child uses Instagram, TikTok, Snapchat, YouTube, Roblox, or Facebook, the UK’s two most powerful digital regulators have just told those platforms they are not doing enough to keep children safe. And they have given them a deadline to respond.

On 12 March 2026, Ofcom and the Information Commissioner’s Office (ICO) jointly wrote to six of the biggest platforms used by children in the UK. The letters set out four demands and gave the companies until 30 April to explain what they intend to do. Ofcom will report publicly on their responses in May and has said it will take enforcement action if it is not satisfied.

This is one of the most significant regulatory interventions under the UK’s Online Safety Act so far. Here is what they said, what it means, and what parents should take from it.

What Parents Need to Know

Ofcom’s chief executive, Dame Melanie Dawes, was direct in her assessment. She said these platforms are household names, but they are failing to put children’s safety at the heart of their products. She described a gap between what tech companies promise in private meetings with the regulator and what they are actually doing in public.

The letters set out four specific demands.

Enforce minimum ages properly. Ofcom’s own research shows that 72% of children aged 8 to 12 are accessing platforms that have a minimum age of 13. Self-declaration, where a child simply enters a date of birth, is not working. Both Ofcom and the ICO are calling on platforms to implement proper age assurance using technologies such as facial age estimation, digital ID verification, or photo-ID matching. The ICO’s open letter stated plainly that platforms must move beyond relying on children to self-declare their ages.

Stop strangers contacting children. Ofcom wants failsafe grooming protections, meaning strict controls that prevent unknown adults from contacting children. This includes using age assurance to verify the ages of users who try to make contact with younger accounts.

Make algorithms safer for children. Ofcom described algorithms as children’s main pathway to harm online. The regulator is issuing legally binding information requests to large platforms, requiring them to explain how their recommendation systems work and what safeguards are in place to prevent children from being shown harmful content.

Stop testing AI products on children. New AI tools are being launched regularly and used widely by children, without parents knowing whether they have been tested for safety. Ofcom expects platforms to notify the regulator that they have assessed the risk of significant updates before those updates are deployed. This is already a legal requirement under the Online Safety Act.

What This Means for Your Family

This is the first time UK regulators have publicly named the biggest platforms and told them, in effect, that they are failing. Ofcom has investigated nearly a hundred services since the Online Safety Act came into force last year. It has taken enforcement action against some smaller sites, fined a nudification site for lacking age checks, and opened a major investigation into X over the Grok AI chatbot generating sexualised images.

But the household names, the platforms your children actually use every day, have until now faced pressure behind closed doors. That changed on 12 March. Ofcom published its demands publicly and is encouraging the platforms to publish their responses.

For parents, the immediate takeaway is that the protections these platforms claim to offer are not yet being enforced consistently. If your child is under 13 and using Instagram, TikTok, YouTube, or Snapchat, the regulator’s own data says there is a strong chance they got through with a false birthday. The protections designed for teen accounts may not be active on their account at all.

What you can do now: Check every platform your child uses and confirm that their account is registered with the correct date of birth. If it is not, the age-appropriate protections those platforms say they provide may not be applied. On Instagram, this means Teen Account settings may be inactive. On YouTube, it means restricted mode and supervision features may not be triggered. On TikTok, it means Family Pairing protections and the default 60-minute time limit for under-18s may not apply.

This is not a permanent fix. The regulators are clear that the responsibility lies with the platforms, not with parents. But until the platforms act, checking the basics is the most practical thing you can do.

What Happens Next

The platforms have until 30 April 2026 to respond. Ofcom will publish a report on their responses in May, alongside new research on how children’s online experiences have changed during the first year of the Online Safety Act.

If Ofcom is not satisfied, it has several options. It can issue fines of up to £18 million or 10% of global turnover, whichever is greater. It can take business disruption measures. It can also strengthen the regulatory requirements under its industry codes to force further change.

Separately, the UK government’s “Growing Up in the Online World” consultation remains open until 26 May 2026. That consultation is asking whether the UK should go further, including a possible ban on social media for children under 16, restrictions on addictive design features, and raising the digital age of consent from 13. The results of the Ofcom enforcement and the consultation are likely to feed into new legislation, potentially as early as autumn 2026.
