The UK now has the power to restrict children’s social media

The UK government now has the legal power to restrict how children under 16 use social media — not just in principle, but in law. The Children’s Wellbeing and Schools Act received Royal Assent on 29 April 2026, and with it came a new power to introduce age-based or feature-based restrictions for children on social media platforms without needing to pass a new Act of Parliament. The machinery is in place. What happens next is a question of months, not years.

What the Act actually does

The new Act does not introduce an immediate social media ban for under-16s. What it does is create a legal framework that allows the Secretary of State to introduce restrictions through secondary legislation — meaning regulations can be laid quickly, without the need to pass a new Act of Parliament each time, once the government decides what those restrictions should be.

The government has committed to publishing a progress report within three months of Royal Assent, with regulations to be laid within the 12 months that follow. Ministers have signalled they want to move faster, aiming to have the first regulations in place by the end of 2026. The Liberal Democrats argued during debates that even this timeline was too slow; the government's position is that the consultation process — which closes 26 May 2026 — needs to conclude before the shape of any restrictions is finalised.

The Act also does something immediately concrete: all schools in England are now legally required to follow the government’s mobile phone ban guidance. Previously this was advisory. It is now statutory. Teachers and school leaders have the clarity they have been asking for.

What restrictions might look like

The government has been clear that some form of restriction is coming for under-16s, regardless of what the consultation finds. What remains to be determined is which features or which platforms will be targeted.

The options on the table — laid out in the “Growing Up in the Online World” consultation — include a minimum age for social media, restrictions on specific addictive design features (infinite scroll, autoplay, algorithmic recommendations), overnight curfews, and raising the digital age of consent from 13. The consultation has already received over 45,000 responses, including nearly 6,000 from young people themselves.

The Crime and Policing Act, which also received Royal Assent on 29 April, separately grants new powers to bring more AI chatbots within the scope of the Online Safety Act — and criminalises the creation and supply of so-called “nudification tools,” software designed to generate non-consensual intimate images.

Where Ofcom fits in

At the same time as the Act was completing its parliamentary passage, Ofcom’s deadline for the six major platforms — Facebook, Instagram, TikTok, Snapchat, YouTube, and Roblox — to respond to its four child safety demands passed on 30 April 2026. Ofcom has said it will publish a report on those responses this month, alongside new research on how children’s online experiences have changed during the first year of the Online Safety Act.

If Ofcom is not satisfied with what the platforms have committed to, it has said it will take enforcement action — fines of up to £18 million or 10% of global turnover, whichever is greater. The two processes — the new Act creating powers for the government, and Ofcom enforcing existing duties on platforms — are running in parallel and are likely to feed into each other.

What to do today

Nothing changes on your child’s phone today. But there are two things worth doing now.

If you want to have a say in what the restrictions look like, respond to the consultation before 26 May 2026. It takes about 10 minutes and is open to anyone in the UK — parents, young people, and organisations. The government has said the consultation findings will directly shape the regulations. Search “Growing Up in the Online World consultation” to find the response form.

Check your child’s date of birth on every platform they use. Ofcom’s own data shows that the age-based protections platforms offer — Teen Account settings on Instagram, Family Pairing on TikTok, restricted mode on YouTube — are only applied when the platform knows a user is a child. If your child signed up with a false birthday to get around the age limit, the protections are not active on their account. Log in to each platform, go to account settings, and check.

The legal framework for meaningful change is now in place. The next few months — the Ofcom report, the close of the consultation, the government’s progress statement — will tell us what that change actually looks like in practice. I’ll be covering each step as it happens.

Sources

GOV.UK — Children’s Wellbeing and Schools Act receives Royal Assent, 29 April 2026
Bird & Bird — Legal briefing on the Act and AI chatbot powers, 1 May 2026
Ofcom — Keep underage children off your platforms, 12 March 2026
