The EU just put a social media age limit on the table

The European Commission could propose an EU-wide social media age limit for children as early as this summer. That is what Ursula von der Leyen, the Commission’s President, told delegates at the European Summit on Artificial Intelligence and Children in Copenhagen on 12 May 2026. It is the clearest signal yet that one of the world’s biggest regulators is moving towards the same kind of restriction Australia introduced in December 2025.

For parents anywhere in the world, this matters less because of what the EU does within its own borders and more because of what it signals about where the platforms your child uses are heading.

What von der Leyen actually said

Von der Leyen said the Commission is waiting for findings from its Special Panel of experts on Child Safety Online, but added: “Without pre-empting the panel’s findings, I believe we must consider a social media delay. Depending on the results, we could come forward with a legal proposal this summer.”

She also said discussions about a minimum age for social media “can no longer be ignored.” That is a significant shift in tone from a Commission that had, until now, stopped short of endorsing age-based bans, preferring instead to focus on platform duties.

Two other things from the speech matter:

The EU’s age-verification app is technically ready. Built on the same model as the Digital COVID Certificate and using a method that confirms whether someone is above an age threshold without revealing their identity, it has been formally recommended to member states. Apple and Google have been instructed to integrate it at the operating-system level.

The Digital Fairness Act, expected later this year, targets a separate problem: how platforms are designed to keep users scrolling. The proposed law would restrict what von der Leyen called “addictive design”: endless scroll that never offers a stopping point, autoplay that starts the next video before you decide to keep watching, push notifications engineered to pull you back into the app. None of these are illegal now. The Digital Fairness Act would make them regulated, with platforms required to remove or limit them.

This is separate from the age question. It would apply to every user, but children are most affected because they have the least ability to resist these patterns. She named TikTok directly.

Where this fits with everything else

The EU is not acting in isolation. Australia’s under-16 ban came into force five months ago. France approved an under-15 ban in January 2026. Spain, Denmark, Slovenia, Austria, Italy, and Ireland are all drafting national rules. The European Parliament has separately called for a uniform 16-year minimum across the bloc.

Meanwhile, the Commission has open investigations against Meta, TikTok, X, and Snap under the Digital Services Act, focused on how those platforms handle minors. Some are expected to produce findings within the next twelve months.

Why this matters wherever you live

The EU is one of the largest single markets in the world for the platforms your child uses. When the EU passes a rule that affects how Instagram, TikTok, YouTube, Snapchat, or X serve minors, those platforms rarely build a separate product for every jurisdiction. The default tends to become global, because running parallel systems is expensive. This is what happened with the GDPR. It is what is now happening with the Digital Services Act.

That does not mean a global under-16 ban is coming. It means the rules in the largest single market are moving from “platforms must mitigate risks to children” towards “platforms must keep children off.” If you have been watching Australia and wondering whether what is happening there is an outlier, this week’s news suggests it is not.

What to do this week

Check what’s already there. Before the platforms change anything in response to this, get familiar with the parental controls and teen settings that already exist on the apps your child uses: Instagram’s Teen Accounts, TikTok’s Family Pairing, Snapchat’s Family Centre, YouTube’s supervised accounts, Roblox’s parental controls. If the Digital Fairness Act passes, the defaults will likely shift. Knowing the starting point makes the changes easier to spot.

Watch the Digital Fairness Act separately. The age limit is the headline. The Digital Fairness Act is the substance. It is more likely to land first, and it would change how the platforms your child uses actually work, not who is allowed on them.

Use this as a conversation prompt. “Why do you think governments are talking about banning social media for kids?” is a useful question for a child aged ten and up. The answer should not start with “because it’s bad for you.” It should start with “what do you think?” Children who already use these platforms have views worth hearing before any rule reaches them.

What’s still unknown

The expert panel’s findings have not been published. The age threshold is unconfirmed: the European Parliament wants 16, but the Commission has not committed to a number. The Digital Fairness Act’s final scope is still being worked out. Any legal proposal would still need to pass through the EU’s legislative process, which is rarely fast.

Expect platform announcements over the summer. Meta did this in 2024 with Teen Accounts. Roblox is doing it now. Some of what arrives will contain real changes worth knowing about. We’ll keep tracking it.

Sources: European Commission — Keynote address by President von der Leyen at the European Summit on Artificial Intelligence and Children, 12 May 2026

Euronews — Von der Leyen opens door to EU-wide social media ban for children, 12 May 2026

The Next Web — Ursula von der Leyen pushes EU-wide social-media age protections for children, 12 May 2026
