EU Accuses TikTok of Addictive Design That Harms Children

On February 6, 2026, the European Commission issued preliminary findings that TikTok breached the Digital Services Act by using addictive design features that harm children and vulnerable adults. The findings target infinite scroll, autoplay, push notifications, and TikTok’s highly personalised recommender system, which regulators say put users’ brains on “autopilot” and encourage compulsive behaviour. If the findings are upheld, TikTok could face fines of up to 6% of its global annual turnover. The company calls the accusations “categorically false and entirely meritless.”

This represents a different approach from that of countries banning social media for under-16s. Instead of restricting access by age, the European Commission is targeting how platforms are built, arguing that certain design features should be disabled or redesigned to reduce harm.


What the Commission Found

The investigation examined TikTok’s internal risk assessments, company data, and scientific research on behavioural addiction. Regulators concluded that TikTok failed to adequately assess how its addictive features could harm users’ physical and mental wellbeing. By constantly rewarding users with new content, certain design features fuel the urge to keep scrolling and shift users into autopilot mode; scientific research suggests this may lead to compulsive behaviour and reduced self-control.

TikTok disregarded important indicators of compulsive use, including the time minors spend on the app at night, the frequency with which users open the app, and other potential addiction signals. A French parliamentary report found that 8% of 12–15-year-olds spend more than five hours daily on TikTok; a Danish study found children as young as eight using it for roughly two hours a day on average; and a Polish study identified TikTok as the platform most used after midnight by 13–18-year-olds.


Why Current Safeguards Don’t Work

The Commission found that TikTok’s existing tools do not effectively reduce the risks from addictive design. The Daily Screen Time feature lets users set limits and receive notifications when those limits are reached, and a one-hour limit is set automatically for users aged 13 to 17. Regulators said these safeguards are ineffective because the warnings are “easy to dismiss” and introduce so little friction that users bypass them within seconds.

Parental controls through the “Family Pairing” tool let parents customise safety settings, set screen-time limits, receive activity reports, and restrict search terms. The Commission said these limits fail because they “require additional time and skills from parents to introduce the controls,” leaving parents to shoulder the burden of counteracting design features specifically engineered to maximise engagement.


What the Commission Wants Changed

Regulators want TikTok to disable key addictive features, such as infinite scroll over time, and to implement effective screen-time breaks, including after midnight. The Commission said there should be mandatory screen-time limits and night-time lock-outs to prevent sleep deprivation. These changes would fundamentally alter how the platform works, rather than relying on optional tools users easily bypass.

The EU regulator specifically noted that combating behavioural addiction among minors is a mandatory component of risk assessment under the Digital Services Act. In its view, TikTok ignored widespread evidence of excessive use, failed to properly assess these risks, and did not properly mitigate the mental health risks on its platform.


How This Differs from Bans

Countries like Australia, France, and Spain are implementing age-based bans that restrict who can access social media. The European Commission is targeting how platforms operate, arguing that design choices themselves create harm regardless of user age.

This approach questions whether engagement loops should be weakened by design through stronger screen-time breaks, reduced notification pressure, and changes to how recommendations amplify repeated viewing. If the Commission’s legal theory holds, it could signal a broader EU standard where certain engagement mechanics are acceptable only if platforms prove they’ve tested and implemented effective safeguards that reduce harm at scale.

Infinite feeds, autoplay, and algorithmic recommendations are common across platforms, which means TikTok’s case could set precedent for how other social media companies operate in Europe.


What Happens Next

These are preliminary findings, meaning no fines or penalties have yet been imposed. TikTok now has the right to review the Commission’s findings and respond in writing with its own proposed solutions. The Commission will also consult the European Board for Digital Services, an independent advisory group of national regulators that advises on enforcement of the DSA.

A TikTok spokesperson said the Commission’s preliminary findings present “a categorically false and entirely meritless depiction of our platform, and we will take whatever steps are necessary to challenge these findings.” The company denies that its design features create harm and argues that its safeguards are effective.

If the Commission ultimately confirms non-compliance, the Digital Services Act allows fines of up to 6% of global annual turnover and binding orders requiring product changes. This means the EU could force TikTok to re-engineer the features that keep people scrolling, not just disclose how they work.


What Parents Should Know

The Commission’s investigation focuses on design features common across social media platforms. Whether regulators succeed in forcing changes to infinite scroll, autoplay, and algorithmic recommendations will determine whether this approach spreads beyond TikTok to other platforms children use.

The preliminary findings don’t require immediate changes to how TikTok operates, and the process will take months as TikTok responds and regulators review the evidence. Parents shouldn’t expect platform changes in the short term, though the case signals how European regulators view the relationship between design features and harm to children.

Watch whether the Commission upholds its findings after TikTok’s response and whether other platforms face similar investigations. The core question is whether the EU will regulate design decisions that sit at the centre of advertising-driven business models, and whether mandated friction becomes Europe’s approach to child online safety.


Sources

European Commission, “Commission preliminarily finds TikTok’s addictive design in breach of the Digital Services Act,” February 6, 2026

Euronews, “TikTok’s ‘addictive’ design breaches EU law, European Commission says,” February 6, 2026

RTE, “EU accuses TikTok of creating ‘addictive design’,” February 6, 2026

Al Jazeera, “European Union says video app TikTok must change ‘addictive’ design,” February 7, 2026
