EU Accuses TikTok of Addictive Design That Harms Children

The European Commission found TikTok in preliminary breach of the Digital Services Act on February 6, 2026, for using addictive design features that harm children and vulnerable adults. The preliminary findings target infinite scroll, autoplay, push notifications, and TikTok’s highly personalised recommender system, which regulators say put users’ brains on “autopilot” and encourage compulsive behaviour. TikTok could face fines of up to 6% of its global annual turnover if the findings are upheld; the company calls the accusations “categorically false and entirely meritless.”

This represents a different approach from countries banning social media for under-16s. Instead of restricting access by age, the European Commission is targeting how platforms are built, arguing that certain design features should be disabled or redesigned to reduce harm.


What the Commission Found

The investigation examined TikTok’s internal risk assessments, company data, and scientific research on behavioural addiction. Regulators concluded that TikTok failed to adequately assess how its addictive features could harm users’ physical and mental wellbeing. By constantly rewarding users with new content, certain design features fuel the urge to keep scrolling and shift users into autopilot mode; scientific research indicates this may lead to compulsive behaviour and reduced self-control.

TikTok disregarded important indicators of compulsive use, including the time minors spend on the app at night, the frequency with which users open the app, and other potential addiction signals. A French parliamentary report found 8% of 12-15 year-olds spending more than five hours daily on TikTok; a Danish study found users as young as eight using it for roughly two hours daily on average; and a Polish study identified TikTok as the platform most used after midnight by 13-18 year-olds.


Why Current Safeguards Don’t Work

The Commission found that TikTok’s existing tools don’t effectively reduce the risks from addictive design. The Daily Screen Time feature allows users to set limits and receive notifications when they are reached, and a one-hour limit is set automatically for users aged 13 to 17. But regulators said these safeguards are ineffective because the warnings are “easy to dismiss” and introduce so little friction that users bypass them within seconds.

Parental controls through the “Family Pairing” tool let parents customise safety settings, set screen-time limits, receive activity reports, and restrict search terms. The Commission said these limits fail because they “require additional time and skills from parents to introduce the controls,” meaning parents shoulder the burden of counteracting design features specifically engineered to maximise engagement.


What the Commission Wants Changed

Regulators want TikTok to disable or limit key addictive features such as infinite scroll and to implement effective screen-time breaks, including after midnight. The Commission says there should be mandatory screen-time limits and night-time lock-outs to prevent sleep deprivation. These changes would fundamentally alter how the platform works, rather than relying on optional tools users easily bypass.

The EU regulator specifically noted that combating behavioural addiction among minors is a mandatory component of risk assessment under the Digital Services Act. It found that TikTok ignored widespread evidence of excessive use, failed to properly assess these risks, and did not adequately mitigate mental health risks on its platform.


How This Differs from Bans

Countries like Australia, France, and Spain are implementing age-based bans that restrict who can access social media. The European Commission is targeting how platforms operate, arguing that design choices themselves create harm regardless of user age.

This approach questions whether engagement loops should be weakened by design through stronger screen-time breaks, reduced notification pressure, and changes to how recommendations amplify repeated viewing. If the Commission’s legal theory holds, it could signal a broader EU standard where certain engagement mechanics are acceptable only if platforms prove they’ve tested and implemented effective safeguards that reduce harm at scale.

Infinite feeds, autoplay, and algorithmic recommendations are common across platforms, which means TikTok’s case could set precedent for how other social media companies operate in Europe.


What Happens Next

These are preliminary findings, meaning no fines or penalties have been imposed yet. TikTok now has the right to review the Commission’s findings and respond in writing with its own solutions. The Commission will also consult the European Board for Digital Services, an independent advisory group that supports enforcement of the DSA.

A TikTok spokesperson said the Commission’s preliminary findings present “a categorically false and entirely meritless depiction of our platform, and we will take whatever steps are necessary to challenge these findings.” The company denies that its design features create harm and argues that its safeguards are effective.

If the Commission ultimately confirms non-compliance, the Digital Services Act allows for fines up to 6% of global annual turnover and can impose binding orders requiring product changes. This means the EU could force TikTok to re-engineer features that keep people scrolling, not just disclose how they work.


What Parents Should Know

The Commission’s investigation focuses on design features common across social media platforms. Whether regulators succeed in forcing changes to infinite scroll, autoplay, and algorithmic recommendations will determine whether this approach spreads beyond TikTok to other platforms children use.

The preliminary findings don’t require immediate changes to how TikTok operates, and the process will take months as TikTok responds and regulators review the evidence. Parents shouldn’t expect platform changes in the short term, though the case signals how European regulators view the relationship between design features and harm to children.

Watch whether the Commission upholds its findings after TikTok’s response and whether other platforms face similar investigations. The core question is whether the EU will regulate design decisions that sit at the centre of advertising-driven business models, and whether mandated friction becomes Europe’s approach to child online safety.


Sources

European Commission, “Commission preliminarily finds TikTok’s addictive design in breach of the Digital Services Act,” February 6, 2026

Euronews, “TikTok’s ‘addictive’ design breaches EU law, European Commission says,” February 6, 2026

RTE, “EU accuses TikTok of creating ‘addictive design’,” February 6, 2026

Al Jazeera, “European Union says video app TikTok must change ‘addictive’ design,” February 7, 2026
