- Internal documents suggest TikTok knows its algorithm can hook teens in under 35 minutes.
- The addictive “habit moment” occurs after roughly 260 videos, often within a single session.
- Documents indicate that features aimed at youth safety were largely ineffective PR moves.
- Executives allegedly prioritised media image over real safety improvements.
- Lawsuit says TikTok’s algorithm previously down-ranked “unattractive” users and missed harmful content.
TikTok’s powerful algorithm can make teens dependent on the platform in under 35 minutes, according to internal documents exposed in a US lawsuit brought by 13 states. The documents show that TikTok refers to a “habit moment” that typically kicks in after a new user views approximately 260 short videos—something that can happen in a single 30-minute scroll session.
The material, inadvertently released by the Kentucky Attorney General’s office, also accuses TikTok of using safety features as little more than PR tactics. For instance, the app’s “60-minute screen time alert” for teens reportedly shaved off just 90 seconds from average daily use and was greenlit only if it didn’t affect TikTok’s core engagement metrics.
Even more concerning: internal discussions allegedly confirmed that the platform once altered its algorithm to suppress “unattractive” users, and that it has high “leakage rates”—failing to detect roughly 36% of content normalising paedophilia and 50% of content glorifying the sexual assault of minors. Moderation metrics, executives admitted, were misleading because they reflected only caught—not missed—harmful content.
Child safety advocates like the Molly Rose Foundation and NSPCC are calling for swift regulatory action under the UK’s new Online Safety Act. They argue TikTok’s model is knowingly exploitative of young users and that content moderation failures must be addressed as part of broader reform. TikTok denies wrongdoing and says it prioritises safety, but the revelations could put increasing pressure on Ofcom and other regulators to act decisively.
The Times -> Full article