Most research into children’s technology harms involves adults observing and measuring what’s happening to young people. A Drexel University study published in early April 2026 did something different: it went to where teenagers were already talking about their own experience. The team analysed more than 300 Reddit posts written by self-identified 13- to 17-year-olds describing their use of Character.AI. What they found was not ambiguous.
What the teenagers said
Researchers were looking for six established markers of behavioural addiction: salience (the app dominates your thinking), tolerance (you need more of it over time), withdrawal (you feel bad when you can’t use it), relapse (you keep going back after trying to stop), conflict (it causes problems in your life), and mood modification (you use it to change how you feel). All six were present — not in the researchers’ assessment, but in the teenagers’ own words.
The real-world consequences reported were concrete: sleep loss, falling grades, strained friendships and family relationships. One teenager wrote: “I want to have my normal brain back, where I can just deal with my emotions on my own.” That sentence describes precisely what the researchers were measuring.
Why this study is different
These teenagers posted about their Character.AI use voluntarily, in a space they thought of as their own. The researchers read what they were already saying. That matters because much of the technology debate is shaped by a real gap between what parents worry about and what teenagers acknowledge. On AI companion apps, that gap appears to be smaller than expected — teenagers are raising the alarm about themselves.
The study focused on Reddit posts, which means it captures teenagers engaged enough with the issue to write about it publicly. It doesn’t capture the broader population of users, who may not have noticed a problem. That limitation is worth knowing. It doesn’t change the core finding.
What to do if your child uses Character.AI or similar apps
Character.AI is the most prominent platform in this category but not the only one. Replika, Kindroid and a growing range of similar apps are built on the same premise: an AI companion you can talk to, build a relationship with, and personalise. They are designed to feel responsive and emotionally attuned. Those qualities are exactly what makes them effective — and exactly what the addiction literature has consistently identified as risk factors.
Three things are worth doing now. Ask about it directly and without alarm: “I read something about Character.AI — do you use it? What’s it like?” Teenagers who are already noticing the patterns the Drexel study describes may be relieved to be asked. Look at how the app fits into the rest of their day — is it being used alongside human connection or instead of it? And check whether the app has any built-in time limits or parental oversight features. Most don’t.
Sources:
Drexel University — Character.AI addiction study, April 2026
Cybernews — coverage of Drexel findings, April 2026