The jury in the first social media addiction trial is now calculating damages. Here’s what that means for parents.
The jury in the KGM v. Meta & YouTube trial in Los Angeles appears to have moved past the question of whether the platforms are liable for harming a young user and is now working out how much they should pay. On 20 March 2026, jurors sent Judge Carolyn Kuhl a question about the section of the verdict form covering compensatory damages and punitive conduct. To reach that stage, enough jurors had to agree that at least one of the platforms was negligently or harmfully designed.
For the first time, a jury appears to have concluded that the way social media platforms are built can make a company legally responsible for harm to a child.
It is not a clean result yet
There is a complication. On 23 March, after more than a week of deliberations, the jury told the judge it was having difficulty reaching consensus on one of the two defendants. It did not say which. Judge Kuhl instructed the jurors to keep deliberating, warning that failure to agree would mean a retrial. The jury returns to court today, 24 March.
Plaintiff’s attorney Mark Lanier acknowledged the significance of the damages question but urged caution. A split outcome is possible: a verdict against one company and a hung jury on the other.
Why it matters beyond this one case
This trial was selected as a bellwether, meaning its outcome is designed to signal how juries view the core legal arguments in thousands of similar cases. More than 10,000 individual claims and nearly 800 school district lawsuits are pending across the United States, all making similar allegations about addictive platform design. A finding of liability here would put enormous pressure on Meta, YouTube and the wider industry to settle. Legal experts have drawn direct comparisons to the tobacco litigation of the 1990s.
What this means for you right now
This trial does not change what your child can access today. No platform features have been altered, and any changes forced by a verdict would take months or years to implement. But there are things worth doing now.
Check what your child is actually using, and how old they were when they started. The plaintiff in this case created an Instagram account at age nine with no meaningful age verification. If your child is on platforms they are technically too young for, assume the platforms have not built reliable barriers to stop that.
Look at the design features, not just the content. This case is specifically about autoplay, infinite scroll, notification timing and algorithmic recommendations. These are the features that keep children scrolling past the point they want to stop. Parental controls that filter content but leave these engagement mechanisms untouched only address half the problem.
Talk to your child about what addictive design means. Older children and teenagers are often receptive to understanding the mechanics behind why they find it hard to put their phone down. It is a more productive conversation than telling them social media is bad.
A verdict could come at any point this week. We will update when it does.
Sources: MLex — jury pondering damages