Two Years Later, Bereaved Families Share What Parents Need to Know
It’s been two years since the United Kingdom passed groundbreaking legislation to protect children online, and the mothers who lost their children are still fighting.
What’s happening: On October 26, 2023, the UK’s Online Safety Act became law, introducing some of the strictest online child protection measures in the world. Now, two years later, bereaved families who pushed for the legislation say there’s progress—but the “monster” of online harm remains far bigger than most people realize.
Why this matters to all parents: The UK is leading the world in regulating social media platforms for child safety. What works—and what doesn’t—in their approach will shape legislation globally, including potential US federal laws. American parents can learn from both the successes and the gaps British families are experiencing.
The bigger picture: This isn’t just about one country’s law. It’s about whether government regulation can actually make children safer online, or whether tech companies will always find ways around restrictions. The UK is the test case the whole world is watching.
Here’s what’s actually changed for British families, what bereaved mothers wish they’d known, and what parents everywhere can take from their hard-won lessons.
What Parents Need to Know
The Families Behind the Fight
Lisa Kenevan’s 13-year-old son Isaac died in 2022 after taking part in what is believed to have been a dangerous online challenge he encountered on TikTok. She’s now suing the platform as part of a group action with other British families who lost children in social media-related deaths.
Hollie Dance’s 12-year-old son Archie Battersbee died in August 2022. A coroner described his death as an accident during a “prank or experiment” that went wrong—content he likely encountered online.
Esther Ghey’s daughter Brianna, 16, was murdered by two teenagers who had viewed extreme content online. Brianna herself had been exposed to harmful material on social media platforms.
Together, these mothers—and others—have created the “Be Challenge Aware” campaign, visiting schools to warn children, parents, and teachers about online risks.
What the Online Safety Act Actually Does
The legislation brought in over 40 specific rules that platforms must follow:
Protection from dangerous challenges. Platforms must prevent children from accessing content showing or promoting dangerous viral challenges that could lead to injury or death.
Content restrictions. Tech companies must have measures to block children from seeing harmful material related to suicide, self-harm, eating disorders, violence, hate speech, and pornography.
Age verification. Since the Children’s Safety Codes launched in July 2025, major platforms have implemented age checks. All top ten UK pornography sites now require age verification.
Enforcement by Ofcom. The UK’s communications regulator can fine companies up to £18 million or 10% of global annual revenue—whichever is greater—for failing to comply.
What’s Actually Changed
According to Ofcom, the regulator enforcing the Act:
- Porn sites have age checks. All major adult sites now require age verification before access.
- Major platforms added protections. X (formerly Twitter), Reddit, Bluesky, Discord, and Grindr introduced age assurance to protect children.
- TikTok restricted content. The platform built technology that prevents users from accessing age-restricted content unless verified as over 18.
These are real improvements that didn’t exist two years ago.
What Hasn’t Changed Enough
Lisa Kenevan told reporters: “The monster we’re dealing with is still far bigger than most people realise.”
The law isn’t enough. While the Act has helped reduce some exposure to illegal content, tech moves faster than regulation. New platforms, features, and workarounds emerge constantly.
Enforcement is slow. It took until July 2025—nearly two years after the law passed—for the Children’s Safety Codes to take effect. By then, hundreds more children had been harmed.
Kids find ways around it. VPNs, fake birthdates, and other workarounds remain common. Age verification isn’t foolproof.
Algorithm problems persist. Even with restrictions, algorithms still surface harmful content through recommendations and search functions.
What Other Parents Are Doing
British families are taking matters into their own hands:
School phone bans are gaining traction. Following Esther Ghey’s “Phone Free Education” campaign, more schools are implementing complete phone bans during school hours—not just classroom bans.
Contract-based approaches. Many families are creating written agreements with their teens about online use, with specific consequences for accessing restricted content or using workarounds.
Community education. Parent groups are organizing talks at schools, sharing information about emerging challenges and apps, and creating support networks.
Digital detox weekends. Some families are establishing regular periods where all devices go away for everyone—parents included—to reset relationships with technology.
Direct platform engagement. Rather than hoping for regulation to work, some parents are directly contacting platforms when they find harmful content, creating paper trails of complaints.
How This Affects Your Family
Even If You’re Not in the UK
American parents should pay attention because:
State laws are copying the UK model. Several US states are drafting or passing legislation inspired by the Online Safety Act. Understanding what works helps you advocate for effective laws.
Platforms make global changes. When major companies implement safety features for UK users, they often roll them out worldwide. You may already be benefiting from UK-inspired protections.
The challenges are universal. The online dangers British children face—challenges, harmful content, algorithm manipulation—are the same ones affecting American kids.
Lessons from Bereaved Families
These mothers have learned devastating lessons. They want other parents to know:
“It’s not about ‘screen time.’” The amount of time matters less than what children access during that time. A child can encounter life-threatening content in 30 seconds.
“Algorithms don’t care about your child.” Platforms optimise for engagement, not safety. If harmful content keeps users scrolling, the algorithm will serve more of it.
“Your child won’t tell you.” Most kids who encounter disturbing content don’t report it to parents, either because they’re embarrassed, don’t realise how serious it is, or fear losing device privileges.
“By the time you find out, it may be too late.” These mothers discovered what their children were viewing only after tragedy struck.
“Trust the law, but verify compliance.” Just because platforms say they’re following regulations doesn’t mean they are. Parents still need to be vigilant.
Warning Signs That Demand Immediate Action
Based on the bereaved families’ experiences, take it seriously if your child:
- Suddenly becomes secretive about online activity
- Shows interest in dangerous stunts or “challenges”
- Has unexplained injuries they won’t discuss
- Rapidly switches screens when you enter the room
- Shows signs of depression or withdrawal after device use
- Mentions online content that seems disturbing, then brushes off your concerns about it
Don’t wait. Talk to them immediately, check their devices if necessary, and contact professionals if you’re concerned.
Practical Steps for Non-UK Parents
Assume your child will encounter harmful content. The question isn’t if, but when and how they’ll respond.
Have the “dangerous challenge” conversation. Specifically discuss viral challenges, dares, and pranks. Explain that some challenges circulating online have killed children.
Create a “no judgment” reporting system. Tell your child they can show you anything disturbing without punishment. Fear of consequences keeps kids silent.
Know their passwords. For younger teens especially, parents should have access to check accounts periodically.
Use available safety features. Every major platform has parental controls and content filters. They’re imperfect, but use them anyway.
Join parent networks. Local parent groups often share information about dangerous trends before media catches on.
The Mothers’ Message to Other Parents
At a recent House of Commons reception, Lisa, Hollie, and Esther delivered a unified message: government action alone isn’t enough.
“We need parents who are informed, engaged, and willing to have difficult conversations,” Lisa said. “The law creates accountability for tech companies, but it can’t replace active parenting.”
Hollie added: “We visit schools and see the same thing everywhere—parents think their child is the exception, that dangerous content happens to other families. That’s what we thought too.”
Esther’s advice is direct: “Look at your child’s phone. Have uncomfortable conversations. Set boundaries even when they push back. We didn’t know what our children were seeing until it was too late. You still have time.”
Their Ongoing Advocacy
These families aren’t finished:
Be Challenge Aware campaign continues visiting schools, teaching thousands of students, parents, and teachers how to recognize online risks.
Lawsuit against TikTok progresses through the courts, with a fifth British family recently joining the group action.
Push for stronger enforcement. They’re lobbying for faster regulatory responses and heavier penalties for platforms that fail to comply.
Support for Phone Free Education. Backing Esther’s campaign for complete phone bans in schools.
What’s Next for the Online Safety Act
Ofcom is reviewing compliance and may strengthen regulations further. Potential upcoming changes include:
- Stricter age verification requirements
- Mandatory reporting of child safety incidents
- Algorithm audits to identify how harmful content spreads
- Greater transparency about platform compliance
- Increased fines for violations
But Lisa Kenevan warns: “No law is water-tight, and tech is moving at such a fast pace. The Act has helped, but we can’t become complacent.”
The Bottom Line
Two years after the UK’s groundbreaking Online Safety Act, children are safer in some ways—age checks on porn sites, restricted access to certain content, platform accountability. But they’re still vulnerable in others—algorithm manipulation, new challenges emerging constantly, and the reality that determined kids can circumvent most protections.
The bereaved mothers who fought for this law know better than anyone: legislation matters, but it’s not enough. Active parenting, open communication, and community awareness remain the strongest protections we have.
Learn from their loss. Don’t wait for tragedy to take online safety seriously.
Stay Informed Without the Overwhelm
Get the week’s most important digital parenting news delivered to your inbox every Thursday. No noise, just what matters to families.
Get Plugged In every Thursday →