This is one of three major child safety developments this week:
- Italian Families Sue Meta & TikTok ← You’re here
- Kentucky Sues Roblox Over Child Predators
- Australia Issues Urgent Warning: Kids Exposed to Graphic Violence
Yesterday, a group of Italian families did what many parents have wanted to do for years: they sued Facebook, Instagram, and TikTok for failing to protect their children.
The lawsuit, filed in a Milan court on October 7, 2025, accuses Meta (Facebook and Instagram) and TikTok of deliberately ignoring age restrictions and using addictive features that harm children’s mental health. But this isn’t just another lawsuit – it’s the first major European legal action to challenge both how platforms verify children’s ages AND how their algorithms target young minds.
What Happened
A coalition of Italian families took legal action against three of the world’s biggest social media platforms, claiming they’ve created a system designed to attract children while pretending to keep them out.
The lawsuit asks the Milan court to require these platforms to:
- Implement stronger age-verification systems for users under 14
- Remove manipulative algorithms designed to keep children scrolling
- Comply with Italian law requiring parental consent for children under 14
The families estimate that more than three million of the 90 million Facebook, Instagram, and TikTok accounts in Italy belong to children under 14 – all using platforms that officially ban users that young.
Why Age Verification Matters (And Why It’s Failing)
Italian law is clear: children under 14 cannot use social media without parental consent. But the platforms’ current age verification system is essentially “just trust us” – a birth date entered during signup with no verification.
The families argue that Meta and TikTok know this system doesn’t work. They know millions of underage children are using their platforms. And they continue to serve those children the same addictive algorithmic feeds designed for adults.
Here’s what that looks like in practice:
- A 12-year-old creates an account claiming to be 18
- The platform accepts this without question
- The algorithm immediately begins learning the child’s preferences
- Within minutes, the feed is personalized to keep that child scrolling
- The platform profits from the child’s attention and data
The lawsuit claims this isn’t a bug – it’s the business model.
The Algorithm Problem
This lawsuit goes beyond just age verification. It challenges the fundamental way these platforms work: algorithmic feeds designed to maximize engagement.
What are algorithmic feeds? Instead of showing posts in chronological order from people you follow, these feeds use artificial intelligence to predict what will keep you scrolling. They track every click, every pause, every interaction – then serve more of what keeps you engaged longest.
For children, this creates several problems:
- Endless scrolling: The feed never ends, making it hard to put the device down
- Emotional manipulation: Algorithms learn to serve content that triggers strong emotions
- Echo chambers: Children see increasingly extreme versions of topics they’ve shown interest in
- Mental health impacts: Studies link heavy social media use to increased anxiety and depression in children
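To make the difference concrete, here is a minimal illustrative sketch (not any platform's actual code, and with made-up post data): a chronological feed sorts by recency, while an engagement-ranked feed sorts by whatever a model predicts will hold attention longest.

```python
# Toy comparison of two feed-ranking strategies. The posts and the
# "predicted_engagement" scores are invented for illustration only.
posts = [
    {"id": "A", "posted_min_ago": 5,  "predicted_engagement": 0.2},
    {"id": "B", "posted_min_ago": 90, "predicted_engagement": 0.9},
    {"id": "C", "posted_min_ago": 30, "predicted_engagement": 0.6},
]

# Chronological: newest first, regardless of how "sticky" the content is.
chronological = sorted(posts, key=lambda p: p["posted_min_ago"])

# Engagement-ranked: the post predicted to hold attention longest goes
# on top -- the pattern the lawsuit objects to.
engagement_ranked = sorted(posts, key=lambda p: -p["predicted_engagement"])

print([p["id"] for p in chronological])      # ['A', 'C', 'B']
print([p["id"] for p in engagement_ranked])  # ['B', 'C', 'A']
```

Note how the engagement-ranked feed buries the newest post in favor of the one most likely to keep the viewer watching: age never enters the calculation.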
The Italian families argue that serving these addictive feeds to children – especially children the platforms know shouldn’t be there – constitutes harm.
What Other Countries Are Watching
Italy isn’t alone in questioning how social media platforms treat children. This lawsuit comes as:
- Australia moves forward with a social media ban for children under 16
- The UK implements the Online Safety Act with strict child protection measures
- Multiple US states pass laws restricting algorithmic feeds for minors
- The EU enforces the Digital Services Act with enhanced child safety provisions
If the Italian families succeed, it could set a precedent for how European courts view platform responsibility for child safety. Other countries are watching closely – similar parent-led lawsuits are being considered across Europe.
What This Means for Your Family
Even if you’re not in Italy, this lawsuit reveals truths about how these platforms work everywhere:
The platforms know children are using their services: Despite age restrictions, Meta and TikTok are aware that millions of underage users access their platforms daily. The lawsuit argues they profit from this knowledge rather than fix it.
Age verification is deliberately weak: Current systems are easy to bypass. The lawsuit argues that platforms could implement stronger verification (such as ID checks or verified parental consent) but choose not to because it would shrink their user base.
Algorithms don’t care about age: Once a child lies about their age, they’re treated like an adult user – served the same addictive feeds, the same targeted content, the same algorithm designed to maximize engagement.
Your child’s “age” on the platform might be wrong: Many children created accounts years ago claiming to be older. Even if they’re now old enough for the platform, their profile may still list an adult birth date – so teen safety features never apply to them.
What Parents Can Do
While this lawsuit moves through the courts, parents can take action now:
Check your child’s account age:
- Go into account settings and verify the birth date listed
- If it’s wrong, update it (though this may limit features)
- Understand that teen accounts have different protections than adult accounts
Understand the algorithm is always watching:
- Every like, comment, and pause teaches the algorithm
- Brief interactions with concerning content can trigger more of it
- The “For You” or “Recommended” feed is where the algorithm works
Use available parental controls:
- Meta’s Family Center and TikTok’s Family Pairing offer some oversight
- Be aware these tools have limitations (as our recent report showed)
- They’re a starting point, not a complete solution
Have honest conversations:
- Ask your child what they see in their feed
- Scroll through together (without judgment)
- Talk about how algorithms work and why platforms want them scrolling
Consider whether younger children should use these platforms at all:
- The lawsuit highlights that these platforms aren’t designed for children
- Even with protections, the core product is built to be addictive
- Delaying access might be the safest choice for younger kids
The Bigger Picture
This lawsuit is part of a growing global movement of parents who refuse to accept “that’s just how social media works” as an answer.
For years, platforms have argued they’re doing enough to protect children while simultaneously designing products to maximize engagement regardless of age. The Italian families are challenging that contradiction head-on.
Whether they win or lose, this case sends a clear message: parents are no longer willing to let tech platforms experiment on their children in the name of profit.
The outcome could reshape how social media works for children worldwide.
Related Reading
- Kentucky Sues Roblox Over Child Predators and Safety Failures
- Australia Issues Urgent Warning: 22% of Kids Exposed to Graphic Violence Online
- Instagram Parental Controls Don’t Work: New Report Exposes Failures
What are your thoughts on this lawsuit? Should platforms be held responsible for enforcing their own age restrictions? Let us know in the comments below.
Want to stay informed about child safety developments?
Get Plugged In – what every parent in today’s digital world needs to know. Delivered free every Thursday.
www.wired-parents.com/subscribe
Sources: Reuters



