Governments worldwide are rushing to regulate social media for children. Australia just passed a world-first ban for under-16s. Malaysia announced similar plans. Platforms like Roblox are implementing facial scanning.
A new study from the Center for Democracy and Technology asked 45 parents and teenagers what they want from social media safety features. The findings reveal a gap between what policymakers are mandating and what these families say they prefer.
But there’s a bigger question: if parents already have access to parental controls, why do so few of them use those controls?
What the 45 Families in the Study Said
Age Verification: The surveyed parents and teens viewed ID-based verification and facial scanning as “invasive and unreliable.” They preferred parent-centred approaches where parents declare their child’s age and consent to downloads.
Screen-Time Limits: Both groups valued reminders and parent-led limits but opposed app-enforced restrictions. Strict curfews were described as “too controlling” and “impractical.”
Algorithmic Feeds: Teenagers surveyed trusted algorithmic feeds and showed little interest in chronological alternatives. They preferred simple “not interested” buttons.
The Key Finding: Participants emphasised flexibility and respect for family dynamics. “Rigid, one-size-fits-all policies risk creating resistance, workarounds, or harm.”
But Here’s What the Research Doesn’t Address
The findings assume parents are engaged, informed, and actively managing their children’s technology use.
The evidence suggests otherwise.
Most parents already have access to parental controls. iPhone Screen Time has been available since 2018. Android Digital Wellbeing launched the same year. Every major platform offers parental oversight tools.
Yet parents aren’t using them:
- Parents approve “YouTube” without realising YouTube Shorts is hidden inside
- They have no idea their child spends three hours daily on Roblox
- They don’t know what Discord is or how Snapchat streaks work
- Many simply trust their child or don’t have time to investigate
- Their 12-year-old is often more tech-savvy than they are
When Roblox announced parental linking features over a year ago, most parents whose children use the platform had never heard of the feature.
The awareness gap is enormous. A parent might know their child uses social media. But do they know which apps specifically? How many hours per day? What content they’re consuming? Who they’re talking to?
For most families, the answer is no.
The CDT research asked 45 parents what controls they want. But it didn’t ask whether they’re currently using the controls they already have.
If parents aren’t using existing flexible tools, why would giving them more flexible tools solve the problem?
Why Governments Are Using Blunt Instruments
This explains why governments are reaching for rigid, blanket solutions.
Australia’s under-16 ban isn’t elegant or nuanced. But it also doesn’t require parents to understand age verification, know which apps their children use, set up parental controls, or stay informed about new platforms.
The ban simply says: platforms must keep under-16s off. Full stop.
Is it ideal? No. Does it respect family autonomy? No. But does it address the reality that most parents aren’t actively managing their children’s social media use? Yes.
Policymakers face a choice: Trust parents to use flexible tools (which evidence suggests they won’t) or mandate rigid rules that don’t require parental engagement.
They’re choosing option two because option one demonstrably isn’t working.
The Uncomfortable Truth
The CDT research reveals an important tension:
What the 45 surveyed parents said they want: Flexible, family-centred controls that respect their judgment.
What most parents actually do: Often nothing, because they’re overwhelmed, uninformed, or unaware of what’s happening on their child’s devices.
Both can be true. The parents surveyed may genuinely prefer flexibility. But broader evidence shows most parents aren’t managing their children’s technology use effectively.
This creates a dilemma for policymakers. If they provide flexible tools (which already exist), most parents don’t use them. If they mandate rigid rules, the engaged parents—like those who participated in this research—object to the loss of autonomy.
Perhaps the Real Question Is Different
Instead of debating flexible tools versus blanket bans, maybe we need to ask: how do we close the awareness gap?
What if the solution includes:
- Default protections that work without parental action (but can be adjusted by engaged parents)
- Mandatory digital literacy in schools for both children and parents
- Platform responsibility to surface problems to parents proactively
- Simple, unmissable controls rather than buried settings menus
- Regular check-ins that require parents to review what’s happening
This wouldn’t eliminate the need for regulation. But it might create conditions where family-centred approaches could actually work—because families would actually be aware and involved.
What This Means for Parents
If you’re reading this, you’re already more engaged than most. But ask yourself honestly:
- Do you know every app your child uses?
- Have you set up parental controls on all their devices?
- Do you review what they’re doing online regularly?
- Could you explain how YouTube Shorts differs from regular YouTube?
If the answer to any of these is no, you’re not alone—most parents can’t answer them either.
That’s exactly why governments are stepping in with blunt instruments. Because waiting for all parents to become informed and engaged isn’t a strategy—it’s hope.
The CDT research is valuable because it shows what a small group of engaged families want. But it doesn’t address how to engage the vast majority of families who aren’t paying attention.
Until we solve that problem, expect more blanket bans, mandatory age verification, and rigid rules that override parental judgment.
Not because policymakers don’t value family autonomy. But because family-centred approaches only work when families are actually involved—and most aren’t.
Source: Center for Democracy and Technology, “What Kids and Parents Want: Policy Insights for Social Media Safety Features,” November 19, 2025.