On January 1, 2026, Virginia became the first US state to impose daily time limits on children’s social media use. Under a new law, platforms like Instagram, TikTok, Snapchat and YouTube must automatically restrict users under 16 to just one hour per day—unless parents step in to adjust those limits.
It’s a different approach from the outright bans grabbing headlines in Australia and Europe. Rather than blocking children entirely, Virginia is asking: what if we just turned down the volume?
How the Law Works
The law amends Virginia’s Consumer Data Protection Act and places the burden squarely on social media companies. Here’s what they’re required to do:
Age verification: Platforms must use “commercially reasonable methods” to determine whether users are under 16. This includes neutral age screening tools; crucially, if a user’s device signals they’re a minor through browser settings or parental control software, platforms must treat them as such.
Default one-hour limit: Once identified as under-16, users are automatically capped at 60 minutes per day, per platform. That means an hour on Instagram, an hour on TikTok, an hour on YouTube—the limits don’t aggregate across apps.
Parental override: Parents can grant “verifiable consent” to increase or decrease the daily limit. The law doesn’t specify exactly how platforms must verify this consent, but it’s clear parents retain final say.
No retaliation: Platforms can’t degrade service quality, increase prices, or withhold features from users subject to the time limit.
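Taken together, the rules above amount to a simple decision procedure. A minimal sketch in Python follows; the class and function names are hypothetical, since the statute doesn't prescribe any particular implementation:

```python
from dataclasses import dataclass
from typing import Optional

DEFAULT_DAILY_LIMIT_MINUTES = 60  # statutory default for under-16 users

@dataclass
class User:
    verified_minor: bool            # result of the platform's own "commercially reasonable" age check
    device_signals_minor: bool      # e.g. a browser setting or parental-control flag
    parental_override_minutes: Optional[int] = None  # set via verifiable parental consent

def daily_limit_minutes(user: User) -> Optional[int]:
    """Return the per-platform daily limit in minutes, or None if unrestricted."""
    # The law requires treating a user as under 16 if either the platform's
    # own screening or a device-level signal says so.
    is_minor = user.verified_minor or user.device_signals_minor
    if not is_minor:
        return None  # adults: no time limit
    if user.parental_override_minutes is not None:
        # Parents may raise or lower the default via verifiable consent.
        return user.parental_override_minutes
    return DEFAULT_DAILY_LIMIT_MINUTES

# A 15-year-old with no parental override gets the statutory default:
print(daily_limit_minutes(User(verified_minor=True, device_signals_minor=False)))  # prints 60
```

Note that the limit applies per platform, so each service would run this check independently; the statute does not require apps to pool usage data to enforce an aggregate cap.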
What’s Covered (and What Isn’t)
The law targets social media platforms where users create profiles, connect with others, and view content feeds. It specifically covers major apps like Facebook, Instagram, TikTok, Snapchat, Reddit, Pinterest, X (formerly Twitter), and YouTube.
It does not apply to messaging or email services. WhatsApp, Signal, Gmail and similar communication tools remain unrestricted, recognising that families rely on these for basic coordination.
Why Virginia Took This Approach
Virginia State Senator Schuyler VanValkenburg, who co-sponsored the bill, framed the law as addressing “addiction” to social media. “When we get hooked on social media, it’s almost like a drug addiction, you just have to have more,” he told local media.
The approach reflects growing concern about what happens when children spend hours each day on platforms designed—through notifications, infinite scroll, and algorithmic recommendations—to maximise engagement. VanValkenburg referenced the US Surgeon General’s 2023 advisory, which found adolescents spending more than three hours daily on social media faced double the risk of depression and anxiety symptoms.
But rather than banning access entirely, Virginia opted for what supporters call “guardrails.” The thinking: let parents decide what’s right for their children, but give them better tools to enforce boundaries.
It’s worth noting what didn’t make it into the final law. An earlier version would have banned “addictive algorithmic feeds” for all users under 18 and extended time limits to that age group. Those provisions were removed after pushback, helping the bill gain broader support.
The Enforcement Question
The law falls under Virginia’s Consumer Protection Act, giving the Attorney General authority to impose civil penalties. But how this actually works in practice remains unclear.
Some platforms already offer screen time features—Instagram’s “Take a Break” reminders, TikTok’s 60-minute daily limits. But those are opt-in. Virginia makes time limits mandatory and default.
Key questions: Will platforms build Virginia-specific features or extend limits nationally? Will age verification actually work, or will teenagers lie about birthdates? Senator VanValkenburg acknowledged imperfect compliance is expected: “we don’t want perfect to be the enemy of the good.”
The Legal Challenge
The law is already facing a constitutional test. NetChoice, a trade association representing tech companies including Meta and Google, filed suit in November 2025, claiming the time limits violate minors’ First Amendment rights.
“The Constitution leaves the power to decide what speech is appropriate for minors where it belongs: with their parents,” the lawsuit argues.
It’s a familiar argument. NetChoice has successfully challenged social media laws in Arkansas, California, Florida, Georgia, Louisiana, Mississippi, Ohio, Tennessee, Texas and Utah on similar First Amendment grounds.
But Virginia may have learned from those defeats. The Supreme Court’s June 2025 decision in Free Speech Coalition v. Paxton upheld a Texas law requiring age verification for adult websites, finding it constitutional under intermediate scrutiny. That precedent could help Virginia’s law survive—though critics argue there’s a difference between restricting access to pornography versus mainstream social platforms.
What Parents Are Saying
Virginia families have mixed reactions.
Megan Cappella supports limiting social media but was shocked to discover her 13-year-old’s total daily screen time—across all apps—exceeded 11 hours. Suddenly a one-hour social media limit seemed less draconian.
Neil Goldsmith questions whether government mandates are the answer. “If Alex was 15, he’d still have my permission,” he said, emphasising that parental involvement matters more than arbitrary limits.
Others worry about legitimate uses: teenagers using social media for school projects, creative work, or staying connected with long-distance family and friends.
A Middle Path—Or a False Compromise?
Virginia’s approach represents a middle ground between doing nothing and imposing outright bans. It acknowledges parents’ concerns about excessive screen time while preserving some degree of access and parental autonomy.
But it also raises fundamental questions about who should decide how children spend their time online. Is one hour too restrictive, or too permissive? Should these decisions be made by state legislatures, tech companies, or families?
Critics like Amelia Vance from the Public Interest Privacy Center argue that usage limits miss the point entirely. “The approach that is most likely going to be effective here is the one that says, ‘OK, kids are going to be online. Let’s give them special protections when they are,’” she told Education Week.
In other words: rather than controlling how much time children spend on platforms, perhaps we should be regulating what platforms do with children’s data, how they design addictive features, and how they moderate harmful content.
What Happens Next
Virginia’s law is now in effect, though active enforcement likely won’t begin until legal challenges are resolved. Other states are watching closely.
If the law survives court scrutiny, it could become a template for similar legislation nationwide. If it’s struck down, states may need to find yet another approach to addressing their concerns about children and social media.
For now, Virginia has done what no other US state has managed: actually implement a law that changes how social media platforms operate for children. Whether it works—whether it reduces harm, whether families find it helpful, whether platforms comply—remains to be seen.
One thing is certain: the debate over children’s social media use isn’t ending. It’s just entering a new phase, one hour at a time.
What do you think? Is Virginia’s one-hour limit the right balance between protection and autonomy? Or does it miss the point entirely? The conversation is just beginning.
Sources:
- Virginia Code § 59.1-577.1 (Full Text) – Official Virginia law text
- NetChoice v. Miyares lawsuit page – NetChoice official case information
- NetChoice Complaint (PDF) – Full legal filing (November 17, 2025)
- Pilot Online: Virginia social media limits lawsuit (November 18, 2025)
- Biometric Update: New Virginia law tests time limit approach (December 2025)
- Woods Rogers: Virginia amends data privacy law
- Hunton: Breaking down novel Va. social media law for minors
- WSET: How Virginia’s new social media limits could impact…