Who should verify your child’s age — the app, or the phone?

There is a fight unfolding in US courts, statehouses and Congress about a question parents have not been asked but will be affected by. Should social media apps verify how old your child is? Or should the phone do it once, at the moment your child sets up the device, and pass that information to every app they download?

The answer determines whose job it is to keep children off platforms not designed for them. It determines whose privacy is at risk when verification goes wrong. And it determines what parents will have to do every time their child wants to use a new app.

What is being proposed

The two approaches are:

App-level verification is the system we have now. Each app asks a child their age when they sign up. The child enters whatever they want. The app trusts them, or applies modest checks like asking for ID or a video selfie. Every new app means another check, or another lie.

App-store-level verification moves the check to the start. When a parent or child sets up a new iPhone, iPad or Android device, they declare an age range. The operating system stores it. When the child tries to download an app — Instagram, TikTok, Roblox, Snapchat, anything — the app store passes the age signal to the app. The app then knows the user is, say, 13 to 15, and applies appropriate restrictions automatically. The child does not get to lie. There is no birthday box to fill in.

This is the approach Meta has been publicly pushing since 2025. It is also the approach behind a wave of US state laws and federal legislation, including a bill introduced in April 2026.

Where the laws stand right now

Texas passed the App Store Accountability Act in May 2025. It would have required Apple and Google to verify the age of every Texan downloading apps and obtain parental consent for under-18s. It was due to take effect on 1 January 2026. On 23 December 2025, a federal judge issued a preliminary injunction blocking the law on First Amendment grounds. The judge described it as the equivalent of requiring every bookstore to verify the age of every customer at the door. The Texas attorney general has appealed.

Utah passed a similar law in March 2025 and was due to be the first state to bring it into force on 6 May 2026. In March 2026, Utah amended its own law before it could take effect, removing the attorney general’s enforcement power and pushing the operational deadline to 6 May 2027. The trade association suing Utah voluntarily withdrew its case on 21 April 2026.

Louisiana’s App Store Accountability Act takes effect on 1 July 2026.

Alabama passed a similar law that takes effect on 1 January 2027.

California took a markedly different approach. In October 2025 it enacted the Digital Age Assurance Act, which takes effect on 1 January 2027. It does not require parental consent for downloads. Instead, it requires the operating system to collect an age range at device setup and pass it as a signal to apps, which then apply their own existing legal obligations. Big Tech supported it, while the other states’ laws are being challenged in court.

At the federal level, two bills are live. The App Store Accountability Act, sponsored by Senator Mike Lee and Representative John James, would create a national app-store-verification regime. The Parents Decide Act, introduced on 13 April 2026 by Representatives Josh Gottheimer and Elise Stefanik, would go further — requiring age verification at the operating system level for every device sold in the country, including for adults. Both bills are in committee.

Why this is a fight, not a consensus

Apple and Google do not want this responsibility.

Apple’s CEO, Tim Cook, met with the House Energy and Commerce Committee in December 2025 to lobby against the federal App Store Accountability Act. The company’s privacy head, Hilary Ware, wrote to the committee arguing that the law could threaten the privacy of every adult app store user by forcing them to surrender ID for the simple act of downloading an app. Apple’s analogy: the App Store is a mall, not a liquor store. Most apps do not have age requirements. Asking everyone to prove their age at the entrance is excessive.

Apple has also taken pre-emptive action. It launched the Declared Age Range API on iOS 26, iPadOS 26 and macOS 26, available worldwide. The system lets parents set an age range during device setup, and apps can request that signal — but only if the parent allows. Apple frames this as a privacy-first compromise. Meta and others say it does not go far enough.

Meta is on the other side. It owns Facebook, Instagram, WhatsApp and Threads, and it is the company most exposed to legal action from parents whose children are on platforms that were not safe for them. Meta has lobbied actively for app-store-level verification and in April 2025 helped found the Coalition for a Competitive Mobile Experience with Spotify and Match Group, with app-store-level age checks as a central campaign. Its analogy: the App Store is the liquor store, and the liquor store checks IDs.

Both analogies tell you something true about the speakers. Apple does not want the regulatory burden of being the ID checker for hundreds of millions of users. Meta does not want to be liable when an 11-year-old gets onto Instagram by lying.

Privacy advocates are split. Some, like the Electronic Frontier Foundation, warn that operating-system-level verification creates surveillance infrastructure for every adult in the country, not just children. Others argue the current system — where users hand over ID and selfies to dozens of separate apps — is already worse for privacy, and that centralising verification at the operating system level reduces total exposure.

What this means for you

The first thing to know is that this fight will affect you, even if your child is not on any of these platforms yet. If the federal Parents Decide Act passes, every adult in the United States will need to verify their age when setting up a new device. If state laws like Louisiana’s take effect without being struck down, your child will not be able to download a new app without your consent flowing through the app store. The infrastructure is being built around you.

The second thing to know is that the existing setup is already changing. Apple’s Declared Age Range API is already live on every iPhone running iOS 26 or later. If you have a child with their own device, you can already set their age range so apps that support the API receive an accurate signal. It is opt-in. Most parents have not been told it exists. Take five minutes this week to look at your child’s device settings under Screen Time and Family Sharing — the option is there if your child is signed into a Child Account through Family Sharing.

The third thing to know is that no system removes the need for the conversation. Even if app-store-level verification becomes universal, your child will still find ways around it. They will use older siblings’ devices, friends’ devices, or borrowed adult accounts. The technology is moving forward, but it is not a substitute for the slower, harder work of talking to your child about why some platforms are not for them yet.

What changes is who is responsible when the system fails. Right now, it is the apps. If the app-store model wins, it will be Apple and Google. If neither model wins fully, it will keep being the parents.

Sources: Computer & Communications Industry Association v. Paxton — preliminary injunction order, Western District of Texas, 23 December 2025

Utah HB 498 — App Store Accountability Act Amendments, signed 18 March 2026

California AB 1043 — Digital Age Assurance Act, signed 13 October 2025

H.R. 8250 — Parents Decide Act, introduced 13 April 2026

H.R. 3149 — App Store Accountability Act, federal companion bill

Apple Developer — Declared Age Range API documentation
