Roblox Now Requires Facial Scans to Chat

If your child plays Roblox, they’ve likely been prompted to scan their face or upload an ID to continue chatting with friends. Starting January 7, 2026, Roblox became the first major gaming platform to require age verification for all users to access chat features. The system uses facial recognition to estimate age and groups users into age brackets to prevent adults from messaging children.

The policy addresses real problems. At least 30 people have been arrested since 2018 in the United States for grooming children they met on Roblox. But the solution raises questions about privacy, effectiveness, and whether age-segregated chat rooms make children safer or create what critics call “a menu of age ranges” for predators.

Whether you allow your child to complete the verification or disable chat entirely is your decision. Here’s what you need to know to make it.


What Happened

Mandatory Facial Verification Rolls Out Globally: Roblox announced on January 7, 2026, that all users worldwide must complete an age check to access chat features. The requirement began in Australia, New Zealand, and the Netherlands on December 1, 2025, and expanded to the United States and remaining regions throughout the first week of January.

Users have two verification options:

  • Facial age estimation: Scan their face with the Roblox app camera. The system estimates their age and assigns them to an age group.
  • ID verification: Upload a government-issued ID to confirm their exact age (available only for users 13+).

Matt Kaufman, Roblox Chief Safety Officer, said: “By building proactive, age-based barriers, we can empower users to create and connect in ways that are both safe and appropriate.”

How Facial Verification Works: Users open the Roblox app camera and move their head left and right. The footage is processed by Persona, a third-party vendor, which estimates age and assigns users to one of four groups: under 9, 9-12, 13-15, or 16+. Images are deleted immediately after processing.

Users can only chat with their age group and the groups immediately adjacent. For example, 9-12 year-olds can chat with under-9s, other 9-12s, and 13-15s. Adults (16+) cannot message children under 13 unless they’re “Trusted Connections.”

For children under 9, chat is disabled by default unless a parent verifies and grants permission.
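The bracket rules described above amount to a simple adjacency check over ordered age groups. Here is a minimal sketch of that logic; the function name `can_chat`, the bracket labels, and the `trusted_connection` flag are illustrative assumptions based on this article's description, not Roblox's actual implementation.

```python
# Ordered age brackets as described in Roblox's announcement.
BRACKETS = ["under_9", "9_12", "13_15", "16_plus"]

def can_chat(bracket_a: str, bracket_b: str, trusted_connection: bool = False) -> bool:
    """Return True if two verified users may chat under the adjacency rule.

    Users can chat within their own bracket or an immediately adjacent one.
    Non-adjacent pairs (e.g. 16+ and 9-12) are blocked unless the pair are
    "Trusted Connections". Note: under-9 users additionally need explicit
    parental permission before chat is enabled at all (not modeled here).
    """
    i, j = BRACKETS.index(bracket_a), BRACKETS.index(bracket_b)
    if abs(i - j) <= 1:
        return True
    return trusted_connection

# A 9-12 user can reach under-9, other 9-12, and 13-15 users,
# but a 16+ user cannot reach a 9-12 user without a Trusted Connection.
```

This also makes the critics' "menu of age ranges" concern concrete: a misclassified adult placed in the 13-15 bracket would pass the adjacency check against 9-12 users.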

Adoption Rates: In Australia, New Zealand, and the Netherlands, where the system has been mandatory since early December, over 50% of daily active users have completed verification. Roblox expects similar adoption globally.

Tens of millions of users have already completed the verification process. Users who don’t verify can still play Roblox games but cannot access any chat features.


Why Roblox Implemented This Now

Legal and regulatory pressure: Roblox faces at least 35 lawsuits alleging the platform enables child exploitation. The Florida Attorney General issued criminal subpoenas in October 2025. In November 2025, a federal court sentenced a former teacher to life in prison for grooming children on Roblox. At least six people were arrested in 2025 alone for grooming-related crimes.

New York Governor Kathy Hochul specifically named Roblox on January 5, 2026, when backing the Stop Online Predators Act. The facial verification rollout lets Roblox demonstrate action before regulators impose more restrictive requirements.


What It Means for Your Family’s Decisions

If your child currently uses Roblox chat: Your child will be prompted to scan their face or upload an ID. You need to decide whether to allow it or disable chat entirely.

Questions to consider: Do you trust Persona to process your child’s facial images, even if they’re deleted immediately? Does your child need Roblox chat, or would voice chat outside the platform work instead? Is your child mature enough to navigate age-appropriate chat?

If you’re uncomfortable with facial scans, disabling chat is an option. Your child can still play games, just not communicate through Roblox’s chat system.

If your child is under 9: You must verify yourself as an adult and grant explicit permission. This gives you control over whether children this young should chat with strangers in games.

Privacy and effectiveness concerns: Two issues stand out:

Accuracy: Facial age estimation can misclassify ages. If a 30-year-old is estimated as 13-15, they gain chat access to actual children. AI-generated images might bypass the system entirely.

False confidence: If parents believe facial verification makes Roblox completely safe, they may relax supervision. The system is imperfect, and vulnerabilities could expose children while parents assume protections are working.

Roblox maintains age verification is one component of multilayered safety, including content moderation and reporting tools. The company argues imperfect protection beats no protection.


What This Doesn’t Resolve

Voice chat: Roblox’s facial verification only affects text chat. Voice chat isn’t filtered or monitored in real time. Users 13+ can enable it after age verification.

Off-platform grooming: Predators often use Roblox to make initial contact, then move to Discord, Snapchat, or WhatsApp where moderation is minimal. Once contact moves off-platform, Roblox has no control.



What Happens Next

Other platforms may follow: If the policy shows measurable safety improvements without significant user exodus, expect Minecraft, Fortnite, and other gaming platforms to implement similar systems. If Roblox faces backlash or the system proves ineffective, others may avoid it.

Regulatory scrutiny continues: New York’s Stop Online Predators Act specifically targets Roblox. Florida’s investigation continues. If grooming incidents continue despite age verification, expect more restrictive regulations.

Parents can adjust settings: Roblox provides parental controls beyond facial verification. You can disable chat entirely, restrict it to friends only, disable voice chat, restrict games by maturity ratings, and monitor activity. The tools exist; the question is whether you know they’re available.


When a platform with over 100 million daily users, roughly 40% of them children, implements mandatory facial scans, that creates privacy and safety questions that don’t have clear answers. Whether facial age estimation actually protects children or just creates the appearance of protection while introducing new risks is something even experts disagree about. The prompt on your child’s screen asking them to scan their face requires a decision from you, and neither “yes” nor “no” is obviously correct.

