Bluesky CEO: Age Verification Laws Entrench Big Tech Dominance

As governments worldwide implement age restrictions for social media, Bluesky CEO Jay Graber is raising concerns about an unintended consequence: the regulations designed to protect children from harmful platforms might eliminate the competition needed to create better alternatives.

The Argument

Graber’s concern centres on compliance costs. When regulations require age verification systems, content filtering, parental control dashboards, and extensive legal documentation, those costs don’t scale with user numbers.

Meta employs thousands of people across trust and safety, legal compliance, and policy teams. Its annual revenue exceeds $130 billion. Implementing age verification, even at a cost of tens of millions of dollars, is expensive but manageable.

Bluesky operates with dozens of staff and runs on venture funding, with no revenue yet. The same compliance requirements that Meta can absorb could make it impossible for Bluesky to operate at all.

Fixed Costs vs. Scalable Costs

The problem is the nature of regulatory compliance costs.

Scalable costs grow with your user base. Servers, bandwidth, customer support—these scale up as you get more users and generate more revenue. Startups can manage these because they start small and grow gradually.

Fixed costs don’t scale. Whether you have 10,000 users or 10 million, you need:

  • Legal teams to interpret regulations across multiple jurisdictions
  • Technical infrastructure for age verification systems
  • Trust and safety staff to handle compliance
  • Documentation proving you’re following the rules
  • Systems for enforcement when verification fails

Meta spreads these fixed costs across 3 billion users. A startup with 500,000 users cannot.
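The arithmetic behind this point is simple to sketch. The figures below are purely illustrative (the article does not cite an actual compliance budget; the $50 million annual cost is an assumption for demonstration), but they show how a fixed cost that is trivial per user at Meta's scale becomes prohibitive at startup scale:

```python
# Illustrative only: hypothetical figures, not actual compliance budgets.
FIXED_COMPLIANCE_COST = 50_000_000  # assumed annual fixed cost, USD


def cost_per_user(users: int) -> float:
    """Spread the fixed compliance cost across the user base."""
    return FIXED_COMPLIANCE_COST / users


incumbent = cost_per_user(3_000_000_000)  # Meta-scale platform
startup = cost_per_user(500_000)          # small entrant

print(f"Incumbent: ${incumbent:.2f} per user per year")
print(f"Startup:   ${startup:.2f} per user per year")
```

Under these assumed numbers, the incumbent pays roughly two cents per user per year while the startup pays $100 per user per year, a 6,000-fold difference for the same regulatory obligation.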

The Competition Problem

Graber argues that competition drove Meta to improve. For years, users and regulators requested chronological feeds. Meta ignored them. Then users started migrating to competitors. Suddenly, Meta added chronological feeds.

“If only giants can afford to operate,” Graber said, “competitive pressure disappears.”

When users can’t switch to alternatives, platforms have less incentive to respond to concerns. If every new entrant faces compliance costs that make operation impossible, users stay on the platforms causing the problems that regulations are trying to solve.

The Irony

We’re implementing safety regulations specifically to protect children from platforms that have been optimised purely for engagement regardless of harm.

But if compliance costs make alternatives impossible, children remain stuck on exactly those platforms.

The regulations might make Meta slightly safer while ensuring Meta faces no meaningful competition.

The Counter-Argument

Critics of Graber’s position argue:

Child safety is more important than startup economics. If compliance costs are the price of protecting children, that’s an acceptable tradeoff.

Startups can raise money for compliance. If an alternative platform is genuinely better, investors will fund the compliance costs required to operate.

Large platforms caused the problem. It’s appropriate that regulations favour established players who can absorb costs, because those players created the harms being addressed.

Tiered compliance could work. Regulations could require less from platforms under certain size thresholds, allowing startups to grow before facing full compliance burdens.

What This Means in Practice

Australia’s under-16 ban is already being implemented. France is fast-tracking an under-15 ban for September. The UK House of Lords voted for age verification within one year. The U.S. Congress is advancing KOSMA.

These aren’t theoretical regulations. They’re becoming reality across multiple jurisdictions simultaneously.

For platforms, this means:

  • Large platforms will comply. Meta, TikTok, Snap, and YouTube can absorb the costs. They’ll implement age verification, remove accounts, and continue operating.
  • Small platforms face difficult choices. Comply (and potentially become financially unviable), exit markets with strict regulations, or fight the laws in court.
  • New entrants will be rare. Starting a social platform in 2026 means either having massive funding for compliance or avoiding markets with age restrictions entirely.

The Policy Question

Graber’s argument raises an uncomfortable question for policymakers.

If the goal is protecting children, and if competition drives platform improvement, does creating regulations that eliminate competition actually serve that goal?

Or does it lock in the dominance of the exact platforms we’re trying to regulate?

There’s no easy answer. Protecting children is urgent. But if the protection mechanisms ensure children remain on platforms optimised for engagement over wellbeing (because no alternatives can afford to exist), have we actually protected them?

What Parents Should Know

This debate matters for your family because it affects what choices you’ll have.

If Bluesky is right about compliance costs eliminating competition, your children will have fewer platform alternatives. The choice won’t be “which platform is safest” but “do we use Meta/TikTok/Snap or nothing.”

If regulators are right that safety requires strict compliance regardless of competition effects, your children will have better protection on the platforms that survive, even if there are fewer of them.

The tradeoff is real: comprehensive protection that favours incumbents vs. lighter regulation that permits competition.

