What the Online Safety Act Means for Children
Starting 25 July 2025, a new set of child protection rules will apply to social media platforms, search engines, and gaming services as part of the Online Safety Act rollout.
Ofcom, the UK’s online safety regulator, says these rules aim to shield children from highly harmful content—including material related to suicide, self-harm, eating disorders and pornography.
The measures also target content promoting misogyny, violence, hate, abuse, cyberbullying, and risky online challenges.
Companies that want to continue operating in the UK must implement more than 40 specific safety measures. These include:
- Tweaking recommendation algorithms to keep harmful material out of children's feeds
- Using tougher age checks to verify whether users are under 18
- Rapidly removing harmful content and helping children who come across it
- Appointing a dedicated executive responsible for child safety, and reviewing safety processes annually
Non-compliance can lead to fines of up to £18 million or 10% of global turnover, whichever is greater, and in severe cases, criminal charges for company leaders.
Ofcom also has the power to seek court orders blocking access to non-compliant sites and apps in the UK.