• Jul 25, 2025
  • 1 min read

UK Enforces Age Verification for Restricted Content Under Online Safety Act

From July 25, 2025, the UK’s media regulator Ofcom is enforcing the Online Safety Act 2023 for age-restricted and potentially harmful content.

Photo credit: NoMoreStock / Shutterstock.com

As of July 25, 2025, the UK’s media regulator Ofcom is enforcing the Online Safety Act 2023, marking a major shift in how age-restricted and potentially harmful online content is accessed.

From now on, individuals in the UK can no longer simply state that they are over 18 to access restricted material. Instead, platforms must introduce “highly effective” age-assurance systems to prevent minors from accessing pornography or content that promotes self-harm, suicide, or eating disorders.

The Act puts the onus on online content providers themselves, including social media companies, search services, and adult content sites, to verify age using approved age-assurance tools such as facial age estimation, credit card or bank checks, digital ID services, mobile operator authentication, or official photo ID scans.
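In practice, each platform wires one of these tools into its content flow, typically as a server-side gate in front of restricted pages. The TypeScript sketch below shows one plausible shape for such a gate, assuming a hypothetical third-party age-assurance provider; the endpoint, field names, and `isVerifiedAdult` helper are illustrative assumptions, not anything specified by the Act or by any particular vendor.

```typescript
// Minimal sketch of a server-side age gate. The provider endpoint and
// response fields are hypothetical; real deployments would integrate an
// Ofcom-recognised age-assurance vendor's actual API.

interface AgeCheckResult {
  verified: boolean; // provider's pass/fail decision for the 18+ check
  method: string;    // e.g. "facial_estimation", "photo_id", "bank_check"
}

// Hypothetical provider endpoint (assumption, not a real service).
const PROVIDER_URL = "https://age-assurance.example.com/v1/check";

async function isVerifiedAdult(sessionToken: string): Promise<boolean> {
  const res = await fetch(PROVIDER_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ token: sessionToken }),
  });
  if (!res.ok) return false; // fail closed: treat provider errors as unverified
  const result = (await res.json()) as AgeCheckResult;
  return result.verified;
}

// Serve restricted content only when the provider confirms the user is 18+;
// otherwise send the user into the verification flow.
async function serveRestrictedPage(sessionToken: string): Promise<string> {
  if (await isVerifiedAdult(sessionToken)) {
    return "<restricted content>";
  }
  return "Redirect: /verify-age";
}
```

One design point worth noting in any such gate: the check runs server-side and fails closed, since a self-declared “I am over 18” checkbox in the client is exactly what the Act rules out.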

Ofcom has also launched an enforcement program to monitor compliance and may levy fines of up to £18 million (approx. $24 million) or 10% of global turnover, whichever is higher; for a platform with £1 billion in annual turnover, for example, the latter cap would be £100 million.

Major websites including Reddit, X, Discord, and Bluesky are now implementing these measures in the UK. Although the Act aims to strengthen child protection online, digital rights groups such as the Open Rights Group warn of privacy risks, including a lack of regulation for age-assurance providers and the potential for data misuse.