Ofcom Ramps Up Online Safety Act Enforcement as Children's Protections Take Effect
The UK's online safety regulator Ofcom is intensifying its enforcement of the Online Safety Act in 2026, with powers to fine tech giants up to 10% of their qualifying worldwide revenue for failing to protect children and other users from harmful content.
The Online Safety Act, which received Royal Assent in October 2023, is now entering its most consequential phase of implementation. Ofcom has confirmed it will pursue enforcement action against platforms that fail to meet their legal duties, with penalties for non-compliance of up to £18 million or 10% of a company's qualifying worldwide revenue -- whichever is greater.
Key Milestones in 2026
A super-complaints regime came into force in February 2026, allowing eligible organisations to bring systemic online safety issues directly to Ofcom's attention. This mechanism is expected to accelerate regulatory action on widespread harms, particularly those affecting children and young people.
Ofcom is also due to publish its categorisation register in July 2026, which will assign online services to risk categories and trigger additional duties for the largest and most harmful platforms. Category 1 services -- the highest-risk tier -- will face additional requirements, including offering users identity verification options and enhanced content moderation.
Children's Safety at the Centre
Protecting children from harmful content remains Ofcom's stated top priority. The regulator has published detailed codes of practice requiring platforms to implement age assurance measures, restrict children's access to harmful material, and report on how their services are used by under-18s. A statutory report on content harmful to children is expected in October 2026.
The Children's Commissioner for England has welcomed the new youth advisory board established to help shape online safety policy, ensuring young people's voices are heard in the regulatory process.
Industry Criticism and Challenges
Despite the progress, critics have raised concerns about the pace of change. The Online Safety Act Network has argued that Ofcom's implementation does not adequately address the Act's "safety by design" objective, particularly regarding risks to children from livestreaming. The Internet Watch Foundation has also expressed concern that a "technically feasible" qualification in the codes could allow platforms to avoid removing illegal content.
Smaller but high-risk platforms have also drawn scrutiny, with questions raised about whether the threshold conditions for categorisation will allow some dangerous services to escape the most stringent requirements.
What's Next
Ofcom plans to issue media literacy recommendations to in-scope providers in spring 2026 and will release an age assurance statutory report in summer 2026. The regulator has made clear that enforcement will be ongoing and that no platform -- regardless of size or origin -- is exempt from UK law if it has a significant number of UK users.
Full details of Ofcom's implementation roadmap are available on Ofcom's website.