A new era for internet safety? The Online Safety Act 2023

After much debate and discussion, the Online Safety Act (the Act) received Royal Assent on 26 October 2023. The Government has described the legislation as “heralding a new era of internet safety and choice by placing world-first legal duties on social media platforms.”

Regulating those legal duties will fall to the media and telecommunications regulator, OFCOM. OFCOM has published a document, ‘Ofcom’s approach to implementing the Online Safety Act’, which sets out its timetable for implementing the Act, although that timetable may change depending on the timing of the next General Election.

The Act makes companies responsible for removing harmful online content. OFCOM’s role as the regulator will be to produce various codes of practice to which companies, such as social media platforms, will be required to adhere.

OFCOM will enforce the codes, and the risks to tech companies are potentially severe. OFCOM can issue fines of up to £18 million or 10% of a company’s global annual revenue, whichever is greater. The Government has said that fines could potentially reach billions of pounds. OFCOM is no stranger to handing out large fines: it previously fined BT £42 million for failing to pay compensation to competitors when Ethernet lines were not available. BT/Openreach subsequently apologised.

OFCOM plans to implement the Act in three phases, with phase one beginning almost immediately.

Phase one: Illegal content

On 9 November 2023, OFCOM will publish its first consultation, which will address illegal harms. The initial focus will be on codes and guidance relating to online dangers such as terrorism and fraud, as well as protecting children from sexual exploitation and abuse. The Act requires OFCOM to submit its Codes of Practice on illegal harms to the Secretary of State, and to publish the associated guidance, within 18 months of Royal Assent.

Phase two: Child safety, pornography and protecting women and girls

OFCOM will then turn its attention to protecting children online, including from pornography; content relating to suicide, self-harm and eating disorders; abusive content that targets or incites hatred; bullying; and content that depicts serious violence.

Assuming that Parliament approves OFCOM’s codes, OFCOM hopes to be able to investigate breaches and impose sanctions from summer 2025.

Phase three: Additional duties for categorised services

Phase three of OFCOM’s timetable will focus on the additional duties that apply to ‘categorised services’, which include transparency reporting, user empowerment, fraudulent advertising and user rights. OFCOM plans to publish a register of categorised services by the end of 2024 and expects the final codes to be published by the end of 2025.

What does it mean for tech companies?

Tech companies will be responsible for removing harmful online content and for ensuring that they comply with OFCOM’s various codes of practice; those that do not could face significant penalties. The aim is to keep all of us, and especially children, safer online. The Act is considered one of the biggest changes in online safety regulation to date, placing much greater obligations on platform providers.
