The Online Safety Bill: Who invited the Thought Police?
12th October 2021 | Business Crime
The internet increasingly impacts upon almost every aspect of our everyday lives, and throughout the difficulties of the pandemic, the internet and social media have allowed people to remain connected even in the harshest lockdown. However, with ever more time being spent online, the topic of online safety has embedded itself firmly into our society’s dialogue – and rightly so.
The response? On 12 May 2021, the Government published the Online Safety Bill (OSB), which is a significant piece of legislation that introduces a new regulatory framework to tackle the presence of illegal and harmful content online.
What is the OSB?
The OSB imposes a duty of care on social media companies and other ‘user-to-user’ services to remove illegal and harmful material, with explicit focus on offences such as terrorism, child sexual exploitation and abuse, and content that incites racial hatred. Importantly, content that does not satisfy the threshold for criminality, but is nonetheless deemed to be harmful to adults and children, still falls within the scope of the bill.
At first glance, the term ‘harmful’ is somewhat vague and subjective. The bill defines harmful content as that which the provider has ‘reasonable grounds to believe would give rise to a material risk of significant adverse physical or psychological impact on a child or adult of ordinary sensibilities’.
The OSB confers regulatory powers on The Office of Communications, more commonly known as ‘Ofcom’, to police illegal and harmful material online. Under the OSB, Ofcom will be given the authority to block access to sites that it deems to display illegal and harmful material, and to impose strong penalties on offending companies of up to £18 million, or 10% of annual global revenue, whichever is greater.
Even those of us who can’t call ourselves die-hard football fans could not help but get behind the Three Lions during the final of Euro 2020 this year. However, despite the team making history, the swell of hateful and racist abuse following their unfortunate defeat in the penalty shootout against Italy drew a dark cloud over the team’s achievement.
The UK Football Policing Unit said recently that it had received 600 reports of racist comments aimed at England's black players after the defeat, of which 207 were deemed to be criminal in nature. The investigations showed the problem to be an international one, with 123 messages posted from overseas accounts, while 34 came from the UK, leading to 11 arrests across the country.
However, the problem isn’t new, with Marcus Rashford continually drawing attention to the online racial abuse that he has found himself victim to, and sadly, he is not the only one.
Unfortunately, a high proportion of both adults and children have found themselves exposed to some kind of illegal and harmful content online, with around 62% of adult internet users and 80% of child internet users aged 12-15 having had potentially damaging experiences as a result.
The Covid-19 pandemic also appears to have created an epidemic of online abuse, according to Glitch, a UK charity dedicated to ending online abuse. Its study indicates that 46% of respondents reported experiencing online abuse since the beginning of Covid-19, and of the respondents who had experienced online abuse in the 12 months prior to the pandemic, 29% reported it being worse during Covid-19, with much of the abuse taking place on social media platforms.
The spectre of censorship
These examples provide but a snapshot of the type of illegal and harmful online content this bill aims to tackle. However, the arrival of the OSB has not been met without concern. A number of organisations, including the award-winning human rights group Index on Censorship, have been outspoken in their opposition to the OSB. The IoC recently published a report examining the potentially damaging impact of the bill on free speech and press freedom. This has been followed by a group of MPs, lawyers and campaigners, including David Davis MP, Gavin Millar QC and the chief executive of the IoC, Ruth Smeeth, launching the ‘Legal to Say, Legal to Type’ campaign to scrutinise and push back against the bill.
Whilst members from these groups welcome the action to tackle the problem of illegal and harmful content, they raise significant concerns regarding the legal and practical impact of the duty of care imposed by the bill.
Indeed, one such concern is that legal content deemed to be harmful will not be adjudicated by an impartial judicial process but rather by the quasi-governmental authority Ofcom, with many fearing that this will result in Ofcom becoming, in effect, a ‘super regulator’ of free speech. This raises the spectre of Orwell’s 1984, where the Thought Police were employed by the totalitarian state to persecute individuality and independent thinking.
Campaigners have also raised concerns about the potential commercial incentive for many social media companies and ‘user-to-user’ platforms to remove more content than necessary in order to avoid the possibility of huge fines. The question posed is whether the OSB is the first step towards a future of state censorship in the UK.
Illegal and harmful content has plagued the internet and our online experience for far too long and, on the face of it, the OSB is fighting back against this. However, whatever the stated intention of the legislation, great care must obviously be taken to strike a balance between securing our online safety and preserving our fundamental civil liberties.
The criminal law as it stands provides a robust framework for the prosecution of individuals who post racist or illegal content, rather than shifting responsibility onto the web companies. The issue is sometimes one of resources rather than the police lacking the legal tools to deal with offenders. It may be that greater policing and higher numbers of prosecutions in this area are needed alongside a role for Ofcom in the regulation of online content.
At JMW we are well placed to investigate and privately prosecute racist trolls and other online offenders. To find out more about how JMW can assist victims of online abuse, watch this video.
This blog was co-authored by Daniel Martin and Jade Lam-Richardson.