
How the Online Safety Bill threatens freedom of expression


The Online Safety Bill is an enormously complicated and hard-to-interpret piece of legislation. It aims to make the internet safer for users and hold the tech giants to account while protecting freedom of expression.

In the course of writing our IEA Briefing Paper on the Bill, published yesterday, my colleague Matthew Lesh and I found that explaining even the discrete sections of the Bill that we focused on was challenging.

The Bill imposes duties on user-to-user services (e.g., social media, web forums, online gaming, private messaging) and search engines to safeguard users – we refer to them in shorthand as ‘platforms’. The proposed duties on platforms include, among other things:

  • duties to undertake risk assessments for illegal content, to prevent or limit access to illegal content, to provide content reporting and complaints procedures, to protect freedom of expression and privacy, and to keep certain records;

  • for services likely to be accessed by children, duties to undertake children’s risk assessments and protect children’s online safety; and

  • for the largest, highest-risk user-to-user services, known as ‘Category 1’ services, duties to protect users from designated content that is harmful to adults but not illegal (defined as ‘priority content that is harmful to adults’, informally known as ‘legal but harmful’), to provide user empowerment tools, and to protect ‘content of democratic importance’ and ‘journalistic content’.


All clear so far?

The Bill creates a web of overlapping duties and definitions. Even though it runs to over 250 pages, it still leaves a lot of uncertainty as to how exactly platforms will be expected to fulfil their duties.

Enter Ofcom – which will be charged with producing codes of practice and guidance telling platforms how they should carry out risk assessments of harm to adult and child users, how they should operate systems and processes to remove illegal content, and how they can comply with the duties in respect of freedom of speech, journalistic content and content of democratic importance. Compliance with the Ofcom codes of practice will be taken as compliance with the law – so Ofcom will have great influence over what we read and see online. Platforms will not wish to risk investigation and enforcement by the regulator, which can lead to huge fines, criminal sanctions for senior management, and even the blocking and disruption of the business of non-compliant services.

The Secretary of State for Digital, Culture, Media and Sport in post at the time will have the power to set Ofcom’s strategic priorities and require modification of codes of practice. They will also, through secondary legislation, set the criteria for Category 1 services and designate priority illegal content and priority content that is harmful to children and adults (the Bill is expressly intended to reduce harmful content online, not just illegal material).

The Bill also introduces new harmful communications offences, replacing section 127 of the Communications Act 2003. Many will welcome the demise of s.127, which has seen people prosecuted for racist tweets, jokes about Captain Tom, and using the wrong pronouns to address a transgender person. The new offences, however, based on intentionally sending harmful or false communications, could well make matters worse when combined with the removal duties on platforms. At least when s.127 offences were investigated, the defendant could defend themselves and the courts had the full context. Online platforms will be acting as the judge of users’ content and search results, taking down content that they reasonably believe to be illegal – a low threshold, backed by the threat of regulatory sanctions if Ofcom does not consider their systems effective.

The way the Bill seeks to protect children is also not without its problems. There is no question that adults need to do more to protect children using the internet. But the Bill outsources the heavy hand of the state on to the digital platforms, effectively requiring age-gating of the most commonly used services. To differentiate between the version of a platform that has been made suitable for children, and the version that will still be available for adults, platforms will have to verify users’ ages. Under-18s will have content that platforms think might be harmful to their age group censored at source.

The impact assessment accompanying the Bill conservatively estimates the cost of implementing it at £2.5bn over 10 years, based on some rather implausible assumptions about the cost of legal advice and the resources firms will need to dedicate to compliance. The government estimates 25,000 firms will be in scope – echoing the GDPR in impeding the ability of startups and challengers to compete with Google and Facebook. This new set of burdens will be introduced just as the government is seeking to prioritise competition in digital markets, with a dedicated unit in the Competition and Markets Authority, and to reform the GDPR to better support innovation.

Users looking forward to the end of cookie popups need to steel themselves for the new wave of age verification and terms and conditions to accept before doing a quick web search.

 

Head of Regulatory Affairs

Victoria joined the IEA’s International Trade and Competition Unit in spring 2018. She is a lawyer and practised for 12 years in the fields of technology and financial services, before joining the Legatum Institute Special Trade Commission to focus on trade and regulatory policy. She has published work on the implications and opportunities of Brexit in financial services and the movement of goods, and on the issues in connection with the Irish border. Before entering the legal profession, Victoria worked for Procter & Gamble in the UK and Germany.



