Online freedom is in peril
The freedom to provide and use digital services is under sustained attack. The EU’s new Copyright Directive places a heavy burden on website operators who make user-generated content available — compelling them to filter suspected infringing content, in a misplaced attempt to turn back the clock and protect rights holders and newspapers from new ways of monetising and using content. But while this assault on freedom of expression on the internet in the cause of copyright is troubling, the latest proposed regulatory interventions in the UK are even more insidious.
This is not least because they are framed in terms of protecting children and the vulnerable from harm, purporting to be in these people’s best interests. What kind of monster could oppose cracking down on child sexual exploitation, terrorism, and cyber bullying? Who could oppose protecting and supporting children’s health and wellbeing? The Online Harms White Paper, published by the Department for Digital, Culture, Media and Sport (DCMS) and the Home Office, aims to “make companies take more responsibility for the safety of their users and tackle harm caused by content or activity on their services”. And Age Appropriate Design: A Code of Practice for Online Services, published for consultation by the Information Commissioner’s Office (ICO), has the stated purpose of providing practical guidance on “how to design data protection safeguards into online services to ensure they are appropriate for use by, and meet the development needs of, children”.
The White Paper expressly seeks to address not just illegal activity, but “unacceptable content and activity”. The obvious response to this is: unacceptable to whom? The White Paper’s answer is an independent regulator, expressing its views by way of codes of practice and enforcing them by way of fines, or even by blocking services that transgress the boundaries of acceptability. This would be a huge expansion of the powers of the state over every aspect of private and economic life, so much of which is carried out online. The White Paper aims to regulate not only content-sharing platforms, but also private communications, such as messaging and file storage applications.
Similarly, the extension of the data protection regulator’s powers is astonishing regulatory overreach. Acting within its remit under the Data Protection Act, the ICO would create a duty for businesses to safeguard and support children, effectively legislating and enforcing the UN Convention on the Rights of the Child (UNCRC), which it quotes liberally in the Code of Practice. All websites will need to comply with the Code unless the operator can prove that their website won’t be accessed by under-18s, or implements robust age verification to ensure that under-age access is not possible.
Is the position of the government and its regulators that parents cannot protect their children from online harms, and that they therefore shouldn’t have to — that only the state, in tandem with businesses, has the capability to do so? It seems hubristic and self-serving to justify the accretion of wide-ranging and ill-defined powers to monitor and enforce ‘acceptability’. Or is the government’s position that parents don’t want to supervise and protect their children in their use of online services — that protecting children from harm, even harm encountered in their own home in the presence of their parents, or harms deriving from legal conduct, should essentially be a service provided by government?
Sadly, of course, not all children have parents and carers who exercise these responsibilities. But is the best or only way to protect them to sacrifice freedom of expression and a dynamic online environment? Are we accepting that the role of the state — and technology companies, like Google and Facebook — is to monitor and edit what children and adults can view or read on the internet? These are questions that the White Paper and the Code of Practice have answered in the affirmative, but it is far from clear that effectiveness or further implications were taken into consideration. If it becomes accepted that keeping children, and even adults, safe in their use of the internet is a function of the state, the incentive and duty for parents to do so will be diminished, and children will grow up with the assumption that this is how it works. The justification for ever greater state intervention will grow and become ever more difficult to row back from.
The Code of Practice does not purport to demonstrate that its provisions are supported by evidence. Rather, the ICO focuses on fulfilling the requirements of the UK’s Data Protection Act, the General Data Protection Regulation, and, somewhat grandiloquently, the UNCRC (quoting at length the convention’s passages about parental duties, which it seems to appropriate to itself). The White Paper cites various surveys and studies — on occasion, unabashedly including those that specifically contradict its approach — but none that indicate that this hugely ambitious and authoritarian regulatory regime could be expected to counter any of the harms it bemoans.
In terms of freedom of expression, this is oppressive and paternalistic, and, in respect of the protections from misinformation it proposes, it is also snobbish and partisan. It aims to protect vulnerable, ignorant users (i.e. everyone) from inaccurate or untrustworthy news sources. The press, some of whom have been enthusiastic supporters of the White Paper, should be worried.
There is a strong Luddite flavour to the White Paper. It operates under an assumption that new technologies are not to be trusted and need to be controlled by the state, together with a confidence that the state is capable of managing the digital economy through a powerful regulator of all content. The authors seem assured that the new regulations will not be easily subverted by clever teenagers and determined criminals using VPNs, or by corporates simply exiting the UK market. They also seem to assume that simply by saying the words, ‘our vision […] is for a thriving digital economy,’ the thriving digital economy will persist, even as the government passes ever more intrusive and anti-competitive legislation. In fact, experience shows that digital service providers will exit markets where the regulatory burden and risks are too great, and that large, established operators benefit most, depriving consumers of content, services, and innovation.
At a time when the government is struggling to deliver key policies, regulating the internet (which was, after all, a manifesto commitment) seems like a costless way to shore up support on the traditionally conservative issues of the family, and law and order. In reality, the costs of these policies to our freedom would be substantial and far-reaching.
This article was first published as part of a collection of essays on social freedom by Freer.