
Online Safety Bill – Latest Incarnation – 3rd Feb 2023

-- 5 min read --

The UK’s Online Safety Bill arrived in the House of Lords this week for its Second Reading.

At this stage we have a debate about the general principles of a proposed new law before discussing it line-by-line in the Committee and Report stages.

There were over 60 peers wishing to speak and we were advised to keep our remarks to 4 minutes so that the debate would not run into the night.

I have included the text of my short contribution in this post and it is also available as a video from the official Parliament Live recording service.

If you follow the video link you can scroll back and forth through the whole debate where most peers were focused on child safety and pornography.

This Bill has progressed slowly and uncertainly through Parliament with several changes of Government leadership but the shape of what we can expect to happen is now becoming clearer.

The Lords will look at it in detail during March/April/May so that it might return to the Commons in June/July for potential completion before the summer break.

There will then be a period of intense work over several months drafting all the subsidiary regulations, codes of practice and guidance that are needed to translate the Bill’s aims into actions by OFCOM and online services.

We should not see this legislation as a single event, with one world before it comes into force and a different one afterwards, but rather as an ongoing process of incremental change over time.

If it is successful then we will be able to show that people in the UK feel increasingly safe when using the kinds of online services covered by the law; we will need to look at indicators over several years to get a true picture of whether or not this is the case.

A useful analogy is that of road safety where millions of people value the freedom to travel in their personal cars even if this creates a range of harms that could affect them and their fellow citizens.

Our roads have been made safer over the years through technical and regulatory measures all aimed at incremental improvements including speed limits, vehicle test certificates, and mandatory new safety equipment.

So it is with internet services: the hope is that the cumulative effect of a range of changes, both those required by this new law and technical innovations, will lead to data showing that online harms are reducing over time.

I had a conversation with my friend Nicklas Lundblad about what we can expect from the Bill and some of the remaining areas of contention that we recorded as a podcast – maybe worth a listen if you are into the Bill and are going for a long walk or drive.

There will be plenty more to follow on this Bill as it enters its final stages but here are my short words from this latest stage.

My Lords, I have two observations, two pleas, one offer of help and four minutes to deliver all this, so here goes.

Observation one is that this Bill is our answer to the age-old question of “quis custodiet ipsos custodes?” or, in the vernacular, “Who watches the watchmen?” With several thousand strokes of the pen, Parliament is granting to itself the power to tell tens of thousands of online services how they should manage their platforms if they wish to access the UK market. Parliament will give directions to Ofcom about the outcomes it wants to see and Ofcom will translate these into detailed instructions and ensure compliance through a team of several hundred people that the platforms will pay for. In-scope services will be given a choice—pay up and follow Ofcom’s instructions or get out of the UK market. We are awarding ourselves significant superpowers in this Bill, and with power comes great scrutiny as I am sure will happen in this House.

My second observation is that regulating online content is hard. It is hard because of scale. If regulating traditional media is like air traffic controllers managing a few thousand flights passing over the UK each day, then regulating social media is more like trying to control all the 30 million private cars that have access to UK roads. It is hard because it requires judgment. For many types of speech there is not a bright line between what is legal and illegal so you have to work on the basis of likelihoods and not certainties. It is hard because it requires trade-offs—processes designed to remove “bad” content will invariably catch some “good” content and you have to decide on the right balance between precision and recall for any particular system, and the noble Baroness, Lady Anderson of Stoke-on-Trent, has already referred to some of these challenges with specific examples.

I make this observation not to try and elicit any sympathy for online services, but rather some sympathy for Ofcom as we assign it the most challenging of tasks. This brings me to my first plea, which is that we allow Ofcom to make decisions about what constitutes compliance with the duties of care in the Bill without others second-guessing it. Because judgments and trade-offs are a necessary part of content moderation, there will always be people who take opposing views on where lines should have been drawn. These views may come from individuals, civil society or even Ministers and may form important and valuable input for Ofcom’s deliberations. But we should avoid creating mechanisms that would lead to competing and potentially conflicting definitions of compliance emerging. One chain of command—Parliament to Ofcom to the platforms—is best for accountability and effective regulation.

My second plea is for us to avoid cookie banner syndrome. The pop-ups that we all click on when visiting websites are not there for any technical reason but because of a regulatory requirement. Their origins lie in a last-minute amendment to the e-privacy directive from Members of the European Parliament who had concerns about online behavioural advertising. In practice, they have had little impact on advertising while costing many millions and leaving most users at best mildly irritated and at worst at greater risk as they learn to click through anything to close banners and get to websites.

There are several elements in this Bill that are at risk of cookie banner syndrome. Measures such as age and identity verification and content controls can be useful if done well but could also be expensive and ineffective if we mandate solutions that look good on paper but do not work in practice. If you see me mouthing “cookies” at you as we discuss the Bill, please do not see it as an offer of American biscuits but as a flag that we may be about to make an expensive mistake.

This brings me to my final point, which is an offer of technical advice for any noble Lords trying to understand how the Bill will work in practice: my door and inbox are always open. I have spent 25 years working on internet regulation as poacher turned gamekeeper, turned poacher, turned gamekeeper. I may have a little more sympathy with the poachers than most politicians, but I am all gamekeeper now and keen to see this Bill become law. For those who like this kind of thing, I share more extensive thoughts on the Bill than I can get into four minutes in a blog and podcast called “Regulate Tech”.

House of Lords Hansard, 1st Feb 2023
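
For readers less familiar with the precision and recall terms I used in the speech, here is a minimal, illustrative sketch of the trade-off content moderation systems face. The numbers are made up for illustration and do not come from any real platform: a stricter filter catches more of the “bad” content (higher recall) but wrongly removes more “good” content (lower precision), while a more cautious filter does the reverse.

```python
# Illustrative sketch only: made-up counts for a hypothetical moderation filter.

def precision_recall(true_positives, false_positives, false_negatives):
    """Precision: share of removed items that really were 'bad' content.
    Recall: share of all 'bad' content that was actually removed."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return precision, recall

# A strict filter removes more bad content but also sweeps up more good content...
strict = precision_recall(true_positives=900, false_positives=300, false_negatives=100)

# ...while a cautious filter leaves good content alone but misses more bad content.
cautious = precision_recall(true_positives=600, false_positives=50, false_negatives=400)

print(f"strict filter:   precision={strict[0]:.2f}, recall={strict[1]:.2f}")
print(f"cautious filter: precision={cautious[0]:.2f}, recall={cautious[1]:.2f}")
```

Neither setting is simply “right”: deciding where to sit on that curve is exactly the kind of judgment that the Bill asks Ofcom and the platforms to make, system by system.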