The Online Safety Bill is a mystery wrapped in a riddle inside an enigma
Mark Leiser breaks down the problems with the UK government’s Online Safety Bill as part of The Drum’s Data and Privacy Deep Dive.
Online Safety Bill / Government website: gov.uk
There are few legislative initiatives as puzzling as the UK government’s quest to bring into force new safety measures protecting users from all that is evil on the internet. Ensuring illegal content is handled appropriately by platforms seems axiomatic, but creating a legal obligation to remove “legal, but harmful” content was always destined to cause a stramash between free expression purists and, say, advocates for additional protection from cyber-bullying. The Online Safety Bill (OSB) brought out both the best and the worst of cyber actors that have a stake in the way social media platforms are regulated.
Sticking Nadine Dorries up front as the spokesperson for such a controversial piece of legislation might not have been the wisest tactical decision Boris Johnson made (watch the MP for Mid Bedfordshire get dismantled for sending abusive tweets here), but when Johnson left 10 Downing Street, so did Dorries. The new culture secretary, Michelle Donelan, promised to remove the “legal, but harmful” provisions found in the Dorries version of the OSB and replace them with new measures designed to place greater emphasis on free expression.
Meanwhile, former DCMS chair and Zuckerberg agitator Damian Collins has promised not to scrap these provisions completely, instead opting to leave them in some “mutilated form”. Although the bill was due to have its third reading in the Commons on November 1, it was removed from Commons business. And with Johnson’s departure and Liz Truss’s entry and then hasty exit, the mess has now landed on Rishi Sunak’s desk.
The new PM has the unenviable task of deciding whether free expression and all of its consequences are worth curtailing in order to keep children safe from the abuses they likely encounter via social media platforms, while ministers are suggesting that continued delays could require the government to scrap the legislation entirely. Even the House of Lords is getting in on the action: to ensure it can pass this session, the Lord Speaker granted an urgent question on the future of the OSB, following the Prevention of Future Deaths report penned at the conclusion of the inquest into the death of Molly Russell.
Continued delays to the OSB have perplexed nearly everyone. Although the bill had been heavily criticized for its disregard for free expression rights, it had the political backing of nearly every politician who mattered. Only Johnson’s implosion and subsequent removal from 10 Downing Street managed to derail the imposition of a wide range of obligations: duties of care, accountability measures, transparency rules and platform oversight by Ofcom.
This derailment created a vacuum of criticism to such an extent that the OSB might actually be shelved for another day. Even if it does manage to pass through parliament, we are a long way from any real understanding of its implementation: the bill’s whopping 218 pages will need to be fleshed out by secondary legislation, codes of practice, Ofcom guidance and designated categories of service providers – each with its own step-by-step procedures. All of this will be necessary to figure out how to impose a legal duty on platforms to police speech that may cause harm.
The legal obligation to remove “legal, but harmful” content, alongside a slew of new criminal offenses, remains at the heart of the controversy surrounding the bill. As tech lawyer Graham Smith rightly points out, this new offense creates a veto for those distressed by views they regard as repugnant, even if that reaction is unreasonable. The risk of chilling legal speech is exacerbated when the offense is combined with the illegality duty that the bill, in its present form, would impose on platforms and search engines.
Despite numerous laws already in place to tackle some of the identified harms, and numerous laws regulating content, actions and behavior, the OSB passes the government’s own policing responsibilities on to platforms; in other words, ‘it’s your platform, so you deal with it’. The ethos of the OSB is simple: platforms are no different from theme parks, offices and restaurants; therefore, as platforms are places where people gather, the imposition of a duty of care will work well between platforms and users there too.
Risk-based legal regimes such as the UK’s Health and Safety at Work Act have successfully deployed a duty of care before. But that regime creates duties of care over physical environments. One can build the cost of slipping on a wet floor into everything from warning signs to business insurance. In the online world, it is different. Once something is priced, the imposition of a duty of care establishes a transaction cost for content. Speech deemed too costly for the platform will be filtered, blocked and/or removed ex ante rather than ex post, especially when the uncertainty surrounding content is determined to carry too high a transaction cost, regardless of its actual risk. In other domains, the imposition of a duty of care mitigates the distribution costs of uncertainty through legal conventions. It remains unclear how a law can protect free expression when some of that expression will remain risky to some of the people likely to encounter it.
The Online Safety Bill remains a mystery wrapped in a riddle inside an enigma.
Dr Mark Leiser is a professor of technology law at the Vrije Universiteit Amsterdam. For more on how the world of data-driven advertising and marketing is evolving, check out our latest Deep Dive.