In a move many would call long overdue, Ofcom, the official UK regulator for nearly all communications services, has been appointed as the sole regulator for 'online harms'. This comes off the back of last April's DCMS report on the same topic.
The powers it is being given are unclear; the penalties that would follow lack clarity; and Ofcom itself is arguably woefully under-resourced to deal with the size of the task at hand.
But make no bones about it – this is a significant step in the right direction.
The main thrust of today's press coverage is about placing a duty of care for users onto platform owners: any platform that allows the sharing of user-generated content (UGC), from Facebook to forums and every Reddit in between.
Duty of care, in Ofcom’s brief, means keeping users safe and tackling harmful and/or illegal content and activity. Illegal content covers terrorism and child sexual exploitation and abuse: dangerous content that should be shut down and prevented from dissemination completely. That much is clear. Harmful content, on the other hand, covers everything from bullying to images of self-harm and suicide, with young adults and children highlighted as most at risk of exposure. There will be grey areas – and they will need to be tested.
As a side note: you can – and should – find more detail under Chapter 7 of the Online Harms White Paper, 'Fulfilling the duty of care', which outlines specifically what else this covers and how.
The platforms will tell you they are already doing this – and their efforts thus far should be applauded. But with no legislation or regulation to ensure the work is executed well and at scale, there has been no policing of their commitment or their methods.
Last week the Huffington Post highlighted the ongoing issue of vaccine-related misinformation on Instagram. March 2020 marks a year since Facebook committed to reducing it, and yet it remains – at scale. This is one of myriad issues the platform owner should be increasingly investing in and, if deemed not to be doing so sufficiently, should have a government to answer to.
This time last year I wrote that the politicians were coming; that the gap between what Facebook says and what Facebook does is often vast, and often opaque.
Having one regulator to watch over it all is a good thing. It keeps things clean and simple. This time last year Mark Zuckerberg was calling for stronger regulation of the internet. This is the first step towards that regulation and one wonders how his lobbyists will react.
Today’s announcement should come as no surprise; it has been telegraphed for months – if not years.
I congratulate Ofcom on its new role and I wish it luck. It’ll need it.
James Whatley is a strategy partner at Digitas UK.