5 things you need to know as Ofcom takes on social media governance

By John Glenday, Reporter

February 12, 2020 | 4 min read

The UK government is set to give Ofcom more power to regulate social media companies over harmful content after a public consultation on the issue.

Designed to bring rules and regulations into line with recent technological advances, the move seeks to protect children and vulnerable members of society without stifling business.

It will give the watchdog an expanded remit to enforce a legal "duty of care" on firms such as Facebook, Instagram and YouTube.

Home Secretary Priti Patel explained: “While the internet can be used to connect people and drive innovation, we know it can also be a hiding place for criminals, including paedophiles, to cause immense harm.

“It is incumbent on tech firms to balance issues of privacy and technological advances with child protection. That’s why it is right that we have a strong regulator to ensure social media firms fulfil their vital responsibility to vulnerable users.”

But what does this mean for the future of social media governance in practice? Here are the five key takeaways from the announcement.

Swift removal of illegal content

Key to the appointment would be a mandate requiring web platforms to remove illegal content quickly, particularly material relating to terrorism and paedophilia, and to minimise the risk of it appearing in the first place, with the threat of sanctions reserved for those who fail to comply.

Google, YouTube and Facebook have all invested heavily in improved content moderation systems in recent years. These systems rely largely on artificial intelligence to remove harmful content, though a growing number of human moderators are being employed to tighten the net. Even so, harmful content continues to slip through and spread quickly, a problem highlighted last year when a terrorist attack in Christchurch, New Zealand, was broadcast and shared millions of times on Facebook before it was removed.

It won’t block ‘offensive’ content

While tough action will be taken against extreme content, the government has committed to allowing adults continued access to legal content that some might find offensive, such as pornography, so long as companies explicitly state their terms and conditions of use.

Instead, the regulator will be able to ask internet service providers (ISPs) to block websites or apps that commit “serious, repeated and egregious violations” of their duty of care.

Protecting online users’ rights

Another important step could see Ofcom empowered to protect online users’ rights by safeguarding free speech, promoting new technology and ensuring businesses are not unduly impacted. Critically, the regulation would apply only to companies that permit the sharing of user-generated content.

Clear guidance

Businesses that suspect they may fall within the scope of the new regulations will be offered guidance on whether they are affected, with the government stressing that merely having a social media presence does not in itself bring a company within scope.

Flexibility

Finally, the government has indicated a degree of flexibility in the interpretation of these regulations, providing the freedom to adapt and respond to emerging harms and technologies.

As such, no decision has yet been made on what punishments or fines the regulator could hand out. However, The Telegraph reports that the new regulator could have the power to fine tech giants up to 4% of global turnover.

To clear a path for the transition to the new regulatory model, Lord Burns has agreed to step down, allowing a new chair of Ofcom to be appointed later this year to work alongside new chief executive Dame Melanie Dawes.
