Social media platforms could face 'meaningful fines' for failing to swiftly remove extremist content

By Tony Connelly, Sports Marketing Reporter

May 1, 2017 | 3 min read

Social media companies have been accused of putting profit before safety by MPs who say they should face heavy penalties for failing to swiftly remove extremist material from their platforms.

The inquiry says multinational firms remove copyright infringements far quicker than extremist material

A new report by the Commons home affairs committee has called for "meaningful fines" for social media firms which fail to remove inappropriate content within a strict timeframe.

The inquiry said social media companies were more concerned with commercial risk than with public protection, pointing out that content which infringes copyright is swiftly removed while a “laissez-faire” approach is taken to hateful or illegal content.

“Social media companies currently face almost no penalties for failing to remove illegal content,” the report said. “We recommend that the government consult on a system of escalating sanctions, to include meaningful fines for social media companies which fail to remove illegal content within a strict timeframe.”

Yvette Cooper, the Labour MP who chairs the home affairs committee, said: “They have been asked repeatedly to come up with better systems to remove illegal material such as terrorist recruitment or online child abuse. Yet repeatedly they have failed to do so. It is shameful.

“These are among the biggest, richest and cleverest companies in the world, and their services have become a crucial part of people’s lives. This isn’t beyond them to solve, yet they are failing to do so. They continue to operate as platforms for hatred and extremism without even taking basic steps to make sure they can quickly stop illegal material, properly enforce their own community standards, or keep people safe."

The committee's report also touched on brand safety issues, singling out Google for failing to prevent advertising from some of the world's largest companies from appearing next to YouTube videos posted by extremists.

“One of the world’s largest companies has profited from hatred and has allowed itself to be a platform from which extremists have generated revenue," said the report.

The investigation highlighted instances of terror recruitment videos from jihadi and neo-Nazi groups which remained accessible online after MPs had complained about them.

Some of the material included antisemitic, hate-crime attacks on MPs that had been the subject of a previous committee report. Material encouraging child abuse and sexual images of children were also not removed, despite journalists flagging them to the social media platforms.

The inquiry ultimately called for social media companies to be treated as traditional publishers, comparing their responsibility to police their platforms to that of football clubs, which are obliged to pay for policing in their stadiums and the surrounding areas on match days.
