
Instagram, Facebook and more face kid safety crackdown from MPs


By John Glenday, Reporter

February 5, 2019 | 4 min read

Instagram, Facebook, Snapchat and more face being forced by law to remove illegal content and sign a code of conduct protecting vulnerable users, including children.



The UK's culture and digital minister, Margot James, is expected to announce a compulsory code of conduct for tech giants on Tuesday (5 February), following a BBC investigation into the death of teenager Molly Russell, who took her own life after viewing distressing material about depression and suicide on Instagram.

Details of the code have yet to be revealed, but several reports say James will use a speech at a Safer Internet Day conference to launch a policy paper and consultation ahead of introducing the new regulatory regime.

Advertisers have reacted with alarm to the news that their own content was being surfaced alongside suicidal imagery. Household names including M&S, The Post Office and the British Heart Foundation were found last month to have been inadvertently linked to such inappropriate material.

Trade body Isba has already demanded that an independent, industry-funded body be established to certify content as appropriate for advertising.

A spokesman for the Department for Digital, Culture, Media and Sport said: “We have heard calls for an internet regulator and to place a statutory ‘duty of care’ on platforms and are seriously considering all options.

“Social media companies clearly need to do more to ensure they are not promoting harmful content to vulnerable people. Our forthcoming white paper will set out their responsibilities, how they should be met and what should happen if they are not.”

In an open letter printed in The Telegraph this week, Adam Mosseri, head of Instagram, admitted: "We are not yet where we need to be on the issues of suicide and self-harm. We need to do everything we can to keep the most vulnerable people who use our platform safe. To be very clear, we do not allow posts that promote or encourage suicide or self-harm."

Instagram already uses engineers and reviewers to make it more difficult for people to source self-harm images. More recently, it's been applying 'sensitivity screens' to blur these pictures.

The Facebook-owned app is stopping short of removing 'cutting' images entirely, though. "We still allow people to share that they are struggling even if that content no longer shows up in search, hashtags or account recommendations," wrote Mosseri.

However, it does plan to offer greater support to people who are struggling with self-harm or suicide by connecting them with organisations like the Samaritans.

Last month, the firm's newly installed head of communications, Nick Clegg, said Facebook would "look [at the issue] from top to bottom and change everything we're doing if necessary, to get it right."

"We're already taking steps soon to blur images, block a number of hashtags that have come to light, and thirdly to continue to work... with the Samaritans and other organisations," he said.

