The suicide of 15-year-old Tallulah Wilson brought the problem of destructive content and vulnerable young people back to the forefront of the news this week after the teen’s mother called for advertisers to begin a boycott.
Sarah Wilson hit out at websites such as Tumblr for allowing content that promoted and encouraged self-harm and suicidal behaviour among users, and urged advertisers to withdraw their spending from them.
Before her death in October 2012, Tallulah Wilson’s family discovered she had been posting images of her self-harm on Tumblr. Her mother described how her daughter fell into the “clutches of a toxic digital world” in which self-destructive and suicidal behaviour was encouraged.
An advertiser boycott of websites hosting inappropriate content in a bid to pressure moderators and owners into taking more responsibility has happened before. Specsavers, Laura Ashley, Vodafone and Save the Children were among brands that pulled advertising from ask.fm last year following the suicide of 14-year-old Hannah Smith.
In light of Sarah Wilson’s boycott call, The Drum spoke to ISBA, media lawyer Steve Kuncewicz, and ChildLine and selfharm.co.uk – two organisations at the frontline of this emerging digital problem – to find out more about the issues they face.
Rachel Welch, project director at Selfharm.co.uk
Websites have to tread a fine line between allowing freedom of expression and providing the necessary safeguards to keep users safe. This is much easier to do on sites specifically created for children and young people, but obviously much harder to police on unrestricted sites. There is always more that websites can do to ensure their users are not engaging with harmful material, but the only way they will be able to do this effectively is if material considered pro-self-harm, pro-eating-disorder or pro-suicide is banned and made illegal. Even then, there will always be grey areas, because such material is open to interpretation.
Steve Kuncewicz, lawyer specialising in online media
As far as I know, there has never been a successful prosecution in the UK for promoting suicide online. In any event, the position of platforms and websites is protected by the Electronic Commerce Regulations 2002, which provide that service providers will not usually face criminal sanctions or liability for damages as a result of hosting pro-suicide content, provided they have no actual knowledge of the content being on their servers and, once made aware of it, take “expeditious” action to remove it.
It’s understandable that many will now demand a boycott of Tumblr, action against it, or more active moderation from the platform. However, not every platform has the resources or technical ability to put such measures in place. As we’ve seen recently, some major social networks have put content filters in place to deter the posting of this kind of material and have made significant efforts to engage with the police to identify users responsible for posting criminal content. Hopefully a joint approach under the auspices of CEOP or an equivalent body would help to clamp down on this kind of content and, most importantly, educate parents and vulnerable children about the potential dangers of exploring the online world without supervision.
Bob Wootton, director of media and advertising at ISBA, the voice of British advertisers
There are industry-agreed systems in place to prevent such juxtapositions. However, these necessarily place a heavy reliance on website owner/publisher participation, vigilance and compliance. Regrettably, the Tallulah Wilson example demonstrates just how little control advertisers can sometimes have over where their ads can appear online.
In our conversations with major social media channels, we have consistently called for a joined-up approach to better policing of their pages along with firm reassurances that advertisers’ brands will not appear inadvertently on inappropriate or offensive pages.
Advertisers invest significant sums promoting their brands and services and are the principal source of funding for social media. They have no desire for that investment to backfire, in this instance through association with content which has seemingly contributed to an incredibly tragic incident.
In the same way that brands quite rightly voted with their wallets and withdrew from certain sites last year, we fully expect them to act similarly in this case. How a picture-based site like Tumblr manages to reassure advertisers remains to be seen, but firm and solemn reassurances are needed, without hesitation or delay.
Even then, brands which have been unfortunate enough to have found themselves tainted by this particular incident are unlikely to feel that this is enough.
Sue Minto, head of ChildLine
The case of Tallulah Wilson is a deeply tragic one. Young people who are self-harming are often trying to cope with other problems. Triggers can include exam stress, bullying, and feeling isolated and alone with no one to talk to. Self-harm can be an attempt to relieve distress. Although self-harm is not a new problem, sharing images of it on social media sites is a worrying new development. I would urge young people seeking advice or support on this issue to contact ChildLine, which is available 24 hours a day, seven days a week, on 0800 1111, or to go to ChildLine.org.uk. Calls don’t appear on telephone bills.