The suicide of 15-year-old Tallulah Wilson brought the problem of destructive content and vulnerable young people back to the forefront of the news this week after the teen’s mother called for advertisers to begin a boycott.
Sarah Wilson hit out at websites such as Tumblr for allowing content that promoted and encouraged self-harm and suicidal behaviour among users, and urged advertisers to boycott them.
Before her death in October 2012, Tallulah Wilson’s family discovered she had been posting images of her self-harm on Tumblr. Her mother described how her daughter fell into the “clutches of a toxic digital world” in which self-destructive and suicidal behaviour was encouraged.
An advertiser boycott of websites hosting inappropriate content in a bid to pressure moderators and owners into taking more responsibility has happened before. Specsavers, Laura Ashley, Vodafone and Save the Children were among brands that pulled advertising from ask.fm last year following the suicide of 14-year-old Hannah Smith.
In light of Sarah Wilson’s boycott call, The Drum spoke to ISBA, media lawyer Steve Kuncewicz, ChildLine and selfharm.co.uk – the latter two organisations at the frontline of dealing with the emerging digital problem – to find out more about the issues facing them.
Selfharm.co.uk

There is a lot in the media at the moment about banning sites and shutting down different forums. That’s easier said than done, because it’s too easy for the sites to be recreated under different names, and it’s actually only half the problem. These sites operate on a supply and demand basis – if no one was using them they would cease to exist.

The real challenge is for society to work towards ensuring our children and young people grow up emotionally self-aware and have the opportunity to seek support as needed. That might mean expanding statutory services so that those who need help get it immediately rather than having to wait several weeks, or schools adopting a scheme to support young people’s emotional development on a more formal level. We want to see our young people develop the resilience and robustness not to need to go searching for these online communities in the first place.
Websites have to tread a fine line between allowing freedom of expression and providing the safeguards needed to keep users safe. This is much easier on sites created specifically for children and young people, but obviously much harder to police on unrestricted sites. There is always more that websites can do to ensure their users are not engaging with harmful material, but the only way they will be able to do this effectively is if material considered pro-self-harm, pro-eating disorder or pro-suicide is banned and made illegal. Even then there will always be grey areas, because such material is open to interpretation.
Steve Kuncewicz, media lawyer

Tallulah Wilson’s tragic death is very sadly the latest exposure of what many quite rightly see as the dark underbelly of social media being dragged into the national consciousness. The legal position on the potential liability of sites like Tumblr for content of this nature is unlikely to give any comfort to those calling for action against sites or social networks which host content that promotes or encourages suicide. It is an offence for a person to do anything capable of encouraging or assisting another person to attempt suicide, provided that intent to encourage or assist in a suicide can be shown. Anyone committing the offence need not know or be able to identify the other person, and the offence applies whether or not the person in question goes through with their attempt.
As far as I know, there has never been a successful prosecution in the UK for promoting suicide online. In any event, the position of platforms and websites is protected by the Electronic Commerce Regulations 2002, which provide that service providers will not usually face criminal sanctions or damages for hosting pro-suicide content where they do not have actual knowledge of it being on their servers, or where, once they are made aware of it, they take “expeditious” action to remove it.
It’s understandable that many will now demand a boycott of Tumblr, action against the platform, or more active moderation. However, not every platform has the resources or technical ability to put such measures in place. As we’ve seen recently, some major social networks have put content filters in place to deter posting of this kind of material and made significant efforts to engage with the police to identify users responsible for posting criminal content. Hopefully a joint approach under the auspices of CEOP or an equivalent body would help both to clamp down on this kind of content and, most importantly, to educate parents and vulnerable children about the potential dangers of exploring the online world without supervision.
ISBA

For advertisers, the distinction between whether a website or page endorses self-harm and suicide or whether it encourages bullying is largely irrelevant. Both are equally repugnant to responsible brand owners, who want nothing whatsoever to do with such toxic associations.
There are industry-agreed systems in place to prevent such juxtapositions. However, these necessarily place a heavy reliance on website owner/publisher participation, vigilance and compliance. Regrettably, the Tallulah Wilson example demonstrates just how little control advertisers can sometimes have over where their ads can appear online.
In our conversations with major social media channels, we have consistently called for a joined-up approach to better policing of their pages along with firm reassurances that advertisers’ brands will not appear inadvertently on inappropriate or offensive pages.
Advertisers invest significant sums promoting their brands and services and are the principal source of funding for social media. They have no desire to see that investment backfire, in this instance through association with content which has seemingly contributed to an incredibly tragic incident.
In the same way that brands quite rightly voted with their wallets and withdrew from certain sites last year, we fully expect them to act similarly in this case. How a picture-based site like Tumblr manages to reassure advertisers remains to be seen, but firm reassurances are needed, and needed without delay.
Even then, brands which have been unfortunate enough to have found themselves tainted by this particular incident are unlikely to feel that this is enough.
ChildLine

Last year self-harm was mentioned in almost 47,000 counselling sessions with young people, a disturbing 41 per cent year-on-year increase. ChildLine also reports a very worrying 50 per cent rise in contacts about self-harm specifically from 12-year-olds – the highest increase of any age group. Contacts where young people felt suicidal increased by 33 per cent, with over 4,500 contacts from children aged between 12 and 15 alone.
The case of Tallulah Wilson is a deeply tragic one. Young people who self-harm are often trying to cope with other problems. Triggers can be exam stress, bullying and feeling isolated and alone with no one to talk to. Self-harm can be an attempt to relieve distress. Although self-harm is not a new problem, sharing images of it on social media sites is a worrying new development. I would urge young people seeking advice or support on this issue to contact ChildLine, which is available 24 hours a day, seven days a week on 0800 1111, or go to ChildLine.org.uk. Calls don’t appear on telephone bills.