
It's time we all demand that ethics catch up with technology in modern advertising

By Jake Dubbins, Managing Director

February 26, 2019 | 6 min read

The initial trickle of bad news stories about the tech platforms has, over the past 12 months, become a full-blown torrent. Every single week brings another story that is not confined to the industry press but makes the national news. These are not industry issues; these are problems that are endangering our children and the most vulnerable, and pulling at the very fabric of our society.

Photo by Leon Bublitz on Unsplash


The latest scandal, which has seen brands like Nestlé, Disney and Epic Games (the company behind Fortnite) pull their spend from YouTube, is over child safety. Blogger Matt Watson has uncovered comments by some YouTube users identifying the precise times within videos at which underage girls appear in compromising positions that can be seen as sexual. Is that your daughter? Your niece? Your granddaughter?

Of course, this comes on top of The Times investigation in 2017 which found that big brands were inadvertently funding terrorist content on YouTube. And after another Times investigation which found that brands were appearing next to child abuse videos on YouTube. And after brands were found advertising next to graphic self-harm and suicide content on Instagram.

When these issues arise, we wring our hands and shake our heads, but the language we mainly use is ‘brand safety’. How can we keep our brand safe?

There is no doubt that we need to ensure that the brands we all work on are operating in a ‘safe’ online environment. Jobs, livelihoods and families depend on it. The question we rarely ask is: what about human safety?

It is an inconvenient and uncomfortable truth that brands, enabled by programmatic advertising and globalised tech platforms, are inadvertently but repeatedly funding child exploitation. This is on our watch. Many of us have children. Many of us are working out how to approach our own children’s access to the internet. How can we ensure they are not exposed to sexualised content, self-harm videos and, at worst, individuals who aim to groom them?

What about the children in these videos? Will they grow up to know that they were being watched and exploited? How will that shape their lives? How will that affect their mental health? Their relationships?

This is a problem that is systemic and without precedent in scale.

There is no doubt that since the scandals in 2017 the platforms have engaged, to varying degrees, in efforts to make themselves more accountable and more transparent. Facebook has massively increased the number of moderators removing offensive content. In March 2018, YouTube received Jicwebs brand safety certification. This does not guarantee that YouTube is brand safe, as has been shown this week, but it does bring increased levels of accountability and transparency.

It is not enough, and some have had enough. Phil Smith, director general of advertiser association Isba, has been calling for an independent oversight body to be established to regulate and monitor content across all the social media and tech platforms.

Harriet Kingaby and I set up the Conscious Advertising Network in 2018 with a mission to help the ethics catch up with the technology of modern advertising. We fundamentally believe that brands must become more proactive in their advertising ecosystem rather than reacting to events like this week’s.

CAN is a voluntary coalition of over 30 organisations including The Body Shop, Accenture Interactive, Gyro, the7stars, ecover, the AAR, Creative Equals, Merkle Periscopix and Jicwebs, to name but a few. We have worked with experts across the industry to write six manifestos on children’s wellbeing, fake news, hate speech, diversity in content, informed consent and ad fraud. Isba has formally backed the project and all six manifestos.

Our approach is pragmatic, but aims for system change. Our goal is for all brands to incorporate the principles of the manifestos into all agency briefs and RFPs. If brands demand that their agency ecosystem discusses ethics as well as numbers, then we will be much further ahead on the journey to a more sustainable and kinder digital environment.

CAN is not a panacea. Signing up will not automatically guarantee that, from the next day, a brand’s ads do not appear next to offensive content. What it will mean is that every time the marketing team behind that brand briefs its agencies, they will be asking questions about child safety, hate speech and fake news.

This is not about YouTube, Instagram or Facebook. In five years TikTok may be bigger than them all. The difference will be that in five years, brands will be setting the ethical agenda, and the power of their money will help shape the very creation of the next generation of tech platforms.

The advertising industry has huge power. Advertising, to a large extent, funds the internet. It funds YouTube, Facebook and Instagram. With that power comes responsibility. We can no longer avoid these issues. They are real, with real-world harm to real people. We can sit back again, criticise and do very little, or we can act to demand that the ethics catch up with the technology of modern advertising. We will not solve the problems overnight, but we can start that journey right now.

Jake Dubbins is managing director at Media Bounty and co-founder of the Conscious Advertising Network.
