SuperAwesome vows 'contextual approach' to brand safety for kids with new targeting tool

By Rebecca Stewart, Trends Editor

March 31, 2017 | 3 min read

Self-styled 'kid-tech' marketing platform SuperAwesome has unveiled a new tool designed to provide advertisers with a deeper understanding of what exactly young people are engaging with online in order to ensure brand safety – a topic that is increasingly coming to the fore in the media landscape.

The outfit, which runs a child-friendly content platform and works with the likes of Cartoon Network, Hasbro and Lego, says its new feature – Awesome Content Targeting (ACT) – expands upon its existing "contextual understanding" capabilities. It does this by analyzing the billions of pieces of content children consume across its platform to build up a picture of the digital kids' media landscape.

The company claims the machine learning behind ACT will provide "safe new revenue streams" for content owners, as well as more meaningful, privacy-compliant engagement options for advertisers. The system matches kid-appropriate advertising with relevant pieces of content across the internet.
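SuperAwesome has not published how ACT works internally, but the general idea of contextual matching can be sketched roughly: score each piece of content for kid-safety and topical relevance, and only make it eligible for an ad where both checks pass. The Python below is a purely illustrative, hypothetical sketch under those assumptions; every name, threshold and data point is invented and none of it reflects SuperAwesome's actual system.

```python
# Hypothetical sketch of contextual ad matching (not SuperAwesome's ACT).
# Content items carry topics and a kid-safety score produced upstream by
# content analysis; ads declare the topics they suit and a safety bar.

from dataclasses import dataclass


@dataclass
class ContentItem:
    url: str
    topics: set            # e.g. {"animation", "toys"}
    kid_safe_score: float  # 0.0-1.0, assumed output of a content classifier


@dataclass
class Ad:
    name: str
    target_topics: set
    min_safety: float = 0.9  # content below this bar is never eligible


def eligible_content(ad: Ad, inventory: list) -> list:
    """Rank content by topic overlap, excluding anything that fails the safety gate."""
    scored = []
    for item in inventory:
        if item.kid_safe_score < ad.min_safety:
            continue  # brand-safety gate: unsafe surroundings are never eligible
        overlap = len(ad.target_topics & item.topics)
        if overlap:
            scored.append((overlap, item))
    # Highest topical overlap first
    return [item for overlap, item in sorted(scored, key=lambda pair: -pair[0])]


if __name__ == "__main__":
    inventory = [
        ContentItem("kidsite.example/cartoon-ep1", {"animation", "adventure"}, 0.97),
        ContentItem("kidsite.example/toy-review", {"toys", "unboxing"}, 0.95),
        ContentItem("videosite.example/parody-clip", {"animation"}, 0.40),  # fails safety gate
    ]
    ad = Ad("building-blocks-campaign", {"animation", "toys"})
    for item in eligible_content(ad, inventory):
        print(item.url)
```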

According to SuperAwesome, the contextual element of the service means that ads are never targeted in the wrong surroundings, an issue that has become more pertinent for the wider industry in light of revelations that ads are still appearing next to inappropriate content on YouTube's core site.

Google has been moving to quell the problem, but YouTube Kids, the video platform's youth-focused spin-off, has remained untouched by the furore as it only shows branded channel pre-roll ads and gives parents and guardians greater control over filters.

SuperAwesome is compliant with the Children's Online Privacy Protection Act (COPPA) in the US and with Europe's General Data Protection Regulation (GDPR).

Earlier this week, the BBC revealed it had found "hundreds" of fake versions of popular cartoons on YouTube's main platform exploring inappropriate themes. The report claims the mature content, featuring characters from Peppa Pig, the Minions franchise and Disney's Frozen, had generated millions of views.

In response, YouTube said it appreciated people drawing "problematic content" to its attention, adding that it made it easy for viewers to flag such videos. The company also suggested that parents use the YouTube Kids app as an alternative to the main site.
