Don’t let the Super Bowl distract you from big tech’s child safety interrogation
Nina Etienne urges marketers not to be distracted by Super Bowl mania at the expense of the US Senate’s big tech interrogation. It’s time we push for stronger protections for children online.
Hellmann’s ‘Mayo Cat’ ad
This week, something incredibly significant took place in the world of advertising. No, it wasn’t Uber opening its bulging wallet and leveraging the ‘Working Class’ Beckham meme for the Super Bowl - or the ear-worm Mayo Cat. The topic we avoided as an industry was the Senate’s grilling of tech bro CEOs over (the lack of) safeguards for children on their platforms.
Adland remains quiet in the wake of the hearing. Has the Mayo Cat got our tongue?
You’d have thought this would be an important topic for the world’s advertisers, who are funneling billions into these social media monoliths to sell their products. But I guess not.
Social media - as we all know - is big business. The platforms make billions of dollars selling to children, too. According to a study released at the end of last year by Harvard University’s TH Chan School of Public Health (which used public survey and market research data from 2021 and 2022 to estimate youth users and related ad revenue):
YouTube derived €866.8m in ad revenue from users under 12 and €1.08bn from users aged 13-17.
Instagram derived €724m from users under 12 and €3.6bn from users aged 13-17.
Facebook derived €123.9m from users under 12.
TikTok derived €1.8bn from users aged 13-17.
Children are big business for social media sites and advertisers.
There’s a financial incentive to keep children on and engaged with social media platforms.
But what of our moral code?
For decades, there’s been an ethical question facing our industry over advertising to children who, as the American Academy of Pediatrics stated in 2020, are uniquely vulnerable to the persuasive effects of advertising and content because of “immature critical thinking skills and impulse inhibition.”
Rules and regulations to protect children are not new - the ASA has long prioritized them. With the passing of the much-discussed Online Safety Act in the UK last October, regulation is increasing. But how applicable is it to our new world of selling, where thousands of up-and-coming influencers flog anti-aging products to our 10-year-old children on TikTok?
And before I digress too much into a discussion on child advertising, this wasn’t even the main focus of the recent senate hearing.
The topic was child safety. The topic was children being exposed to destructive sexual and violent content. The topic was a child being recommended a live hanging in their ‘For You’ feed. Devastating stuff.
Now, I’m not saying that these tech giants haven’t been trying to prevent it. But this content is pervasive. I did some research last night and, let’s just say, within two minutes I was able to stumble upon distasteful content simply by clicking normal-looking posts. X? More like ‘X’-rated.
There’s evidence from many medical and scientific institutions telling us that social media is bad for our children. Last year, Stem4, the youth mental health charity, surveyed 1,024 children and young people aged 12 to 21 and found that nearly half said they’d become withdrawn, started exercising excessively, stopped socializing completely or self-harmed because they were regularly bullied or trolled online about their physical appearance; four in 10 said they were in mental health distress.
In his testimony a few days ago, Facebook founder Mark Zuckerberg rejected the scientific evidence, stating that it hasn’t shown a causal link between using social media and young people having a decline in mental health.
The argument is nuanced. It always is. And there are multiple players.
The tech giants state that bad actors take advantage of their platforms much as they do in the physical world - and that, somehow, diminishes their accountability. There’s an echo of last year’s Netflix hit Painkiller in this argument. They blame Google and Apple for allowing children to download their apps outside of age restrictions. Legislators and lawmakers have been slow, falling prey to lobbyists. There’s a lot of noise, with statistics and stories rejected by tech executives as non-reflective of the real-life experience of users on the apps.
And with the ramp-up of AI and its ability to spread harmful content and misinformation at a mind-boggling pace, we can no longer wait. If you don’t believe AI can be harmful, I’m sure Taylor Swift can tell you just how damaging deepfakes can be.
It’s apparent from last week’s spectacle that there remains a lack of safety online for the most vulnerable of audiences, and a lack of agency on this topic from tech leadership and legislative bodies. But what is also apparent is that advertisers - the ones spending those billions of dollars - through their obvious silence on the topic, also deny their potential role as change makers.
The tech giants have an omnipotence. David and Goliath vibes. And we do need to deliver those acquisition and CAC targets in an economically fragile context. My question, though - which I feel we should all be talking about with the same fervor as those Super Bowl ads - is: at what cost?
Can we still, in good conscience, pump obscene amounts of money into sites that are not able to protect - for whatever reason - our children?
Should marketers hold some culpability because our wallets are inciting social platforms to create environments that engage children?
Can we continue with the status quo when we know that the mental health of an entire generation of children and young people is in crisis?
And my final - and most provocative - question: can’t we advertisers play a role in the change?
After all, as an industry, we’ve consistently strived for innovation; we live to challenge the status quo and invert social norms. We’ve moved culture forward.
What more could we do to ensure our children are safe and protected in this new digital era? Now, I think that’s worth talking about. Let’s get creative about that.