As US states move to enact social media restrictions for kids, ad experts explain impact
New legislative efforts in the US to protect kids from the potential dangers of social media are generating mixed responses. While acknowledging the hurdles such rules may create, ad industry players see in them an opportunity to adapt to a more privacy-centric future.
Florida is the latest state to advance a bill that seeks to institute new social media guardrails for kids / Adobe Stock
Florida lawmakers this week advanced a bill that would require social media companies to warn users under 18 that their platforms may negatively impact their mental health, have “addictive qualities” and present them with “unverified information.” It would also force platforms to share details about their content moderation policies and any potentially “addictive” design elements of their products.
The news comes just days after Utah’s Republican governor Spencer Cox signed into law two new controversial bills that put stringent restrictions on young people’s use of social media. In particular, one regulation will bar users under the age of 18 from having social media accounts without the explicit permission of their parent or guardian.
The developments, which have garnered support from many parents but have received backlash from a range of free speech interest groups – and even teens, who argued against the restrictive Utah law – are part of a larger pattern playing out in the US and globally. In short: children’s online safety and privacy is under the microscope.
Last week, TikTok chief executive Shou Zi Chew testified before Congress, where he was grilled by US lawmakers on the app’s data privacy and mental health risks for children and teens. Meanwhile, the Biden administration has made children’s online safety and privacy an explicit priority. And in the fall of last year, Congress urged the US Federal Trade Commission to expand the 1998 Children’s Online Privacy Protection Act (COPPA), which requires verifiable parental consent before apps and websites directed at children under 13 can collect the personal data that powers behavioral advertising, user profiling and retargeting. The request came on the heels of California’s decision to follow in the footsteps of the UK and pass a set of standards that require all platforms to abide by specific children’s safety standards in the inherent design of their products and services.
And it’s not just lawmakers who are concerned; children’s digital safety is a top priority for many parents today – 80% of US parents say they are worried about their kids’ privacy online, according to recent survey data from privacy firm Pixalate. However, less than half of parents of children ages 13 and younger say they monitor their kids’ activities on apps daily, suggesting that many parents are relying primarily on the government – and digital platforms themselves – to provide adequate protections for young users.
For marketing and advertising stakeholders, growing regulatory pressure represents both new opportunities and new challenges, experts say.
Mapping out a joint effort
Many leaders in the social media and digital advertising space acknowledge that while developments in Utah and Florida appear riddled with issues, they ultimately represent a step in the right direction.
“This is an area where regulation will ultimately be of great help – although it will be initially unsuccessful and marred with inconsistencies and errors,” says Mike Allton, a social media influencer and head of strategic partnerships at social media management platform Agorapulse.
New bills like those in Utah and Florida, despite their respective shortcomings or oversteps, may pressure social platforms to adopt more by-design privacy and safety features to protect users – especially young ones, Allton says. “We'll continue to see attempts from state and federal lawmakers to make social media safer for children, and social networks will respond with better safeguards and features,” he predicts.
Other industry players agree with Allton’s assessment that policymakers can’t be the sole changemakers. “We are clearly moving into a more regulated era for the industry, and there is clearly more attention being paid to the impact of social media on young people’s wellbeing – as there should be, but it also doesn’t just fall on lawmakers. It is important for social platforms to play a proactive role in prioritizing mental health and safety,” says Zak Ringelstein, chief executive and founder of Zigazoo Kids, a TikTok-like challenger social platform backed by high-profile investors including Jimmy Kimmel, Serena Williams and influencers Charli and Dixie D’Amelio. The company recently launched a version of its platform designed specifically for users over 13, called, simply, Zigazoo.
Zigazoo Kids and Zigazoo, for their part, have implemented what Ringelstein calls a “video-thread” format, allowing only video responses – rather than image- or text-based replies – which the platform believes helps to stem trolling, online bullying and bot accounts. Plus, the platform’s algorithm is designed to “promote positivity, authenticity and the best of humankind,” rather than reward user engagement with increasingly extreme content (an issue that YouTube infamously ran into), Ringelstein says. These and other protections, he hopes, help create “a more positive, nontoxic, social media environment that feels safe for all, especially younger users.”
And while many platforms are making privacy and safety-focused changes of their own accord, some believe that the world’s biggest platforms will find relief in the fact that new regulations free them from the burden of leading the charge. “[Some] social media companies and tech companies will welcome some form of regulation so that they feel that they've got something to work with and know where the line in the sand is and what the boundaries are – and not feel that they're the ones making all the rules,” says Matt Navarra, a leading social media consultant. “Meta constantly says it feels that it shouldn't have to be the one that makes the decisions and be the arbiter of truth … so I think there'll be some support from tech companies.”
The impact for advertisers
Advertisers who rely on the ability to reach young audiences, including teens and children, will undoubtedly see new efforts to restrict kids’ access to social platforms as a barrier to growth.
“Advertisers will always be keen to have opportunities and channels to reach all of the demographics that they want to reach in that most effective way,” Navarra says. “So [when it comes to] any regulation or rules around use of social media or advertising, they're going to be particularly interested to know [about]. If it impacts their ability to reach their audiences, undoubtedly, they will not be impressed, and will be looking to find alternative routes to market.”
Navarra estimates that many brands who count young demographics among their target audiences will be most concerned with potential restrictions on TikTok. And it’s a distinct possibility: a handful of governments, including those of the US and the UK, have already banned the app from government devices, and there is mounting pressure in both countries to ban the platform altogether. The Biden administration earlier this month issued an ultimatum to TikTok’s Chinese parent company ByteDance, urging it to sell to an American owner or face a wholesale ban in the US.
Considering that the video-sharing app in 2022 ate up 2.4% of all digital ad spend in the US – putting it on par with YouTube and ahead of Twitter – it’s an increasingly important channel in most marketers’ arsenals. It’s particularly effective for reaching young audiences, considering that the large majority of users on the platform are between the ages of 16 and 34. And while competing products like Instagram Reels and YouTube Shorts may attract ad spend in lieu of TikTok, a potential ban will certainly “be a huge loss to advertisers and brands,” Navarra says.
Nonetheless, some experts believe that legislative efforts to create new social media guardrails for young users will not only benefit consumers, but will also generate positive pressure that forces the ad industry to evolve. “Advertisers have safe options for reaching minors within the Biden administration's proposed updates to data and privacy guidelines [spelled out in COPPA], which are currently set at the young age of 13 years old and should be raised to align with other societal protections for minors,” says Jason Williams, chief executive at Kidoz, a privacy-centric mobile advertising network focused on reaching kids, teens and families. “Commerce can and will adapt. Regulation should be designed and implemented to protect society's most vulnerable.”
Plus, additional changes to the privacy landscape that have already impacted advertisers’ abilities to target their audiences – such as Apple’s AppTrackingTransparency framework, which requires apps to ask users’ permission before tracking them across other companies’ apps and websites – are already forcing broader industry change, Zigazoo’s Ringelstein points out. “In a post-iOS 14 world, advertisers can't rely solely on targeted performance marketing on Facebook; it's a much more complicated picture and gen Z is much more interested in engaging directly with brands they love and not just being bombarded by ads while they're trying to get to the content they want to engage with,” he says.
Ultimately, he and others suggest, new restrictions on how brands are able to reach young audiences will lead to more innovative approaches and experiences that create value for consumers – while respecting their privacy online.