Regulating teen safety and privacy online is a challenging needle to thread, experts say

By Kendra Barnett, Associate Editor

May 12, 2023 | 14 min read

Though pressure is mounting on lawmakers to pass more comprehensive protections for young users online, the issues of data privacy and safety can be complicated – and sometimes even at odds. This story is part of The Drum’s latest Deep Dive, The New Data & Privacy Playbook.

Teens and children in the US could soon see new online privacy and safety protections / Adrian Swancar

Last week, the US Federal Trade Commission announced new plans to ban Meta’s use of underage users’ data.

Just three months prior, US President Joe Biden called attention to children’s data privacy and online safety in his State of the Union address, saying, “It’s time to pass bipartisan legislation to stop big tech from collecting personal data on kids and teenagers online, ban targeted advertising to children, and impose stricter limits on the personal data these companies collect on all of us.” It was a point he’d also made in his 2022 State of the Union address.

Kids’ and teens’ privacy and digital safety have become a top priority for many lawmakers. The issue was hotly debated last month when TikTok CEO Shou Zi Chew testified before Congress. US states like Florida and Utah are going to new lengths to limit teens’ access to social media. California, meanwhile – a longtime leader in privacy policy in the US – is drawing inspiration from the UK in its proposal to require digital platforms to create age-appropriate interfaces for users under 18.

Now, two US bills with bipartisan support are gaining momentum in Congress. If enacted, they would expand privacy protections for teens and force digital platforms to adopt new safety-by-design features for young users.

The tangled roots of a tree of problems

The issue of children’s privacy has attracted new levels of scrutiny in light of consumers’ growing concerns – much of which may have been accelerated by the pandemic. “There started being an awareness of just how ‘online’ children were … we were going to work remotely and going to school remotely,” says Bailey Sanchez, policy counsel with the youth and education privacy team at Washington, DC-based think tank the Future of Privacy Forum.

In the early days of remote learning in 2020, parents began raising concerns about online safety and privacy in the context of edtech (and with reason – Google, for example, found itself in hot water in 2021 for collecting kids’ data via its educational products). Since then, says Sanchez, the issue has “spilled over to this broader consumer privacy conversation that’s not just about kids at school and how edtech has been deployed, but really like, ‘What is the relationship between kids and technology at home?’”

But it’s not just pandemic-related phenomena that have drawn increasing attention to children’s safety and privacy online. It’s a confluence of factors. A bombshell 2021 report by the Wall Street Journal detailing ways in which Meta concealed information about its products’ negative effects on teens’ mental health, paired with the highly publicized testimony of former Facebook employee Frances Haugen, stoked the flames. Growing scrutiny of how social media platforms navigate content moderation, teen mental wellbeing and data privacy has led companies to introduce new parental controls and screen time limits – and even promote these efforts in flashy ad campaigns. Meanwhile, regulators in Europe and the US have slapped tech giants like Meta and Google with historic fines for allegedly collecting children’s data without appropriate permission; just last month, the UK fined TikTok $15.7m for misusing kids’ information.

These and other factors have given credence to “the notion that big tech knowingly prioritizes profit at the expense of user safety and wellbeing,” says Arielle Garcia, chief privacy officer at IPG-owned media agency UM Worldwide.

And overreaches by these platforms could have a ripple effect on users’ trust in all players in the digital ecosystem, advertisers included. As such, it’s in the ad industry’s best interest to hold platforms accountable and work with regulators to create more stringent rules around how the ecosystem treats young users’ data, suggests Angel Maldonado, a privacy advocate and the chief executive officer at Empathy.co, a privacy-focused commerce and search adtech firm.

“If big tech companies like Meta continue to commercialize kids’ and teens’ data, there are worrying and far-reaching implications for the next generation of young social media users,” he says. “As [these companies fail] to demonstrate an ability and willingness to protect and respect users' private data, reversing this growing sense of consumer distrust and initiating the industry-wide change required becomes ever harder. This is why it's so crucial to call out [harmful data] practices for what they are: abusive, obscene and wrong. To create a safer online and digital world we have to make a stand against this use of private data.”

Progress toward new federal protections

As tech companies attempt to self-regulate and US states move to enact their own restrictions, pressure is mounting on members of Congress to enact legislative change on the national level.

Of course, there are different routes to regulating the range of issues concerning kids’ safety and privacy on the internet. “Some of these problems can be helped by changes to privacy practices. Others are better addressed by other design changes unrelated to how personal data is collected or shared,” explains Cobun Zweifel-Keegan, the Washington, DC managing director of the International Association of Privacy Professionals.

Though there is often overlap between the categories, it’s a dichotomy that Sanchez also recognizes in various regulatory efforts. “The safety [and design-focused] bills tend to be more focused on… mitigating harms. And the privacy bills look closer to what you see in consumer privacy… [they] really speak to how the data itself is regulated. Like, is there data minimization? Do you have the ability to opt-in or opt out of data collection?”

As far as privacy-focused efforts go, a comprehensive data privacy bill introduced in 2022, the American Data Privacy and Protection Act, gained bipartisan support in the House and had a fairly favorable outlook before it reached a standstill in committee.

But while progress on a broad-scope federal privacy bill stalls, an effort to revamp the US’ federal Children's Online Privacy Protection Act (Coppa) of 1998 is gaining traction. Senators Edward J. Markey (D-Mass.) and Bill Cassidy (R-La.) reintroduced an updated Coppa late last year in an effort to expand its protections to include teens up to 16; the original law only applied to children under 13.

“Children and teenagers are going to use the internet. Parents should be confident their children are safe when doing so,” said Dr. Cassidy last week. “This bill prohibits internet companies from collecting personal information on young teenagers without consent.”

It’s a well-timed push. New data from fraud protection and privacy software firm Pixalate finds that some 54% of US apps marketed toward children in the Apple App Store appear to violate Coppa and that 21% don’t even have a privacy policy in place. In short, it’s likely that many children and teens using mobile apps in the US are having their data illegally captured and accessed by publishers, app developers and advertisers.

Meanwhile, on the design and safety side of the regulatory equation, the federal Kids Online Safety Act (Kosa) was reintroduced in the Senate just last week. In essence, the bill aims to put the burden of responsibility on social media companies to mitigate safety risks on their platforms.

Kosa would require platforms to adopt more stringent approaches to content moderation, cracking down, for example, on the dissemination of posts promoting substance abuse, eating disorders and suicide. It would also make platforms subject to annual third-party audits of the risks their products pose to underage users. In its current form, the bill has gained support from more than 30 Senators across party lines.

But the waters can be muddied when issues of user privacy and safety don’t overlap. Sometimes, the principles even appear at odds with one another.

The American Civil Liberties Union (ACLU), for example, vehemently opposes Kosa, arguing that the content moderation requirements it would impose on platforms would actually overstep young users’ privacy rights by forcing them to engage in more active surveillance. “It would ironically expose the very children it seeks to protect to increased harm and increased surveillance,” ACLU senior policy counsel Cody Venzke said in a statement to CNBC.

Efforts to stem the spread of child sexual abuse material (CSAM) online, like the proposed Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act, have faced similar criticisms. Many such efforts would allow ‘back door’-type access to consumers’ devices and encrypted information, critics argue.

Despite the challenge of reaching sufficient bipartisan consensus to advance bills like Coppa 2.0 or Kosa, many privacy experts believe the momentum is there.

“I think it's fair to predict that something is going to happen on regulating teens online. Whether that is more of a safety approach or a privacy approach is harder to say. But even if Kosa does not pass, or if Coppa 2.0 does not pass, the questions that are being asked in these bills – like, ‘How do we protect kids online? And how should their data be treated online?’ – I don't think those questions are going to go away,” says the Future of Privacy Forum’s Sanchez. “So I think even if we don't see anything pass, we would certainly see something introduced or reintroduced next year.”


Considerations for advertisers

Any new legislation on youth safety and privacy online is sure to create new limitations on the kinds of information that advertisers can use for targeting and measurement purposes.

But as calls to restrict or ban targeted advertising to children grow louder, the ad industry is being forced to grapple with these questions in real time. Luckily, many industry players are well on their way. “There is no question that this is the direction that regulators are headed, so most major ad platforms have already taken steps to stop the collection of youth data for advertising purposes – and stop the delivery of targeted ads to these users,” the International Association of Privacy Professionals’ Zweifel-Keegan says.

Another key focus for advertisers, per Zweifel-Keegan, should be clearly established purposes for data collection. “One of the foundational principles of privacy practice relates to purpose-based limitations on why data is collected.”

For advertisers, he says, this means that it should be “feasible to collect the data necessary from all types of users to measure media performance and other quality indicators – without also using or sharing that data for other purposes.” He suggests that organizations, and perhaps especially ad industry players, should ensure they have tools in place to manage purpose-specified data controls.
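To make that principle concrete, here is a minimal sketch, in Python, of what purpose-specified data controls could look like inside an organization’s own pipeline. The names here (Purpose, DataPoint, use) are hypothetical illustrations, not any vendor’s API: each data point is tagged with the purposes declared at collection, and every downstream read is checked against that tag.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Purpose(Enum):
    """Declared purposes a data point may be used for (illustrative set)."""
    MEASUREMENT = auto()      # media performance / quality indicators
    AD_TARGETING = auto()     # audience targeting
    PERSONALIZATION = auto()  # on-site experience tuning


@dataclass(frozen=True)
class DataPoint:
    """A collected value tagged with the purposes declared at collection time."""
    value: object
    allowed_purposes: frozenset


def use(data: DataPoint, purpose: Purpose):
    """Gate every downstream read on the purpose declared at collection."""
    if purpose not in data.allowed_purposes:
        raise PermissionError(f"{purpose.name} was not declared at collection")
    return data.value


# A view event collected only for measurement cannot be reused for targeting.
event = DataPoint(value={"ad_id": "123", "viewable": True},
                  allowed_purposes=frozenset({Purpose.MEASUREMENT}))
use(event, Purpose.MEASUREMENT)     # permitted
# use(event, Purpose.AD_TARGETING)  # raises PermissionError
```

Under a scheme like this, the measurement data Zweifel-Keegan describes stays usable for performance reporting, while any attempt to repurpose it for another use fails loudly at the point of access.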

Of course, none of this guarantees that advertisers’ bottom lines won’t be affected, Zweifel-Keegan suggests. “Media performance is a separate issue.”

And of course, as the digital ecosystem grows increasingly complex, so too do the ways in which consumers, lawmakers and advertisers think about navigating users’ privacy and safety online. The boom of artificial intelligence, for example, is sure to introduce new challenges and considerations.

As UM Worldwide’s Garcia puts it: “The acceleration of AI will inevitably raise eyebrows for regulators and enforcement agencies that are keenly aware of the reality that the advertising industry routinely monetizes and benefits from inferring attributes and interests to reach audiences, but is seemingly unwilling to apply this same capability to protect their users’ data and wellbeing.” For example, she says, industry players could “leverage inferred age as a check on age-gating mechanisms and self-reporting in supporting the delivery of age-appropriate experiences and content.”
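As a purely illustrative sketch of the kind of check Garcia describes, consider the following Python fragment, in which a model’s inferred age (however it is produced) can veto an implausible self-reported age and fall back to the more protective experience. The function name, thresholds and tiers are hypothetical, not any platform’s actual logic.

```python
def resolve_age_band(self_reported_age: int,
                     inferred_age: float | None,
                     tolerance: float = 4.0) -> str:
    """Pick an experience tier, letting an inferred-age signal act as a
    sanity check on the self-reported age (hypothetical logic)."""
    # If the inferred estimate contradicts the self-report by more than
    # the tolerance, default to the more protective (younger) reading.
    effective_age = self_reported_age
    if inferred_age is not None and inferred_age < self_reported_age - tolerance:
        effective_age = int(inferred_age)

    if effective_age < 13:
        return "child-safe experience"
    if effective_age < 18:
        return "teen experience"
    return "adult experience"


# A user claims to be 21, but behavioral signals suggest roughly 14:
print(resolve_age_band(21, inferred_age=14.2))  # -> "teen experience"
```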

And as our digital world changes, the stakes only grow higher, Zweifel-Keegan says. “As we move to a world where large data sets and powerful algorithms are ubiquitous, embracing best practices for handling personal data will only become more and more essential.”

To read more from The Drum’s latest Deep Dive, where we’ll be demystifying data & privacy for marketers in 2023, head over to our special hub.