
As kids’ digital safety bills advance in Congress, privacy pros & advertisers sound alarm


By Kendra Barnett, Associate Editor

July 31, 2023 | 12 min read

Proposed legislation could shield children and teens from potentially dangerous online content, but may come at the price of their data privacy and access to information.


US lawmakers are preparing to vote on two new pieces of legislation that seek to provide protections for children & teens online / Mayer Tawfik

Two bills that propose new guardrails for children’s and teens’ safety in digital spaces are going to a vote in the US Senate. It’s a major development in young people’s online safety – an ongoing priority for the Biden administration.

However, the development holds complicated implications for privacy and online advertising, as new safety mechanisms could potentially mean a reduction in young people’s data rights and new restrictions on online content, experts say.

The Senate Commerce Committee, helmed by Maria Cantwell (D-WA), voted unanimously Thursday to advance the Kids Online Safety Act (Kosa) and amendments to the Children and Teens’ Online Privacy Protection Act – colloquially dubbed Coppa 2.0 – to a full floor vote. Both bills aim to help keep kids safe online and shield young people from the mental health risks posed by social media – a crisis that’s been well-documented even by the organizations perpetuating it.

Many players in the digital space are lauding the progress. For one, there’s Josh Golin, executive director of Fairplay, a nonprofit organization focused on children’s digital safety. He says: “We commend … [the] Committee for advancing the Kids Online Safety Act and Children and Teens’ Online Privacy Protection Act. In the 25 years since Congress last passed meaningful online protections for children, social media platforms have honed their manipulative design techniques in order to addict children to their products and capture young people’s attention and data. As a result, countless young people are harmed online in serious and preventable ways every day.”

Together, Golin argued, the bills “will help create the internet all young people deserve – one that respects their privacy and autonomy and allows them to safely learn, play and connect.”

However, it’s this first point precisely – children’s privacy – that concerns critics of the bills the most.

“Ensuring [young people] have safe online spaces is a really important goal, especially with a lot of the studies that Congress and advocates have been citing around the impact of [certain] content on mental health. But we are concerned that this bill may do more harm than good,” says Aliya Bhatia, a policy analyst at the Center for Democracy & Technology, a nonpartisan organization working to advance civil rights and liberties in the digital age.

Concerns about Kosa’s impacts on data privacy and online liberties

A chief concern of Bhatia’s is the structure of Kosa in particular, which necessitates that children and teen users be treated differently than their adult counterparts by tech companies. Notably, Kosa aims to restrict young people’s ability to adjust their privacy settings, as online services must be able to access specific information about young users in order to protect them. These rules, Bhatia says, “will require online services to know who the younger user is [versus] the older user. And that, unfortunately, will require further data collection or data inference.”

Though an amendment to Kosa clarifies that the bill will not require online services to perform age verification, critics have said that the structure of the bill makes age verification difficult to get around. “Platforms would either need to implement age verification mechanisms that require incremental data collection – like IDs, biometrics – or would be forced to remove content generally, impacting adult users as well,” explains Arielle Garcia, chief privacy and responsibility officer at IPG-owned ad agency UM Worldwide.

In fact, the kind of data collection potentially necessitated by Kosa could include methods of data collection or inference as intrusive as facial recognition. “Unlike [how] a bartender or a cashier at a liquor store can sort of … guess based on a face what [a customer’s] age is, and narrow down who gets that second check of an ID validation, an online service can’t do that,” Bhatia says. “So that means all users – adults and children alike – will undergo some sort of age estimation interstitial. That will require either data collection of some sort, in the form of collection and proof of ID, or some sort of facial estimation biometric scanning, which is, again, a form of data collection and analysis.”

A further concern about using facial recognition software in this way is the bias built into many machine learning algorithms. The data sets on which facial recognition models are trained often lack representation across races, ethnicities and gender presentations, creating the potential for misidentification and error.

What’s more, the age verification methods that online services would likely be forced to adopt, critics say, would further bolster big tech’s competitive advantage in the marketplace due to their existing stores of user data. As Garcia puts it: “Some critics note that … the largest platforms would be best-positioned to action a constructive knowledge standard and age estimation, given the vast amount of user data in their possession.”

Another key concern about Kosa is its duty of care – the obligation imposed on online platforms to protect users from certain potentially harmful content, such as content about substance abuse, self-harm, suicide and eating disorders. While the requirement, on its surface, would appear to be a positive step in protecting young people, some argue that the bill’s ambiguity in defining “harmful content” could set a dangerous precedent for far-reaching censorship or the suppression of information online. “This could in turn have a detrimental impact on youth – and adults, depending on the approach adopted by the platform – which could unduly impact vulnerable groups such as LGBTQ+ youth,” says Garcia.

Advertisers on alert

It’s not just access to information at risk of being disrupted by Kosa’s duty of care; the advertising industry, too, could suffer collateral damage. Because the bill will be enforceable by state attorneys general – meaning that relevant online services face a risk of liability under Kosa – many platforms “are likely to be very risk averse,” says Bhatia, “and may even pursue some sort of zero-tolerance policy to preemptively avoid hosting certain types of content that they think will get them in hot water.”

As a result, she predicts, platforms like Instagram and TikTok may have to turn to automated content filtering tools “that are notoriously blunt.” Many such tools are unable to determine intent, so they could end up blocking not only potentially sensitive or harmful content related to, say, violence or eating disorders, but also content that aims to provide resources for people facing these kinds of issues. Ultimately, in an effort to comply with Kosa’s duty of care, online platforms could inadvertently block positive and helpful advertising content.

“With, for example, an organization that is trying to put sponsored posts out to reach a certain group of users, like, ‘Here’s a support group for people who face eating disorders or disordered eating,’ or ‘We create resources for and by the LGBTQ community,’ [filtering] tools might be overly broad in their impact” and block this kind of content, says Bhatia. “And advertisers may not want to be hosting content alongside the content that stays up. That’s another big question mark of this bill.”

Additionally, though applicability remains somewhat vague, Kosa outlines requirements for companies to stem the “addictive” nature of their platforms and “limit features that increase, sustain, or extend” their use by young people, such as auto-playing video. These rules, says UM’s Garcia, could “result in knock-on effects to user volume, engagement and inventory cost” for advertisers. However, she admits that the impacts on advertisers are ultimately “difficult to predict” at this point considering Kosa’s breadth and ambiguity.

Finally, many critics have also argued that Kosa’s scope of applicability is too broad. In its current form, the bill could apply to services as varied as messaging platforms, video games, streaming services and email apps. Ultimately, critics worry this broad scope could “subject teens to greater monitoring that could be detrimental, while also potentially compromising security and encryption that adult users benefit from,” Garcia notes.

In essence, Kosa in particular could impact both data privacy and advertising effectiveness – effects that could potentially prove detrimental to both users and businesses.

A sign of progress for federal privacy legislation?

Despite concerns about the potentially negative impacts on privacy and digital advertising, some within the digital liberties space say that both Kosa and Coppa 2.0 represent a step in the right direction.

Cobun Zweifel-Keegan, managing director at the International Association of Privacy Professionals in Washington, DC, suggests that the advancements of these bills could spell good news for the future of federal-level privacy protections – something the US has yet to establish in any meaningful way. “We could be on the verge of seeing federal action on data privacy for the first time in years,” he said in a statement shared with The Drum. “This comes after a busy year for youth privacy and safety rules at the state level, where a widely diverging set of new laws have been passed. With some of these under legal challenge, the emerging standards for youth safety, privacy and mental health on online platforms are not yet set in stone. Federal legislators are looking to fill this gap.”

Nonetheless, Zweifel-Keegan acknowledges that “many hurdles remain.” He added: “The path to success for these bills is much broader than comprehensive consumer privacy bills.”

Bhatia, for her part, also hopes that future federal-level privacy legislation could ultimately address lawmakers’ concerns about children and teen safety more effectively than frameworks like Kosa and Coppa 2.0. “The best way to protect kids’ privacy is to protect everyone’s privacy and create a set of standards for all service providers that limit the sort of data that’s collected on all users and introduces purpose limitation provisions and minimizes data retention. This is a really important first step before a lot of other efforts.”

The sentiment is echoed by other experts, including Garcia. “While it is understandable to prioritize children and teens given their unique vulnerability, to do so absent a common baseline introduces complexity that could heighten the risk of unintended consequences and stymie alignment and progress.”

Last year, the US came closer than it has in decades to passing a sweeping federal data privacy bill in the American Data Privacy and Protection Act. However, efforts on the bill have since stalled, with lawmakers grappling over how far protections should go.

