5 key takeaways from Meta, TikTok, X, Snap’s Congressional hearing on kids’ online safety

By Kendra Barnett, Associate Editor

January 31, 2024 | 18 min read

The chief executive officers of five big tech companies appeared before the Senate Judiciary Committee on Wednesday, facing heated back-and-forths with lawmakers over how their platforms work to protect children online.

On Wednesday, social media executives testified before Congress on children’s online safety / Harold Mendoza

Executives from five of the world’s biggest social media companies – Meta, TikTok, X, Snap and Discord – were grilled by lawmakers for more than four hours on Wednesday about their platforms’ policies and approaches to protecting children’s safety. The tech titans were made to answer questions on how their services combat child sexual exploitation online, protect young users’ mental wellbeing, ensure kids’ data is safe and more.

While Meta’s Mark Zuckerberg and TikTok’s Shou Zi Chew – who testified in his own Congressional hearing last March – appeared willingly before lawmakers, X’s Linda Yaccarino, Discord’s Jason Citron and Snap’s Evan Spiegel were all subpoenaed for the hearing.

The hearing is part of the Senate Judiciary Committee’s larger efforts to advance a slate of bills designed to establish new safeguards for young people’s online experiences. Specifically, five proposed pieces of legislation with unanimous support from the committee – the Stop CSAM Act, the Earn It Act, the Shield Act, the Project Safe Childhood Act and the Report Act – aim to crack down on child sexual exploitation online.

The Kids Online Safety Act (Kosa) – which is broader in scope and more divisive among policymakers than the five bills at the heart of Wednesday’s hearing – remained a topic of hot debate.

Committee chair Dick Durbin (D-IL) and ranking member Lindsey Graham (R-SC) said in a joint statement ahead of the hearing: “We’ve known from the beginning that our efforts to protect children online would be met with hesitation from big tech. They finally are being forced to acknowledge their failures when it comes to protecting kids … Parents and kids demand action.”

The hearing comes as focus on children’s online safety and privacy reaches new heights in both the US and Europe. Last month, the US Federal Trade Commission announced a series of proposed changes to the country’s federal children’s privacy framework, the Children’s Online Privacy Protection Rule (Coppa). Earlier last year, the Biden administration debuted a handful of executive actions designed to enhance the safety and privacy of young internet users. The EU and the UK, meanwhile, have moved toward age-appropriate design standards, which require platforms to build special guardrails for younger users.

At the start of the event, Durbin emphasized the committee’s particular concern over the sexual exploitation of children online. He cited data from the National Center for Missing and Exploited Children indicating that daily reports of children’s online sexual exploitation jumped from approximately 1,380 in 2013 to 100,000 in 2023 – an increase of more than 7,000%.

These are the biggest highlights from Wednesday’s Congressional hearing.

Execs are hesitant to express support for proposed kids’ online safety legislation

A number of lawmakers demanded that the tech execs signal explicitly whether they support the Stop CSAM Act, the Earn It Act, the Shield Act, the Project Safe Childhood Act, the Report Act and Kosa.

Snap’s Spiegel signaled his company’s support of Kosa, saying: “We strongly support the Kids Online Safety Act and we’ve already implemented many of its core provisions.”

Yaccarino, too, indicated X’s support for the bill in response to a prompt from Senator Richard Blumenthal (D-CT). She said: “We support Kosa and we’ll continue to make sure that it accelerates and make sure to continue to offer a community for teens that are seeking that voice.”

Spiegel and Yaccarino stood in contrast to their more lukewarm peers – Meta’s Zuckerberg, Discord’s Citron and TikTok’s Chew all dodged the question in one way or another, often pointing to elements of the bill they supported without signaling wholesale support. Zuckerberg, for example, said that Meta supports the “basic spirit” of Kosa, as well as some of its protections, such as age-appropriate content guidelines, but could not endorse the proposal on the whole.

Some industry stakeholders applaud Yaccarino’s and Spiegel’s willingness to endorse Kosa. Josh Golin, executive director of Fairplay, a nonprofit focused on youngsters’ digital safety, tells The Drum: “We appreciate that Snap and Twitter are being proactive by supporting the Kids Online Safety Act because that’s what will make a difference – not more empty promises and press releases and ‘tools’ that put the onus back on parents and young users. It is Kosa that would address a wide range of harms that parents are concerned about, and with 47 bipartisan Senate cosponsors, it is the bill that can and should move right away.”

Advertisers, too, Golin suggests, should support the bill “because it is on their behalf that social media platforms do anything and everything to maximize engagement, and surely advertisers don’t want to be complicit in addicting kids or sending them down deadly rabbit holes.”

Zuckerberg forced to apologize to victims’ families at the hearing

The families of young people harmed by social media turned up to Wednesday’s hearing in droves, holding protest signs and images of their lost loved ones.

In a heated interaction, Senator Josh Hawley (R-MO) pressured Zuckerberg to take accountability for the role that Meta platforms had played in the mental and physical harm of many of the families’ children. In particular, he drew attention to data provided by a whistleblower indicating that, within the previous seven days, 37% of 13- to 15-year-old girls on Meta platforms said they’d been exposed to unwanted nudity, 24% said they’d experienced unwanted sexual advances and 17% said they’d encountered self-harm content.

The lawmaker asked the executive whether he or Meta had compensated the children’s families – and even pressured the billionaire to set up a victims’ fund with his own money. Zuckerberg dodged the questions and said: “Our job is to make sure that we build tools to help keep people safe.”

In an increasingly tense back-and-forth, the senator finally asked if Zuckerberg would be willing to publicly apologize to the families in attendance.

Zuckerberg stood and faced the families behind him in the hearing room. He said: “I’m sorry for everything you have all been through. No one should go through the things that your families have suffered and this is why we invest so much and we are going to continue doing industry-wide efforts to make sure no one has to go through the things your families have had to suffer.”

Some in the media industry suggest that taking accountability is a critical first step. “I am not sure any of these platforms on Capitol Hill can ‘win’ until they collectively agree that their technologies represent gateway drugs in the digital era and that children’s lives are being negatively impacted under current protocols and business models,” says Seth Ulinski, an independent business analyst and consultant.

However, other experts take a more pessimistic view. “It’s lots of political grandstanding in these Senate hearings,” says Matt Navarra, a leading social media consultant and industry analyst. “Bringing Zuckerberg up and almost shaming him into apologizing publicly was a photo opportunity moment that was clearly very well orchestrated by the senator that requested it. And Senator Lindsey Graham was saying that Mark Zuckerberg has got ‘blood on his hands’ – it’s more about the senators and the quotable lines and the moment of fame in the hearings than it is about an actual meaningful push for change.”

Despite many similar hearings over the past decade or so, in which big tech leaders have been made to answer for their wrongs before lawmakers, Navarra notes that such sessions “still tend to not actually generate any significant or substantial regulation.”

Senators and execs argue over the evidence of social media’s detriment to children

Lawmakers battled with executives – and Mark Zuckerberg in particular – over the strength of the evidence linking children’s and teens’ use of social media to harms to their mental health.

Addressing Zuckerberg, Senator Jon Ossoff (D-GA) said: “It’s the overwhelming view of the public … that this platform is harmful for children.” He went on to say: “Everyone knows that kids who spend … too much time on [Meta] platforms are at risk.”

Beyond public sentiment, Ossoff also pointed to a 2023 advisory from the US Surgeon General, which found that children who spend more than three hours a day on social media face double the risk of poor mental health outcomes, including symptoms of depression and anxiety.

Zuckerberg pushed back against the assertion, saying, “It’s important to characterize it correctly. I think what he was flagging in the report is that there seems to be a correlation,” indicating that causation has not been established.

In his opening statement, the Meta exec attempted to draw a distinction between individual studies and a holistic, causal view of the problem. “Mental health is a complex issue and the existing body of scientific work has not shown a causal link between using social media and young people having worse mental health outcomes,” he said. He pointed to a 2023 report from the National Academies of Sciences that evaluated more than 300 studies on the issue and found the evidence ‘did not support the conclusion that social media causes changes in adolescent mental health at the population level.’

Other lawmakers chose to emphasize the anecdotal evidence at hand. “Zuckerberg, your testimony referenced the National Academies of Science study that said, at the population level, there is no proof about harm for mental health,” said Senator Chris Coons (D-DE). “Well, it may not be at the population level, but I’m looking at a roomful of hundreds of parents who have lost children and our challenge is to take the data and to make good decisions about protecting families and children from harm.”

Coons went on to press Zuckerberg on whether Meta reports the total amount of content that violates its policies on suicide and self-harm, as well as the total number of views such content receives. The tech exec ultimately said that the platform does not, choosing instead to focus on prevalence, a metric that captures violating content as a percentage of all content viewed on the platform.
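To make the distinction concrete, here is a minimal sketch – using invented numbers, not Meta’s actual figures or published methodology – of the difference between the absolute counts Coons asked about and a prevalence-style metric:

```python
# Hypothetical illustration only: every number here is invented.
# Meta's real methodology estimates prevalence by sampling views;
# this sketch just shows the arithmetic distinction.

total_views = 1_000_000_000   # all content views in a reporting period
violating_views = 50_000      # views of content breaking self-harm rules

# Absolute count: the kind of figure Coons pressed for
print(f"Views of violating content: {violating_views:,}")

# Prevalence: violating views as a share of all views
prevalence = violating_views / total_views
print(f"Prevalence: {prevalence:.4%}")  # -> Prevalence: 0.0050%
```

The same absolute count can sound alarming while the prevalence figure looks vanishingly small, which helps explain why lawmakers and platforms often talk past one another when citing these numbers.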

Though it was not referenced explicitly in the hearing, a bombshell 2021 report from the Wall Street Journal revealed that Meta’s own internal research had found Instagram use to be linked to poorer mental health, particularly among teen girls – and that the company kept this evidence from public view.

Section 230 rears its head again

A key focus of Wednesday’s hearing was Section 230 of the Communications Decency Act of 1996, which provides limited federal immunity to digital platforms that host user-generated content. In essence, Section 230 differentiates ‘publishers’ from ‘distributors.’ It’s a line that’s become increasingly blurry since the 2016 US presidential election, when lawmakers sought to hold social media companies at least partially accountable for Russian election interference.

Senator Sheldon Whitehouse (D-RI) said that he sees Section 230 as a stumbling block for holding social media platforms accountable for the harm they enable. “As a collective, your platforms really suck at policing themselves,” he said. “We hear about … fentanyl and other drug dealing facilitated across platforms. We see it and hear about it here in Congress with harassment and bullying that takes place across your platforms. We see it and hear about it here in Congress with respect to child pornography, sexploitation and blackmail. And we are sick of it. It seems to me that there is a problem with accountability because these conditions continue to persist. In my view, Section 230, which provides immunity from lawsuits, is a very significant part of that problem.”

It’s a point that was echoed by Senator Peter Welch (D-VT). “It’s an astonishing benefit that your industry has that no other industry has. [You] just don’t have to worry about being held accountable in court if [you’re] negligent.”

Section 230 was a hot-button topic at Shou Zi Chew’s 2023 Congressional hearing, too, where the TikTok CEO said that “230 has been very important for freedom of expression on the internet” and argued that “it’s important to preserve that.” Nonetheless, he sought to assuage concerns that TikTok could be transforming into more of a ‘publisher’ than a ‘distributor’ by emphasizing TikTok’s commitment to both free speech and safety. When pressed on the issue, Chew said that Section 230 is “very complex.”

Device makers Apple and Google notably absent

Though the tech executives largely admitted that their platforms must play a role in mitigating harm to teens and children online, they also pointed out that device manufacturers have a hand in ensuring safety.

In response to a line of inquiry from Senator Amy Klobuchar (D-MN), Zuckerberg voiced his belief that it would be more sensible for device makers to help restrict the kinds of apps that children can access than for platforms to restrict the kinds of content they consume.

“I don’t think that parents should have to upload an ID for proof that they’re the parent of a child for every single app that their children use. I think the right place to do this – and a place where it’d be actually very easy for it to work – is within the app stores themselves,” he said. “My understanding is … at least Apple already requires parental consent when a child does a payment within an app, so it should be pretty trivial to pass a law that requires them to make it so that parents have control anytime a child downloads an app and offers consent of that. The research that we’ve done shows that the vast majority of parents want that.”

It’s an argument that some social media analysts agree with. “That was a fairly reasonable and sensible suggestion that actually could make a significant difference or improve the safety of platforms among younger users,” says Navarra. While such a measure wouldn’t eliminate all responsibility for the platforms, he suggests, it would provide “a workable solution, and one that goes beyond just social media apps.”

Plus, he adds: “I’m sure Mark Zuckerberg probably took a lot of pleasure out of being able to put that suggestion to senators and leave that in their minds, given [Meta’s] history of infighting with the Apple bosses.” (In particular, Navarra is nodding to Apple’s 2021 data privacy-focused operating system changes, which cost Meta an estimated $10bn in revenue.)

The notion that device makers have something to answer for when it comes to children’s online safety is underscored by others, too. “While lawmakers opened by highlighting the role of smartphones in the issue, the makers of those devices were notably absent,” says Jasmine Enberg, a principal analyst at Insider Intelligence specializing in social media. “That includes Apple and Google, the parent company of YouTube, which has the widest teen audience in the US. Yaccarino was quick to call out their absence in her opening remarks, while Meta CEO Mark Zuckerberg nodded to it as he put the onus on app stores to play a bigger role in age verification.”

Change on the horizon?

Ultimately, it’s likely that social media platforms, device makers and lawmakers will need to work collaboratively to enact changes from all angles that enhance young people’s safety and privacy online.

However, some experts doubt that Wednesday’s hearing – or others like it – will do much to move the needle.

“While the hearing provided fireworks, I do not see it resulting in any meaningful children’s online safety bills being passed by Congress any time soon,” says Allison Fitzpatrick, a partner in the advertising practice at New York-based law firm Davis+Gilbert.

“This is not the first time we have seen bipartisan grandstanding, outrage and soundbites during an election year,” she says. “Since the last hearing, Congress has made no progress on children’s online privacy and safety laws, such as the Children’s and Teens’ Online Privacy Protection Act (Coppa 2.0) and Kosa. This is allegedly because of infighting in Congress, lobbyist efforts and the power of big tech. Why would that change now when Congress seems more broken than ever?”

Insider Intelligence’s Enberg is more optimistic. “The stakes are higher than ever, and so is the momentum for action,” she says. “Most people believe that the platforms and government share the responsibility for protecting social media users, and there were some signs of progress [today].”

Nonetheless, change won’t happen overnight, she says. “Any progress will be limited, slow to enact and will require broader industry support.”
