TikTok CEO’s congressional testimony: 5 top takeaways and implications for advertisers
The chief executive of the popular video-sharing app faced intense finger-pointing and a deluge of questions from lawmakers on issues of data security, children’s safety and content moderation.
TikTok’s CEO Shou Zi Chew made his first congressional appearance Thursday morning / Adobe Stock
TikTok CEO Shou Zi Chew appeared today before the House Committee on Energy and Commerce in his first-ever congressional testimony.
The hearing comes a week after the Biden administration threatened to ban TikTok in the US if the app’s Chinese parent company ByteDance doesn’t sell the platform. Amid widespread media coverage ahead of today’s session, Chew himself took to TikTok earlier this week to post a video urging users to support the app.
In an intense hearing that spanned over five hours, the executive was grilled by lawmakers on both sides of the aisle about children’s safety, user data protection and privacy, Chinese surveillance, content moderation, misinformation and more. The wide-ranging set of issues raised – and Chew’s responses – hold significant implications not only for US consumers but also for US advertisers, who are expected to pour more than $11bn in ad spend into TikTok by 2024, per data from eMarketer.
“The hearing [was] a mix of everything we’ve seen over the past several years [when it comes to lawmakers’ concerns about] TikTok, from genuine national security concerns to misguided concerns to bizarre technology questions to China-bashing to policy concerns about data privacy abuses,” says Justin Sherman, a senior fellow at Duke’s Sanford School of Public Policy and the chief exec of Global Cyber Strategies, a Washington, DC-based research and advisory firm. “It’s a mishmash of congressional attention to big tech and congressional attention to China and TikTok, playing out on national television.”
Here are the top takeaways from today’s hearing.
Chew outlines four commitments that TikTok plans to make
In his opening remarks, Chew emphasized the work the company has undertaken to protect US user data and mitigate concerns about Chinese surveillance.
He also took the time to spell out four “commitments” that TikTok will make. These include: prioritizing safety – especially for children and teens – on the app; firewalling US user data from foreign access; ensuring TikTok is a safe place for free speech that won’t be tampered with by foreign governments; and remaining transparent about its practices with the help of independent, third-party monitors.
“Trust is about actions we take. We have to earn the trust with decisions we make for our company and our products,” Chew said.
Lawmakers home in on national security and connection to the Chinese Communist Party
A primary focus for Committee members today was TikTok’s relationship with parent company ByteDance and the Chinese government – and the potential national security implications therein.
Chew, for his part, sought to reassure lawmakers of TikTok’s protection from Chinese governmental influence. He repeated that TikTok is a privately-held company and that it is headquartered in Los Angeles and Singapore, unlike its sister company Douyin in China, though it shares some common tech infrastructure.
To policy experts, what played out in the House today was largely predictable. “Chew’s statements today are right on the script for those who have been watching TikTok’s public relations efforts over the past several months,” says Sherman.
For Chew, Sherman contends, “the point is not to argue that national security concerns exist but to say, ‘Okay, we understand you have concerns, so let’s talk about how we can address them.’”
The executive also pushed back against categorizations of TikTok employees accessing specific US user data as “surveillance.” In particular, a handful of lawmakers pointed to a Forbes report from December which claimed that ByteDance employees tracked information – including IP addresses and user data – on some US journalists. Chew argued that this activity should not be classified as “spying” or “surveillance” because it was part of an internal investigation of employee operations.
As it stands, concerns about TikTok’s user data security are largely speculative; the US has not publicized any evidence that TikTok or ByteDance are sharing US user data with the Chinese government.
Nonetheless, Committee members today made national security a key focus of their remarks and questions for Chew. And based on today’s hearing, Chew was likely unsuccessful in changing many lawmakers’ minds. “The issue is that for some politicians and policymakers, TikTok cannot address their concerns. Some believe TikTok to be fundamentally vulnerable to Chinese government influence via parent company ByteDance, the Chinese tech giant. Technical changes, for those people, are not going to matter in this view,” says Sherman.
US data privacy and protection spotlighted
Concerns about data privacy, protection and governance were also emphasized during today's hearing.
Lawmakers demanded assurances from Chew that US user data is safe, private and neither shared with the Chinese government nor sold to third parties.
Though Chew was elusive at times (when Rep. Frank Pallone [D-NJ] pressed him on whether TikTok makes revenue through the sale of user data, Chew said the company doesn’t sell user data to “any data broker” but said he’d get back to the Committee on whether TikTok sells user data to other parties), his narrative focused primarily on the commitments TikTok has made through its so-called ‘Project Texas’, the company’s $1.5bn plan to bolster its trustworthiness and public image.
‘Project Texas’ will see all US user data moved to Oracle servers within the US and managed by US-based leadership. Chew said TikTok hopes to have the program stood up by the end of 2023.
When pressed, Chew admitted that TikTok user data may currently be accessible to ByteDance employees in China, but said that once ‘Project Texas’ reaches completion, all US user data will be under US control. The assurance did not appear to assuage lawmakers’ concerns.
A handful of other Committee members urged their colleagues to prioritize the establishment of a comprehensive federal privacy bill. It’s a goal that Congress came closer to achieving last year than it has in decades, with the American Data Privacy and Protection Act (ADPPA). Despite garnering initial bipartisan support, lawmakers became divided on the specifics of the bill – which has slowed its momentum and the likelihood that it will advance in its current form.
Now, questions from both Republicans and Democrats on TikTok’s data privacy practices seem to have invigorated a new passion for the possibility of a sweeping privacy bill. “Congressional leaders seem to agree that a single privacy standard would make a difference,” says Cobun Zweifel-Keegan, the Washington, DC managing director of the International Association of Privacy Professionals. “A uniform national standard for privacy would ensure that TikTok and all social media companies embrace the same minimum privacy best practices for all Americans. Congress knows that until it has a legal standard to point to, it will have to take companies at their word for just how privacy-protective they are.”
In fact, Zweifel-Keegan says, “After the showing of support for privacy at the hearing, it will not be surprising if we see an updated public version of ADPPA in the next month.”
For users, a federal privacy law would give them more control over how their data is collected and used by platforms like TikTok. For advertisers, however – who are the backbone of most social media platforms’ business models (including TikTok’s) – such legislation would create new hurdles to reaching target audiences through data-based targeting.
Safety – especially for young users – is raised as a top priority
Another key theme of today’s hearing was the safety and health of users – particularly young users. It’s a salient issue considering that some 75% of American teens use the app.
Many Committee members pointed to evidence that social media platforms including TikTok negatively impact children’s and teens’ mental health, driving up rates of depression, anxiety and eating disorders, and exposing them to potentially harmful content. They contended that TikTok’s recommendation algorithm is designed to be addictive to young people, potentially exacerbating the negative effects of exposure to such content.
Chew repeatedly affirmed that children’s safety is a top priority for the platform, and said that the app has rules against promoting eating disorders, self-harm and other dangerous behavior. He pointed out that if users search for hashtags that explicitly reference, for example, drug use, suicide or eating disorders, the app will redirect them to resources that can help them to reach out for help. TikTok has also begun filtering out videos with what it deems “complex or mature themes” for users under the age of 18.
Users under the age of 16, Chew said, are by default not allowed to send or receive direct messages, and there is a special version of TikTok designed for US users under 13 that is free of ads and includes only what the company deems appropriate content. Plus, in an effort to make TikTok “a place where teenagers can come to learn,” Chew said, the platform recently debuted a special feed featuring only educational STEM videos, which has already generated more than 116bn views.
Chew also touted the platform’s newly introduced screen-time defaults and updated parental controls. Under-18 users will be met with an automatic 60-minute daily limit on TikTok screen time; however, this limit is easily overridden with a passcode confirmation.
Some lawmakers took the opportunity to point out that Chew himself has admitted in interviews that he doesn’t allow his young children to use TikTok. The executive retorted by saying that his children live in Singapore, where the special version of the app designed for users under 13 is not available; he said that if his children lived in the US, he’d gladly let them use the app.
Content moderation and misinformation come under scrutiny
The adjacent issue of content moderation – especially as it concerns user safety as well as misinformation – was under the microscope in today’s congressional hearing.
In one of the biggest bombshells of today’s hearing, Rep. Kat Cammack (R-FL) showed a TikTok video depicting a gun being fired with a text overlay reading: “Me asf at the House Energy and Commerce Committee on 03/23/2023.” Cammack said the video was posted to TikTok 41 days ago, and though it appeared to include content in clear violation of the platform’s policies, it had not been taken down.
Addressing Chew, Cammack said: “This video … is a direct threat to the chairwoman of this Committee, the people in this room, and yet it still remains on the platform. And you expect us to believe that you are capable of maintaining the data security, privacy and security of 150 million Americans [when] you can’t even protect the people in this room?”
The video was reportedly wiped from TikTok during the hearing today.
Beyond lawmakers’ safety concerns, other issues concerning content moderation were raised by Committee members. Some touched on the threats of health misinformation and others pressed Chew on whether TikTok or ByteDance have censored content about the Chinese persecution of the Uyghurs, a minority Muslim ethnic group living primarily in the north of China. Chew denied that TikTok censors specific kinds of content.
In the latter half of today’s hearing, some lawmakers also sought Chew’s perspective on Section 230 of the Communications Act of 1934, enacted as part of the Communications Decency Act of 1996. Section 230 generally shields digital platforms from liability for information published by third-party users. In essence, it delineates between ‘publishers’ and ‘distributors’ – a line that’s become increasingly blurry in the wake of the 2016 US presidential election, when lawmakers sought to hold social media companies at least partially accountable for Russian election interference.
Chew said that “230 has been very important for freedom of expression on the internet,” and argued, “it’s important to preserve that.” Still, he sought to quell concerns that TikTok could be transforming into more of a ‘publisher’ than a ‘distributor’ by emphasizing TikTok’s commitment to both free speech and safety. When pressed, Chew said that Section 230 is “very complex.”
Some weren’t convinced by Chew’s approach today, and say that his failure to provide more specific commitments to data privacy and content moderation could repel both lawmakers and advertisers. “By trying to avoid reveals he is being incredibly obtuse and not giving the clarity on data, moderation and safety that TikTok desperately needs to convince the US on,” says Michael Goldstein, head of communications strategy at Omnicom-owned global ad agency DDB. “For advertisers, these vague answers won’t help them feel safe committing dollars long-term either.”
Others, however, believe the platform, at least for the time being, will maintain its appeal among advertisers. “We continue to advise brands to invest in the platform – and they are,” says Permele Doyle, president and co-founder of influencer agency Billion Dollar Boy. “Brand investment continues to grow and user numbers continue to grow despite the threat of a ban. Today’s hearing will do little to turn that tide.”
For many advertisers, she says, the returns are too good to turn down. “Who can blame the brands? CPMs [cost per 1,000 impressions] on the platform are incredibly competitive – almost half that of Instagram Reels, a third less than Twitter and 62% less than Snapchat.”
Ultimately, today’s events shone a spotlight on the web of challenging safety, privacy and governance issues facing TikTok. “The wide spectrum of questions at the TikTok hearing showcased just how complex policymaking has become around digital platforms,” said Zweifel-Keegan.
Like others, Zweifel-Keegan is hopeful that it could inspire greater cross-aisle collaboration on these issues in the US. “Even though we heard testimony on the entire grab-bag of tech policy issues, committee leaders were united in focusing attention on the scarce bipartisan proposals that could make a direct impact on digital business practices. The appetite for any bipartisan solution explains the renewed focus throughout the hearing on privacy and security issues, especially from high-ranking members of the committee.”