
Don't blame TikTok for social media's ills

By Amy Luca, Executive vice-president, head of social

April 3, 2023 | 8 min read

The criticism being lobbed at TikTok by regulators and consumers points to larger issues with data privacy, user controls and transparency that the entire social media ecosystem should work to address, writes Media.Monks' Amy Luca.


It feels like déjà vu: calls within the US government to ban TikTok are back once again.

The chief concern is espionage: whether TikTok can feed the psychographic data of millions of Americans (or citizens of any other country where TikTok operates) to the Chinese government, or use its algorithm to manipulate users.

But state interest in social media data isn't unique to China, nor does it require a platform's active participation or endorsement to pose a threat. It was recently revealed that Clearview AI, a facial recognition database used by American police departments, scraped 30bn photos from Facebook to power its technology.

It's just another example, as TikTokers have already pointed out, that the questions lobbed at TikTok aren't unique to the platform at all. And while it's true that social media in all its forms poses a risk of exacerbating society's ills, I'm optimistic that it has the power to better society – and that should be the conversation we're having. So, what can we do?

I've had the privilege of working with major and niche social platforms alongside a team of over 1,200 across the globe. In our experience, we've recognized three key themes that are consistent no matter the market: the need for stronger data protection; algorithmic control; and the realization of platforms' responsibilities to society. Each of these themes serves as a blueprint for how platforms can be more socially positive.

We need stronger data security

We've reached a tipping point where consumers are increasingly concerned about what data platforms collect from them and how it's used. Put simply, it's a black box. While social networks and mobile operating systems have taken steps to improve transparency with opt-in buttons, consumers often don't understand the terms that they're accepting.

It’s worth noting that we have never had much control over our data: businesses have historically sold personally identifiable info to the highest bidder. But social media has amplified these concerns, because the content it hosts is so much more targeted and influential in shaping how we experience the world. Every social platform must be more transparent not only about data collected, but also about how that data influences users’ experiences on the app.

Users need more control over their feeds

Much like data collection, people don’t have a solid understanding of how algorithms work or why they’re seeing a certain piece of content in a feed.

Maintaining algorithmic secrecy is important to platforms because it discourages bad actors from gaming the system and exploiting users. But platforms can preserve the proprietary nature of their feeds while being transparent with users about how they’re matched with content.

Meta does a great job of offering transparency through its Ad Library, which provides information on who has funded an ad, an estimate of how much they've spent, and its demographic reach. In-feed, users can learn why an ad has been served to them and can pause promoted posts from specific brands or accounts – a win for brands too, because the risk of targeting uninterested users plummets and relevancy remains high.

Platforms can go even further to provide users control of their feeds. This means both opting into interests and opting out of content that's not of interest – something that users try to do already based on their own rudimentary understanding of how algorithms work.

TikTok’s recent decision to allow users to reset the algorithm of their ‘For You’ page is a step in the right direction, as are its robust parental controls to limit screen time, mute notifications and more. Ultimately, it boils down to offering a greater level of transparency and tools for users to better curate their feeds and choose the content they engage with.


Platforms need to be more responsible about the content they serve

Social media is often characterized as the modern 'town square,' a sentiment underscoring the role it plays in free expression. This assessment is a double-edged sword, and many have already spoken about the propensity for harmful content to be served to children or to reinforce negative biases. But in the same way that algorithms can generate fear and spread misinformation by incentivizing the spread of harmful content, platforms can swing the other way to prioritize content that promotes positivity, self-esteem and education.

TikTok has made moves here by recently announcing a feed dedicated to STEM content. Content on this feed passes additional vetting from Common Sense Media and Poynter to ensure it's appropriate and reliable. It's a great move toward curating a more positive, child-friendly space, considering that 40% of young people use the app as an alternative to Google Search, per data from Google itself.

Let’s deliver on the promise of social

As an app, TikTok is unique: because the experience is built around a global community instead of users’ close circles of friends, it has enabled profound connections between people around the world. Realizing this, gen Z and gen alpha have flocked to this space to share their experiences, learn and connect to change the world for the better.

But TikTok is not unique in some of the less savory aspects of social media, and I’m optimistic that the industry will do better. Let’s not be distracted by politics or single out one platform as the scapegoat. Let’s have the tougher conversation about how social media as a whole can better serve society.

Amy Luca is executive vice-president, head of social at Media.Monks.

