
FTC plans to implement wholesale ban on Meta’s use and monetization of children’s data

By Kendra Barnett, Associate Editor

May 3, 2023

The regulatory agency is cracking down on Meta after the tech titan allegedly violated previous orders concerning children’s data privacy on its platforms.

Meta could soon be prohibited from monetizing kids' user data / Bruce Mars

The US Federal Trade Commission (FTC) said Wednesday afternoon that it has proposed a new “blanket prohibition” on Meta’s collection and use of young users’ personal data. Regulators said that the action is being taken in response to claims that Meta violated previous FTC privacy orders as well as the US Children’s Online Privacy Protection Act (COPPA).

Now, the FTC aims to expand an already enormous $5bn consent order with Meta that went into effect in 2020, citing privacy failings on Meta’s part. In an announcement published on the FTC’s website, the agency said that Meta failed to “fully comply” with the order, which spelled out requirements for the company concerning how it represents its privacy practices.

The FTC accused the tech giant of having “misled parents about their ability to control with whom their children communicated through its Messenger Kids app” and said that the company was not transparent about the degree of access to user data it offers to certain app developers.

“Facebook has repeatedly violated its privacy promises,” said Samuel Levine, director of the agency’s Bureau of Consumer Protection, in a statement. “The company’s recklessness has put young users at risk, and Facebook needs to answer for its failures.”

The FTC’s proposed changes would bar Meta from profiting from the data it collects on users under the age of 18 – whether that information is gathered via its social platforms, such as Instagram and Facebook, or through the company’s virtual reality products, like Horizon Worlds and its Oculus Quest VR headsets and experiences.

It’s a development that could have far-reaching implications for Meta’s business, which relies heavily on targeted advertising. Without access to young users’ personal information, it will be much more difficult for Meta to serve ads to these users based on their online behavior. Advertisers peddling products like virtual accessories for video game avatars will no longer be able to target kids with the precision they once could on Meta’s platforms.

If the proposed change is adopted, Meta would also face additional privacy-focused limitations. For example, the company would be required to get users’ explicit consent for any future uses of its facial recognition software. The order would also prohibit Meta from rolling out or updating products or services without explicit sign-off from an independent third-party assessor confirming that its privacy practices fully comply with the FTC’s new order.

The announcement marks the third time that the FTC has taken action against Meta for alleged violations of users’ data privacy. The first was a 2011 complaint, followed by a 2012 FTC order prohibiting the tech company from misrepresenting its data privacy practices. However, the FTC found that Meta violated this order within months of its announcement – and to no small degree. According to regulators, Meta made “misrepresentations that helped fuel the Cambridge Analytica scandal.”

The FTC then issued a second order, which went into effect in 2020 and sought to resolve the company’s alleged violations of the first. Regulators forced Meta to pay a $5bn fine, agree to an expanded privacy program and submit to more frequent and far-reaching assessments of its privacy practices.

Once again, regulators are now claiming that Meta fell short on its promises, leading to today’s proposed ban on Meta’s ability to use children’s data. An independent assessor tasked with evaluating whether Meta met the requirements of the 2020 order found “several gaps and weaknesses in Facebook’s privacy program,” the FTC said today. Among them: Meta continued to grant app developers access to user data after having agreed to cut off that access in cases where users had not used the app in question in the previous 90 days.

The FTC also said it found that children were in some cases able to communicate, via group chats and group video calls on Meta platforms, with contacts who had not been approved by their parents – despite the company’s promise that children would be able to communicate only with parent-approved contacts. If true, the allegation may put Meta in violation of COPPA.

Privacy advocates are largely in support of the FTC’s move to crack down on Meta’s children’s data privacy practices. “For years, Meta has flouted the law and exploited millions of children and teens in their efforts to maximize profits, with little care as to the harms faced by young users on their platforms,” said Josh Golin, executive director of Fairplay, a nonprofit organization focused on children’s digital safety. “The FTC has rightly recognized Meta simply cannot be trusted with young people’s sensitive data and proposed a remedy in line with Meta’s long history of abuse of children. We applaud the Commission for its efforts to hold Meta accountable and for taking a huge step toward creating the safe online ecosystem every young American deserves.” (It’s worth noting that Golin and his team last month penned an open letter in conjunction with the Center for Countering Digital Hate, urging Meta to pause plans to welcome children to its Horizon Worlds game.)


The sentiment is echoed by other leaders in the space. “Today’s action by the FTC is a long-overdue intervention into what has become a huge national crisis … [Meta] has not done enough to address the problems caused by its unaccountable data-driven commercial platforms,” said Jeff Chester, executive director at the Center for Digital Democracy.

“Amid a continuing rise in shocking incidents of suicide, self-harm and online abuse, as well as exposés from industry whistleblowers, Meta is unleashing even more powerful data gathering and targeting tactics fueled by immersive content, virtual reality and artificial intelligence, while pushing youth further into the metaverse with no meaningful safeguards,” Chester continued. “Parents and children urgently need the government to institute protections for the digital generation before it is too late. Today’s action by the FTC limiting how Meta can use the data it gathers will bring critical protections to both children and teens.”

Of course, for advertisers, the proposed changes may threaten their ability to connect with young audiences based on individual users’ behavior. In particular, it would hamper ad targeting and performance measurement when it comes to campaigns designed for young consumers.

It’s a challenge that developers and advertisers are increasingly facing as children’s data privacy and online safety attracts an ever-brighter spotlight. In March, TikTok CEO Shou Zi Chew testified before Congress, where lawmakers pelted him with questions about the app’s data security, children’s safety and content moderation practices. President Biden has himself called for an end to targeted advertising to children.

