Digital Transformation Google Data & Privacy

Will tech companies or regulators have the final say in our privacy debate?


By Kendra Barnett, Associate Editor

November 16, 2021 | 14 min read

As part of The Drum’s Data Deep Dive, we ask whether regulators and tech companies could ever work hand-in-hand on data privacy.

Can regulators and tech players work together to meet the needs of both businesses and consumers?

The conversation surrounding consumer data privacy is heating up.

Recent data from KPMG indicates that 86% of US consumers feel growing concern about data privacy, with 78% expressing worry about the amount of information being collected. Some 40% of survey respondents said they don’t trust companies to use their personal data ethically.

With rising consumer concern has come mounting pressure on policymakers around the world. Privacy think tanks and consumer interest groups are demanding new, more comprehensive regulatory frameworks that give consumers greater transparency into and control over how their data is used.

And regulators are responding. This summer, China passed its GDPR-style Personal Information Protection Law; on its heels came Saudi Arabia’s consent-focused Personal Data Protection Law, which will take effect in March. In the meantime, more than 25 US states have introduced privacy bills that are currently in some stage of review. Earlier this summer, Colorado became the third state to sign such a bill into law, following California and Virginia.

While action is accelerating on the regulatory front, proposed bills often stagnate in committee or flounder somewhere in the bureaucratic echelons of policymaking. And, as in countless other sectors, when legislators drag their feet, private industry takes up the mantle of decision-making. In the debate over consumer data privacy, this pattern holds especially true. Google and Apple in particular are leading the charge with new privacy-centric tools that limit advertisers’ and developers’ visibility into consumer data and offer users greater control.

But is this the right approach? Should big tech be laying the ground rules for our privacy landscape – especially considering the far-reaching control these companies already have over businesses and consumer data? Is it possible for regulators and tech companies to work in concert to give consumers greater say over how their information is used – while also supporting the needs of businesses?

All eyes on Google

As lawmakers hash out privacy regulatory frameworks, big tech moves the needle on privacy.

Google is preparing to follow in the footsteps of Apple, Microsoft and Mozilla by eliminating third-party cookies in 2023. The move will offer users greater privacy, but it will significantly complicate advertising operations by limiting advertisers’ ability to collect, exchange and use user data, target ads to specific audiences and effectively measure the impact of their efforts.

In lieu of the third-party cookie, the company has proposed a new framework, its Privacy Sandbox, which includes the hotly debated Federated Learning of Cohorts (FLoC), its answer to the cookie’s demise. FLoC groups users into ‘cohorts’ based on similar interests and behaviors, hampering marketers’ ability to get a granular look at audience data.

“A sandbox in engineering terms is a ‘protected environment.’ So, the fundamental principle of the Privacy Sandbox is that a browser can create a protected space around the personal data you share with the websites you visit,” says David Temkin, Google’s director of product management, ads privacy and trust. “This data is then safeguarded from being accessed in a way that can, over time, identify you. The Privacy Sandbox proposes using the latest privacy techniques, such as differential privacy, k-anonymity and on-device processing, to protect privacy online, while maintaining open access to information for everyone, so that the web can continue to support economic growth.” Temkin says Google sees this approach as a more privacy-safe alternative to fingerprinting or user-level identifiers such as hashed emails, stressing that the company doesn’t think such solutions will “meet the rising consumer expectations for privacy [or] stand up to rapidly-evolving regulatory restrictions.”
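Google has not published the exact production logic here, but the k-anonymity principle Temkin mentions can be sketched in a few lines: a cohort label is only reported when at least k users share it, so no individual (or very small group) is identifiable from the label alone. The threshold, labels and function names below are illustrative assumptions, not FLoC’s actual implementation, which ran as on-device clustering in Chrome.

```python
from collections import Counter

K_ANONYMITY_THRESHOLD = 3  # illustrative; real systems use a far larger k


def assign_cohorts(user_interests, k=K_ANONYMITY_THRESHOLD):
    """Map each user to a coarse interest cohort, suppressing any cohort
    smaller than k so no small group is identifiable from its label."""
    # Derive a candidate cohort label from each user's interests
    # (a simple stand-in for FLoC's actual clustering step).
    candidate = {user: "|".join(sorted(interests))
                 for user, interests in user_interests.items()}
    sizes = Counter(candidate.values())
    # Report a cohort only when at least k users share it; otherwise
    # the user falls into an undifferentiated catch-all bucket.
    return {user: (label if sizes[label] >= k else "unassigned")
            for user, label in candidate.items()}


users = {
    "alice": {"cycling", "news"},
    "bob": {"cycling", "news"},
    "carol": {"cycling", "news"},
    "dave": {"rare-stamps"},  # unique interests, so the label is suppressed
}
print(assign_cohorts(users))
```

The key property is the suppression step: an advertiser querying cohort labels can learn that at least k people share an interest profile, but cannot single out "dave", whose profile is unique.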

While it’s been billed as a more privacy-safe solution for advertisers, many privacy advocates are not happy with Google’s proposal, arguing that it offers little more in the way of privacy protection than cookies or hashed emails.

Other experts say it’s not such a bad idea. Marci Rozen, data security attorney and legal director at DC-based firm ZwillGen, says that, at least on the surface, something like FLoC will look more appealing to regulators than cookie-less but identity-linked solutions, many of which have emerged in recent months. “Regulators are more likely to favorably view solutions that allow targeting of larger groups of people, rather than cookie-less solutions that allow targeting of individuals,” she says. “At the bottom, regulators and lawmakers are concerned about companies’ ability to track users across sites, over time – no matter how that is accomplished.”

Regulatory groups, including France’s independent data privacy regulator, the Commission nationale de l’informatique et des libertés (CNIL), have expressed concern over the application of group-based frameworks such as FLoC. CNIL stresses that evaluating the viability of such solutions – and the risks they may pose to users’ rights – demands analysis of the ways in which these frameworks might still enable targeting, and of how far they actually limit the monitoring of user-level activity across the web. The commission emphasizes the importance of Google’s continued collaboration with lawmakers in order to arrive at strong, long-term solutions – something that Google’s Temkin says the company is doing.

As an example, Temkin points out that the company “welcomed the opportunity to engage with a regulator with the mandate to protect consumers” when the UK’s Competition and Markets Authority launched a formal investigation of the Privacy Sandbox in January. Temkin says that Google’s collaboration demonstrates the company’s commitment to advancing user privacy and shows that it is “open to making these changes to our products in a way that ensures more choice and competition.”

Where Apple and Meta come into the picture

In the meantime, Apple has cracked down on tracking with a slew of new operating system updates. Chief among them is App Tracking Transparency (ATT), which gives users the choice of whether or not to allow apps to track their behavior across other apps and the open web. More recently, with the rollout of iOS 15, Apple debuted a handful of new features, including VPN-like traffic protection via the new iCloud Private Relay, privacy ‘report cards’ that show users how various apps use their data, and improved email privacy.

These changes have not only constrained app development, ad targeting and ad measurement – especially since a majority of Apple users have opted out of tracking via ATT – but have also helped line the pockets of the tech giant. In fact, new data indicates that the company’s privacy crackdown will help it reel in an additional billion dollars annually.

Meta, for its part, is trialing new tools that aim to allow advertisers to serve targeted content in a privacy-safe way. Earlier this month, the company also announced it is shutting down its facial recognition software across its properties.

The question of dominion

As tech companies press forward, opinions regarding the role of such companies in shaping data privacy policies and norms are mixed.

“Many of the actions the platforms are taking are a step in the right direction for privacy, providing people with greater transparency and choice,” says Arielle Garcia, chief privacy officer at UM Worldwide. “However, where big tech is in the driver’s seat, there will of course be a tendency for the platforms to steer toward solutions and standards that support their own commercial interests. Unilateral changes by big tech foster myopic outcomes that create further fragmentation and complexity for publishers, developers and advertisers, all the while creating a more confusing preference experience for people.”

Rozen is in agreement that many of the changes introduced by Google, Apple and Meta have likely been made, at least in part, to support their own long-term viability. “Broadly speaking, tech companies have an interest in showing consumers and regulators that they take privacy seriously,” she says. “Consumer concerns and regulator interests are linked – consumer concern will drive complaints to regulators and lawmakers, which in turn can drive enforcement actions and legislative attention – if not legislation itself.” She says that companies understand that privacy shortcomings or missteps will generate consumer backlash, regulatory inquiries and possibly even public hearings on Capitol Hill. “So, it’s not necessarily the case that tech companies are trying to be trailblazers on privacy issues,” Rozen says. “Rather, they are trying to avoid being behind the eight ball on inevitable privacy legislation – or worse, having their actions be the thing that spurs privacy legislation.”

Garcia, however, admits that there is good that can come from tech players setting privacy norms. “Ultimately, the net impact of these changes will be positive, serving as an opportunity and a catalyst for the industry to re-evaluate and recalibrate data collection and use in a way that is more trustworthy and fair to people.”

She stresses, however, that in order to achieve solutions that work for both internet users and businesses, real collaboration is key. She points to Apple’s approach in shaping ATT as an example of poor collaboration, saying that the framework has given users more say over how their data is used and whether or not their activity is tracked, but has largely left advertisers, developers and adtech players to fend for themselves. “[They are] left to parse Apple’s Developer FAQs in the context of their own operational ability to reconcile user preferences – for example, where an individual has opted out of tracking on their iOS device but provided permission elsewhere to use their data for personalization or measurement.”
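The reconciliation problem Garcia describes – an ATT opt-out on iOS alongside an opt-in collected elsewhere – is a real engineering decision each company must make for itself. One common conservative policy (an assumption here, not Apple’s or any regulator’s prescribed rule) is that the most restrictive signal wins. The enum values and function name below are hypothetical:

```python
from enum import Enum


class Consent(Enum):
    DENIED = 0
    UNKNOWN = 1
    GRANTED = 2


def effective_consent(signals):
    """Resolve conflicting per-channel consent signals conservatively:
    any explicit denial wins, an explicit grant beats silence, and
    no signal at all is treated as no consent."""
    if not signals:
        return Consent.DENIED
    if Consent.DENIED in signals.values():
        return Consent.DENIED  # e.g. an ATT opt-out overrides a web opt-in
    if Consent.GRANTED in signals.values():
        return Consent.GRANTED
    return Consent.DENIED


# A user who opted out of tracking on their iOS device but granted
# permission via a web consent banner:
signals = {"ios_att": Consent.DENIED, "web_banner": Consent.GRANTED}
print(effective_consent(signals))  # Consent.DENIED
```

Whether "most restrictive wins" is the right policy is exactly the kind of question that, in Garcia’s telling, companies are currently left to answer alone by parsing developer FAQs.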

Experts agree that to truly achieve a balance of business and consumer interests, tech players need to work hand-in-hand with all kinds of stakeholders – from consumers and consumer interest groups to regulators, advertisers, developers and publishers.

So, how will regulation shape up?

The wave of changes ushered in by leading tech companies will undoubtedly inform the direction of lawmaking moving forward. And it’s apparent that collaboration is required. But what does that look like in practice? And where is the future of privacy legislation headed?

For one, it’s important to note that the age-old push-and-pull between regulations and free markets has largely fizzled out when it comes to consumer privacy. “Big tech is largely past the point of wanting lawmakers to butt out and have regulation-free control over user privacy,” says Rozen. “Instead, the vast majority want rules – they just want the same rules for all players in the tech industry on a national level, instead of the current patchwork of state laws.”

There is currently a large appetite for federal legislation in the US and in other countries around the world. However, privacy groups largely believe that, at least in the US, comprehensive laws like the California Consumer Privacy Act (CCPA) stand little chance of passing in Congress, and therefore often argue that more aggressive, user-centric privacy policy must be undertaken at the local and state levels first.

It’s apparent that Congress is not yet close to reaching an agreement on privacy, and some experts believe that comprehensive GDPR-like privacy regulation is not possible in the US. However, data privacy has proven able to galvanize policymakers on both sides of the aisle. In May of this year, Senators Amy Klobuchar (D-Minn) and John Kennedy (R-La) introduced the Social Media Privacy Protection and Consumer Rights Act of 2021. The bill, if adopted, would require greater transparency from online platforms, help consumers understand data breaches and strengthen enforcement of various consumer privacy laws.

Despite her insistence that “privacy issues do not fall neatly along partisan lines,” Rozen admits that getting a comprehensive privacy bill through Congress would be a tall order. “In the current polarized environment, it is difficult to get almost anything passed, let alone on an issue like privacy that doesn’t seem as urgent to lawmakers when compared to things like raising the debt ceiling and pandemic relief measures.”

Congress isn’t the only forum through which to catalyze change, however. Under Lina Khan’s Federal Trade Commission (FTC) – and with privacy advocate Alvaro Bedoya’s recent appointment to the FTC by President Biden – privacy enforcement could take a new shape. Garcia notes that, in its current form, the agency could advance children’s privacy protections and other privacy issues that enjoy bipartisan support.

And there is hope yet for the 25-plus state-level bills in various stages of debate across the US.

A central point of debate around these bills is whether such laws can provide consumers with adequate privacy protections while also ensuring that businesses – and especially small businesses – are able both to comply with varying regulations and to access the resources they need to grow.

Rozen says it will be impossible to create regulations that please 100% of consumers and 100% of businesses. And UM Worldwide’s Garcia agrees. “The biggest challenge for small businesses is that of the ‘splinternet’ – the need to manage a patchwork of state laws and international frameworks, alongside big tech-driven changes,” she says. “The dynamic and fragmented landscape creates a tremendous and costly burden for small businesses, which could to some extent be eased by federal comprehensive privacy legislation. In the interim, small businesses are flying while building the plane to an ever-changing destination, instead of being able to focus on long-term strategy and infrastructure to support it.”

Other experts believe the premise of the question itself is flawed. “The notion that ‘especially small businesses’ have demands that must be balanced against ‘consumer privacy’ is a corporate talking point, and, as such, I don’t accept the premise of [the debate],” says Ari Ezra Waldman, professor of law and computer science and faculty director at the Center for Law, Information and Creativity at Northeastern University. “On what basis are these things in tension? Is it on the presumption that massive data collection is essential for profit? There is absolutely no data confirming that’s the case. Industry wants us to think that innovation stems from mass surveillance, but it’s simply not the case. And even if it were, the public still has the right to govern data collection to achieve the kind of innovation that achieves social welfare.”

Despite the fragmented state of data privacy legislation in the US and across the globe, we are moving gradually closer to a more unified future. As state-level bills are built upon the frameworks laid out by the CCPA and the Virginia Consumer Data Protection Act, patterns of standardization are emerging. More proposals, for instance, are being structured to include bans on ‘dark patterns’ – manipulative user interfaces that trick users into sharing personal information.

With the Googles and Apples of the world expressing interest in collaborating with lawmakers and interest groups to create a genuinely safer, more privacy-focused internet, societies are inching toward a more discernible, uniform privacy landscape.

