Google’s proposed method for tracking and targeting consumers without third-party cookies is being met with a growing chorus of dissent. Within the past month, a who’s who of tech players – including DuckDuckGo, GitHub and Mozilla Firefox – has vowed to block Google’s FLoC API. Here’s what it means for marketers who are searching for answers.
DuckDuckGo has long been a staple of the paranoid and the privacy-obsessed. The search engine enables users to bypass the personalized search results filter employed by most major search engines. So the fact that DuckDuckGo added a tool to its Chrome extension designed to block Google’s latest update – which is meant to enable targeted advertising – may not come as a surprise. Brave, another privacy-centric browser, was also quick out of the gate to thwart Google’s changes last month.
What has been surprising, however, is that the trend is growing. Players from across the board are joining DuckDuckGo and the like in rejecting Google’s Federated Learning of Cohorts (FLoC) API. FLoC is Google’s answer to the impending death of the third-party cookie. It proposes a new method of gathering user data for targeted advertising purposes. Microsoft-owned hosting platform GitHub and Mozilla Firefox have also announced they will be blocking the technology.
At a time when marketers are looking for simple answers to the increasingly complex question of how to target online audiences effectively, what does it all mean? What’s all the FLoCing noise about?
Understanding the basics of Google FLoC
Whereas third-party cookies track individual user activity across the web, FLoC proposes a more privacy-centric approach wherein users are grouped together (in ‘cohorts’) based on similar interests and behaviors. Each cohort is assigned an ID, and users’ browsers – unless they are running a FLoC-blocking technology like DuckDuckGo’s Chrome extension – reveal to websites the cohort to which the user belongs. According to Google, cohorts are large enough to create anonymity for users – and therefore increase data privacy – but are focused enough to enable advertisers to engage in effective ad targeting.
“FLoC [basically] takes your whole browser history and compresses it down to a single number,” says Don Marti, vice-president of ecosystem innovation at digital media company CafeMedia. “So you can think of your browser history like a big photo that you might download from your digital camera. And your FLoC cohort is a little thumbnail image. So it still gives some idea of what you’re looking at, but you can’t reconstruct the entire big picture just from the little thumbnail.”
FLoC cohorts are created by evaluating users’ browsing behavior. Based on the domains that a given user visits, the browser computes what’s called a SimHash – a compact fingerprint of the browsing history. Users whose SimHashes are similar, and whose interests are therefore surmised to be similar, are grouped into the same cohort. This cohort information then becomes available to marketers.
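The grouping step can be illustrated with a toy SimHash. This is a simplified sketch, not Chrome’s actual implementation – the real FLoC pipeline also involved a sorting step and server-side anonymity checks, and every domain name below is made up:

```python
import hashlib

BITS = 64  # fingerprint width for this sketch


def simhash(domains):
    """Toy SimHash over a set of visited domains.

    Each domain's hash votes +1/-1 on every bit of a 64-bit
    fingerprint; the sign of the tally decides the final bit.
    Overlapping browsing histories therefore produce fingerprints
    that are close in Hamming distance, which is what lets a
    clustering pass bucket similar users into one cohort.
    """
    tally = [0] * BITS
    for domain in domains:
        digest = hashlib.sha256(domain.encode()).digest()
        h = int.from_bytes(digest[: BITS // 8], "big")
        for bit in range(BITS):
            tally[bit] += 1 if (h >> bit) & 1 else -1
    fingerprint = 0
    for bit, vote in enumerate(tally):
        if vote > 0:
            fingerprint |= 1 << bit
    return fingerprint


def hamming(a, b):
    """Number of bits on which two fingerprints disagree."""
    return bin(a ^ b).count("1")


# Two users whose histories overlap heavily...
user_a = simhash([f"news{i}.example" for i in range(10)])
user_b = simhash([f"news{i}.example" for i in range(9)] + ["cooking.example"])
# ...versus a user with a completely different history.
user_c = simhash([f"shopping{i}.example" for i in range(10)])
```

Because the fingerprint depends only on which domains were visited, not the order, `user_a` and `user_b` typically land much closer to each other than either does to `user_c` – the property a cohorting step exploits.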
“FLoC is designed to improve privacy by keeping users in large, indistinguishable groups instead of tracking individuals across the web as is done today with third-party cookies,” a Google spokesperson tells The Drum.
So... what’s the problem?
Google argues that FLoC will provide greater individual-level privacy – and will also serve as a suitable replacement to current tracking methods for advertisers. In fact, Google claims that FLoC will see 95% of the ad conversions per dollar that third-party cookies do.
But many don’t see it this way. DuckDuckGo, which is one of a handful of search engines that does not collect or share any data on its users, believes that FLoC does little if anything to improve consumers’ data privacy. “It’s relatively trivial to use other identifiers to link to these cohorts that will be created with Google FLoC,” says Peter Dolanjski, director of product at DuckDuckGo.
“It gets us back to the status quo of tracking people using a myriad of techniques, including things like IP address. We think that these proposals to replace some of the underlying tracking technologies with other tracking technologies – even though it might be slightly more private in the way in which they operate – are going to lead back to the same scenario that we’re in today, because there’s so much incentive in the ecosystem for data to be collected.”
Another key concern voiced by industry leaders is the potential discrimination that could come with grouping users into cohorts based on shared characteristics. “It is a fairly hard mathematical problem to tell whether something about your browser history reveals something about you that you wouldn’t want known by a site that you go to,” says Marti.
He says that data regarding users’ browsing behaviors could be used to target them in discriminatory ways. “You might imagine somebody who visits school sites. A school site ... is intended for kids and parents to all be able to see,” Marti says. “That’s probably going to be the absolute safest, least sensitive site in the world. But unfortunately, there are places where we have historic patterns of segregation. Which school sites you visit might reveal information about you that would show you’re a member of a racial or national origin group that should not be used for discrimination. There’s more that can be done with [regard to] giving the user better control of how they represent themselves.”
A Google spokesperson, however, says that “Chrome has built into FLoC robust measures to remove groups that may be more strongly associated with sensitive topics such as race, sexuality or personal hardships, without learning specifically which sensitive topics.”
It’s not just consumers who could be negatively impacted by FLoC. There are pitfalls for advertisers too, Dolanjski says. “Behavioral advertising isn’t a crystal ball. The numbers are not always trustworthy – Facebook is currently facing a class action lawsuit for inflating their reach numbers. Behaviorally-targeted ads tend to creepily follow people around the web, regardless of the content being looked at and if user intent has shifted – advertising resources are wasted, damaging brand reputation along the way. Contrary to popular belief, advertising did exist before it was possible to target people through digital spy tactics. And guess what? It worked!”
The question of value exchange
Still, many experts admit that there are potential upsides. The targeting enabled by FLoC can help marketers serve more relevant ads to audiences. “As a user, you want to get the least time-consuming, least risky ads possible for the maximum amount of incremental ad-supported content,” Marti says.
And many agree that the benefits far outweigh the potential drawbacks. Eric Seufert, analyst at mobile marketing firm Mobile Dev Memo, argues that for consumers, the relative tradeoff between data privacy and value is often worthwhile. “What I’m perceiving in the arguments against FLoC is that, to some people, there exists no tradeoff between privacy and utility for digital products: advertising should not be personalized, period, and ‘privacy’ is a binary state that demands that no data be collected from users, no matter how enthusiastically they provide it in return for relevant advertising or how abstracted it is away from identity,” he says.
Seufert tells The Drum: “The only substantive complaint being leveled against FLoC is that it’s being proposed by Google. The reality is that FLoC provides a novel, privacy-protective mechanism for delivering targeted advertising.”
Dolanjski, however, doesn’t buy the privacy-utility exchange argument. “When people understand what data is actually collected about them, they’re eminently opposed to targeted advertising based on their behavior,” he says. “It doesn’t matter whether [ads are] relevant or not.”
Indeed, the percentage of consumers who express that they are willing to share personal information such as home addresses and the names of their spouses is declining year after year. “If you simply ask the question, ‘Would you like relevant advertising?,’ of course people will say, ‘Yeah, of course,’” Dolanjski says. “If you said, ‘Would you like relevant advertising if it means the following information will be collected?,’ I can guarantee that almost nobody would say yes, because when you look under the covers and see how much data is collected – GPS coordinates, intimate details about your device and all kinds of identifiers – people are shocked when they find out.”
The future of FLoC
The proposed API is currently being tested among groups of users in the US, Australia, Brazil, Canada, India, Indonesia, Japan, Mexico, New Zealand and the Philippines.
The technology will likely undergo further updates before launching in earnest. Even Seufert agrees that FLoC is not without flaws: “Like any solution to any problem that involves tradeoffs,” he says, pointing to the fact that FLoC, in some ways, disincentivizes larger advertisers from opting into cohorting on their own sites “given the disproportionate value they provide relative to smaller advertisers”. He says that this and other issues must be addressed.
“The Privacy Sandbox proposals are developed as part of a collaborative, open-source effort to improve privacy on the web and we welcome feedback as we continue working with the W3C and broader web community,” says a Google spokesperson.
From a big-picture perspective, with so much division over the appropriate means by which to balance the demands of advertisers with consumer data privacy, the ecosystem is becoming ever more fragmented. If they haven’t already, many adtech companies have developed their own privacy-minded solutions to ad targeting in anticipation of the denouement of the cookie. As a result, there’s little agreement on what the path forward should look like.
“We’re going to see a bifurcated world: hyper-targeted identity-based advertising in places where the channels support it, and then big data-driven advertising – via Google, Apple and the others,” says Jules Polonetsky, chief executive at the Future of Privacy Forum, a Washington, DC-based think tank focused on privacy-related issues.
So much for everyone joining the FLoC.