Google FLoC

With Google phasing out third-party cookies, the company is pushing something called FLoC (Federated Learning of Cohorts), a new technology meant to let content creators capture advertising revenue while protecting user privacy.

FLoC is a magnet for controversy in tech circles, but public awareness of it is spotty. Here’s what you need to know:

What is FLoC?

FLoC is a web browser standard. It categorizes users into “cohorts” based on their browsing interests, without using any personally identifiable data.

In the simplest terms, if you look for a Thai cookbook online and around the same time browse a blog dedicated to fishing, you may find yourself in a cohort of people interested in fishing and Thai food, and be served ads relating to those interests on other webpages.

Google’s FLoC assigns an ID number to each cohort, which contains a few thousand people. If a Thai food and fishing cohort has only a few members, it may be merged into a broader cohort interested in Southeast Asian food and outdoor activities to create a better target for ads. When a user from this cohort visits new webpages, their browser provides each site with the FLoC ID that ties them to Southeast Asian food and outdoor activities, which in turn gives everyone in the advertising universe more targeted information about a person without needing any identifying information.
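The merging step described above is essentially a minimum-size guarantee: a cohort ID is only exposed once enough people share it. Here is a minimal sketch of that idea in Python; the threshold, cohort labels, and parent mapping are illustrative assumptions, not Google’s actual algorithm or numbers.

```python
# Toy illustration of FLoC-style cohort merging (not Google's real algorithm).
# An undersized cohort is folded into a broader parent cohort until it is
# large enough to be served to advertisers without singling anyone out.

MIN_COHORT_SIZE = 2000  # assumed minimum-size threshold

# Hypothetical mapping from a narrow cohort to its broader parent.
PARENT = {
    "thai-food+fishing": "southeast-asian-food+outdoors",
}

def served_cohort(cohort: str, sizes: dict) -> str:
    """Return the cohort ID a browser would actually expose for a user."""
    # Keep climbing to the parent cohort while this one is too small.
    while sizes.get(cohort, 0) < MIN_COHORT_SIZE and cohort in PARENT:
        cohort = PARENT[cohort]
    return cohort

sizes = {
    "thai-food+fishing": 40,                   # too small to expose
    "southeast-asian-food+outdoors": 5200,     # large enough
}
print(served_cohort("thai-food+fishing", sizes))
# -> southeast-asian-food+outdoors
```

The point of the design is that advertisers only ever see the (assumed) broader ID, never the narrow interest combination that would make an individual easier to identify.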

Google says it’s a win-win proposition: Advertisers can target ads using granular and specific criteria, and customers see relevant advertising without their personal information being tracked and shared. Content providers get revenue for traffic to their sites. As the song goes, “Nothing but blue skies from now on.” But is it really a more perfect union of user privacy and ad sales? 

The downside of FLoC

Although FLoC solves many of the issues associated with tracking users online, privacy advocates warn that the technology may create a new one: ads so specific to a user’s interests that anyone, family and strangers alike, can shoulder surf them on the user’s screen.

“If you visit a site for medical information, you might trust it with information about your health, but there’s no reason it needs to know what your politics are. Likewise, if you visit a retail website, it shouldn’t need to know whether you’ve recently read up on treatment for depression. FLoC erodes this separation of contexts, and instead presents the same behavioral summary to everyone you interact with,” wrote Bennett Cyphers, a staff technologist with the Electronic Frontier Foundation. 

FLoC’s critics also point to the potential for exposing users to physical danger. Repressive government regimes could specifically target citizens by the cohorts they’re in.

“A dictatorship may be able to work out that dissenters often seem to have one of the same five FLoC IDs. Now anyone who visits a nationally controlled website with that ID could be at risk. A country that outlaws certain religions or sexualities could do the same,” wrote Vivaldi, maker of a pro-privacy web browser, in a company blog post rejecting the technology.

Google has responded to this criticism by indicating that it would exclude “sensitive” categories, including race, religion, and sexual orientation, a policy that has prompted further concern and debate.

“Whether a behavior is ‘sensitive’ varies wildly across people,” Brave, another pro-privacy browser maker, pointed out in a blog post announcing its opposition to the technology. “One’s mom may not find her interest in “women’s clothes” a private part of her identity, but one’s dad might (or might not! but, plainly, Google isn’t the appropriate party to make that choice).”

Others have raised concerns about the way Google has implemented and tested the technology. The company simply enabled FLoC in the Chrome browsers of “a small percentage of users” without informing them or allowing them to opt out.

What does a “small percentage” look like in real life? 

“Take into account that nearly-ancient estimates (circa 2016) put active Google Chrome users around 2 billion [which means] that the FLoC trial could affect up to 100 million people,” wrote David Ruiz of Malwarebytes. “That is an enormous number of people to subject to a data analysis experiment without their prior consent.”

Does the opposition to FLoC matter?

While Apple, Microsoft, Mozilla, Vivaldi, Brave, and others have expressed opposition to the technology, Google’s dominance in online search, combined with its advertising, web content, and web browser market share, is significant. Even if Google were the only Big Tech company supporting FLoC, its implementation there would likely still make FLoC the new standard.

Privacy considerations for FLoC notwithstanding, third-party cookie tracking has been widely abused by advertisers and data mining firms, and much of the web’s current ecosystem depends on that tracking to generate revenue. As third-party cookies are phased out, FLoC-based advertising models will be a fallback for many sites.

The nascent market dominance of FLoC may depend on a lack of public awareness. Companies that rely on privacy-invasive technology need to steer consumers away from the sausage factory if they want to keep selling salami. If we get the word out, FLoC can be calibrated to make sense not only for Google, but for the billions of people who are affected by the decisions that company makes.