Press "Enter" to skip to content

Q&A: Bennett Cyphers, Electronic Frontier Foundation

Evaluating FLoC, Google’s third-party cookie alternative

By Rachel Looker

Cookies are dying out, in the words of Bennett Cyphers, a staff technologist for the Electronic Frontier Foundation (EFF). 

To replace third-party cookies, Google has put forward Federated Learning of Cohorts, or FLoC, billing it as a way for advertisers to target users with content and ads by clustering groups of people who share similar interests.

The idea behind FLoC is to collect information about a user’s browsing habits and use that data to assign them to a “cohort” with other users who have similar browsing data. In web browsers that have FLoC enabled, each user will have a FLoC ID to identify their group.
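
Under the hood, Chrome’s pilot computed cohorts with SimHash, a locality-sensitive hash, over the domains in a user’s recent browsing history, so that people with similar histories land in the same cohort. The toy TypeScript sketch below illustrates that general idea only; the hash function, the 16-bit width, and all names are illustrative assumptions, not Chrome’s actual implementation.

```ts
// Toy sketch of SimHash-style cohort assignment, loosely modeled on the
// approach described for Chrome's FLoC pilot. Illustrative only.

// A simple 32-bit FNV-1a string hash (stand-in for Chrome's real hash).
function fnv1a(s: string): number {
  let h = 0x811c9dc5;
  for (let i = 0; i < s.length; i++) {
    h ^= s.charCodeAt(i);
    h = Math.imul(h, 0x01000193);
  }
  return h >>> 0;
}

// Collapse a browsing history into a single 16-bit SimHash. Similar
// domain sets produce similar hashes, so the hash acts as a cohort ID.
function cohortId(domains: string[], bits = 16): number {
  const votes = new Array(bits).fill(0);
  for (const domain of domains) {
    const h = fnv1a(domain);
    for (let b = 0; b < bits; b++) {
      votes[b] += ((h >>> b) & 1) ? 1 : -1; // each domain votes +1/-1 per bit
    }
  }
  let id = 0;
  for (let b = 0; b < bits; b++) {
    if (votes[b] > 0) id |= 1 << b; // majority vote decides each bit
  }
  return id;
}

console.log(cohortId(["news.example", "shoes.example", "travel.example"]));
```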

Google started testing FLoC in a pilot phase in Chrome on March 30, turning on FLoC in millions of Chrome browsers, with most users unaware they had been opted into the new technology.

But Cyphers is having none of it. 

He has described Google’s FLoC as an “absolutely terrible idea” that increases the risks that come with behavioral targeting. To help users determine if their Chrome browser has been used as part of Google’s FLoC pilot, the EFF launched the “Am I FLoCed” website. 
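
During the trial, a page could ask Chrome for a visitor’s cohort through the document.interestCohort() JavaScript API, which is what a checker in the spirit of “Am I FLoCed” can build on. Here is a minimal TypeScript sketch, assuming the promise-based shape described in the origin-trial explainer (the API is not in standard DOM typings, hence the cast):

```ts
// Minimal cohort check along the lines of what a page could do during
// Chrome's FLoC origin trial. The returned { id, version } shape follows
// the trial explainer; treat it as an assumption, not a stable API.
async function checkFloc(): Promise<void> {
  const doc = document as Document & {
    interestCohort?: () => Promise<{ id: string; version: string }>;
  };
  if (typeof doc.interestCohort !== "function") {
    console.log("This browser does not expose a FLoC cohort.");
    return;
  }
  try {
    const cohort = await doc.interestCohort();
    console.log(`This browser is in FLoC cohort ${cohort.id} (${cohort.version}).`);
  } catch {
    // The promise rejects when FLoC is disabled or blocked for the page.
    console.log("FLoC is present but no cohort was returned.");
  }
}

checkFloc();
```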

Cyphers told Digital Privacy News that while FLoC reduces the privacy risks of third-party cookies, it brings on its own host of privacy concerns. 

This interview has been edited for length and clarity.

Why do you think we should be concerned about Google’s FLoC proposal?

I think in a nutshell, FLoC is Google acknowledging that people have serious concerns with the tracking advertising ecosystem as it exists on the web and on the internet today—as they should. 

But instead of doing the thing that I think most users want, which is just to get rid of that paradigm altogether, Google is trying to reinvent it in a way that sort of skirts around a lot of the regulations that have been put in place recently to try and govern that kind of behavior.

[Google] is also trying to future-proof it and sort of lock in behavioral advertising as the way that business is going to work on the web for the next decade or two.

Does categorizing people into tens of thousands of “buckets” really protect people’s privacy? 

In some ways, yes; the top-line answer is no. I do want to give credit and be honest about the argument that Google is making, taking them at their word, because what Google is proposing is to replace the model we have right now.

So, the way tracking on the web works now is that your browser stores unique identifiers for you on behalf of advertisers. Then, every time you make a request to an advertiser, like a data broker or another tracker, that party gets a unique identifier for you.

Then, they can use that to tie your activity on one page or on one app to all of the other activities that you’ve taken on the web or on your phone. 

Basically, there are dozens of different advertisers and data brokers who can collect big chunks of your browsing history and then use that to profile you and track you.
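
As a concrete illustration of the mechanism Cyphers describes, here is a minimal, purely hypothetical TypeScript (Node.js) sketch of a tracking-pixel endpoint: it mints a unique ID cookie on first contact and logs the embedding page on every later request, tying one identity together across sites. Every name here is invented for illustration.

```ts
// Hypothetical tracking-pixel server: assigns each browser a unique ID
// cookie and logs which page embedded the pixel. Illustration only.
import { createServer } from "http";
import { randomUUID } from "crypto";

const server = createServer((req, res) => {
  // Reuse the browser's existing ID cookie, or mint a new one.
  const match = /(?:^|;\s*)uid=([^;]+)/.exec(req.headers.cookie ?? "");
  const uid = match?.[1] ?? randomUUID();

  // The Referer header reveals which page embedded the pixel, so one ID
  // links a user's activity across every site that includes this tracker.
  console.log(`user ${uid} visited ${req.headers.referer ?? "unknown page"}`);

  res.writeHead(200, {
    "Set-Cookie": `uid=${uid}; Max-Age=31536000; SameSite=None; Secure`,
    "Content-Type": "image/gif",
  });
  res.end(); // a real tracker would return a 1x1 transparent GIF here
});

server.listen(8080);
```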

So targeted advertising is the ultimate goal?

Google is trying to move the tracking and profiling part of that equation into the browser so that, in their vision of the future, advertisers and data brokers won’t be able to see your exact activity. 

Instead, your browser is going to collect all of your activity and then process that down into one little label that says, this person is a member of this group and that group might have these specific attributes that are valuable for advertisers. 

But advertisers and data brokers won’t get to know every website you visited or when you visited them. They’ll just get to know, ‘All right, Google thinks that this person is this kind of a person.’

How significant is the introduction of FLoC?

If this happened in a vacuum, and if we totally got rid of unique identifiers and fingerprinting, and lots of other privacy-harmful technologies beyond cookies, I think this would be a small step forward. 

It would mean that some of the most harmful ways that tracking affects people would become a thing of the past. 

Right now, there are data brokers who will collect your entire browsing history and sell it to the highest bidder, and that can be governments, police departments, other data brokers, creditors, that kind of thing. 

That’s a really big problem and that kind of thing wouldn’t be possible in FLoC, if Google’s vision completely comes to fruition and FLoC is the only technology that can be used to track people.

So, is this a good thing? 

Even in Google’s kind of perfect world, there are still a lot of issues with categorizing people in this way and then sharing that categorization with advertisers.

Two of the big things that we like to harp on with targeted ads, discrimination and predatory advertising, are really big issues today, specifically with behavioral targeting.

Behavioral advertising allows advertisers to reach only the audiences they want, based on very sophisticated understandings of how individual people act, and that means they can discriminate against people.

What are some common examples of this practice?

They can offer good opportunities, like loans or jobs or housing, only to the kinds of people they want to reach, and exclude lots of other people.

They can also do predatory advertising, like offering bad loans or scams to the people who they think are going to be most vulnerable to that kind of messaging. 

Same goes for political ads. They can hyper-target political ads based on what they know about how people have acted in the past and that can be a really powerful tool as we’ve seen in the past couple of elections. 

All of those kinds of behaviors are still going to be possible under FLoC even though advertisers aren’t going to be able to know exactly what sites you visited. 

They’re still going to have a really good idea of how you behaved and probably what kind of person you are, and they’re going to be able to target ads based on that. I think that is still a fundamentally harmful technology.

Google says the clustering algorithm used for the FLoC cohort model is designed to evaluate whether a cohort may be correlated with sensitive categories like race, sexuality or medical history. Are they analyzing everyone in order to determine this?

For now, yes, that is what they’re doing. Well, sort of. It’s both. 

First of all, the system that they have in place right now is supposed to be only temporary and they’ve said, and I kind of believe them, that there is going to be a privacy-preserving way to do this kind of auditing in the future. 

But for now, yeah. Chrome has this feature called Sync, and if you turn on Chrome Sync, by default Google will have access to all of your browsing history from Chrome, regardless of whether you use an ad blocker or whatever.

They have millions of people who have ‘opted’ into Chrome Sync, and they can use this as sort of a baseline to figure out the particular FLoCs associated with particularly sensitive categories, as they define them.

Could you dig a little deeper into this for us? 

First, they don’t have actual information about whether someone is a particular ethnicity or a particular religion or makes a certain amount of money. They might have access to that but they’re not using it for this purpose.

What they’re doing is going through and categorizing every site on the web as either sensitive or not. If it’s sensitive, they’ll give it a label for a particular sensitive category. 

If you went to the website for the suicide hotline, they might give that website a label for ‘sensitive mental health.’ If you went to a website for addiction counseling, it might be like ‘sensitive addiction,’ that kind of thing. 

So, they’re labeling every site on the web. Then, they’re running a statistical analysis using the raw history data that they have from Chrome Sync tied to people’s FLoC IDs. 
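
To make that analysis concrete, here is a hedged TypeScript sketch of one way such an audit could work: flag any cohort whose members visit sensitive-labeled sites far more often than the population baseline. Google has described a t-closeness-based test for this; the simpler over-representation ratio, the data shapes, and the threshold below are my own illustrative assumptions.

```ts
// Hedged sketch of the auditing step: given synced histories tied to FLoC
// IDs and a set of sensitive-labeled domains, flag cohorts in which visits
// to sensitive sites are heavily over-represented. Not Google's algorithm.

interface UserRecord {
  flocId: number;           // the user's cohort ID
  visitedDomains: string[]; // raw history available via Chrome Sync
}

function flagSensitiveCohorts(
  users: UserRecord[],
  sensitiveDomains: Set<string>, // domains labeled with a sensitive category
  maxRatio = 2.0                 // arbitrary threshold vs. the baseline
): Set<number> {
  const hits = new Map<number, number>();  // cohort -> users who hit a sensitive site
  const sizes = new Map<number, number>(); // cohort -> total users observed
  let totalHits = 0;

  for (const user of users) {
    sizes.set(user.flocId, (sizes.get(user.flocId) ?? 0) + 1);
    if (user.visitedDomains.some((d) => sensitiveDomains.has(d))) {
      hits.set(user.flocId, (hits.get(user.flocId) ?? 0) + 1);
      totalHits++;
    }
  }

  const baseline = totalHits / users.length; // population-wide visit rate
  const flagged = new Set<number>();
  for (const [cohort, size] of sizes) {
    const rate = (hits.get(cohort) ?? 0) / size;
    if (baseline > 0 && rate / baseline > maxRatio) flagged.add(cohort);
  }
  return flagged; // cohorts that would be blocked or re-clustered
}
```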

This is Google’s “reinvention” of the tracking advertising ecosystem?

They’re trying to walk a fine line here. On the one hand, Google is trying to make sure that they don’t seem too creepy in the way that they try to stop creepy uses of their products. 

And I think if they wanted to, they could use more precise data about people’s actual, real life characteristics instead of using this sort of sloppy proxy based on whether you visited one particular site or not. 

But, on the other hand, they feel like they have to do something about this and so they end up with something that does sound a little bit creepy anyway.

Is this process just for the pilot phase of FLoC?

That’s the pilot. 

Moving forward, I think there’s going to be something built into Chrome that will anonymously share the particular sites that you visited and then they can do anonymous correlations between FLoC IDs and particular sites and particular labels on those particular sites. 

But I think the bigger problem isn’t that it’s creepy. I think the bigger problem is that it’s avoiding the bigger issue, which is that demographics and personality traits correlate with web browsing in really unintuitive ways.

If you are trying to find a FLoC—say you’re an evil advertiser and you want to target older retirees who lean Republican with a particular donation scam—Google won’t tell you that there’s a particular FLoC that has that demographic in it.

But there’s not going to be any one sensitive category that is a complete proxy for those people. It’s not like, ‘Oh, every one of the people that I’m looking for visited this one site.’ That’s not usually how it works.

It’s more like if you have tons and tons of data, as a lot of these data brokers and advertisers do, you can pull out this sort of second level of correlations and say, ‘I don’t know anything about the sites that these people actually visited, but I do know that people in this FLoC tend to be from these demographics and so I can just use these FLoCs as proxies.’

If Google is saying that these different cohorts will be very accurate, but at the same time saying they’ll be more private, does it matter if it’s “anonymous” if you end up getting the same amount of information about a user?

No. It doesn’t and Google has been talking out of both sides of [its] mouth with this technology since its inception. 

On the one hand, they’re trying to convince advertisers that it’s going to be really useful to them. So they’re saying, ‘Oh, it’s 95 percent as effective as cookies. You’re going to be able to target whoever you want. You’re going to be able to reach people in really precise ways.’ 

On the other side, when they talk to privacy advocates, they’re saying, ‘Oh, it’s super anonymous. These groups are going to be really big. There’s no way we’re going to let it correlate with any kind of sensitive characteristic that you could imagine.’ 

Those two ideas are just in direct tension. Either it’s a useful proxy for the way people will behave and the things that they like and the kinds of people that they are, or it’s not.

If it’s the former, it’s going to be good for advertisers and bad for privacy. If it’s the latter, it’s going to be good for privacy and bad for advertisers and no advertisers will use it. 

When it comes down to it, Google’s customers are not privacy advocates. They are advertisers. I think that if there are going to be trade-offs, I am afraid they are all going to go in the direction of more information and less privacy.

Some browsers have said they don’t support FLoC. What’s the state of things right now? 

I think there are going to be some browsers whose customers are their users, and that’s all they care about, so they probably won’t adopt FLoC, like Firefox and Brave, and hopefully Safari, but we’ll see. I don’t know if they’ve said anything officially about that.

But I think a lot of the other browsers have just said ‘We don’t have any plans to adopt it yet’ and they’re just waiting to see whether it takes off. 

Chrome has unfortunately been kind of the only one that matters because they have two-thirds of the market share, at least globally. If Chrome adopts it, advertisers are going to have a huge swath of users that they can target using FLoC. I think it’s going to be hard for some of the [others] to avoid jumping on board. 

What you could very easily see is websites that put up pseudo paywalls that say, ‘Oh, your browser doesn’t use FLoC. We can’t serve ads, so we can’t show you any content. Please download Google Chrome in order to access our content.’ 

That’s what I’m really afraid of.

Does it say anything about Google’s transparency that users have been enrolled in this test run of FLoC without being aware of it?

I think that was an absolutely terrible idea. I can’t believe Google decided to go through with it.

It’s one thing to build this technology in the first place, which I don’t think should exist at all. But I think reasonable minds can disagree about that. 

The whole bargain with FLoC is supposed to be that we’re going to take away cookies and take away all these other kinds of tracking and replace them all with FLoC. We’re getting rid of the really bad stuff and they’re replacing it with this thing, which is less bad. 

But in this trial phase, they’re like, ‘Oh, no, we’re leaving all the really bad stuff and we’re going to add this other new bad thing and we’re not going to tell you about it.’ 

There’s no way to opt in, and for the first few weeks, there was no way to opt out. I think that was just a really bad decision. It felt like they were trying to rush something through or get this into Chrome before people noticed and before there was a chance for backlash.

I don’t know why they had to do that, but I think it was a really bad idea. I think it reflects very poorly on Chrome’s relationship with their users and how much respect they have for their users.

What are your hopes for the future of FLoC?

I hope it doesn’t get used.
