Q&A: EFF’s Cindy Cohn

‘We’re Going to Die by a Death of a Thousand Cuts for Privacy’

By Nora Macaluso

First of three parts.

Cindy Cohn, executive director of the Electronic Frontier Foundation, has advocated for privacy on issues ranging from privately developed surveillance technology to government spying to human rights.

In 2005, she also led the foundation in a national class-action lawsuit against Sony BMG, arguing that the company had included flawed and overreaching copy-protection software in millions of music CDs sold to the public.

The entertainment behemoth ultimately settled with the Federal Trade Commission.

Cohn, 57, has helmed EFF since 2015, after serving as legal director, as well as its general counsel, for 15 years. She is a graduate of the University of Michigan Law School, the University of Iowa and the London School of Economics.

Before joining the EFF in 2000, she was a civil litigator in private practice handling technology-related cases. The nonprofit was founded in 1990.

In the first of a three-part interview, Cohn told Digital Privacy News that 2020 was “good because it was so bad” for privacy, explained EFF’s neutral stance on California’s Proposition 24 amendment to the state’s consumer privacy law and discussed the organization’s other efforts.

This interview has been edited for length and clarity.

What was the biggest privacy story of 2020?

It was good because it was so bad.

I think the normalization of mass-surveillance techniques by police and by private companies is the biggest privacy concern; the two are neck-and-neck.

It’s hard to say which is worse.

We know that during the Black Lives Matter protests, lots and lots of governments were doing mass surveillance of people exercising their First Amendment rights of assembly.

At the same time, the surveillance models of companies like Facebook and Google became more widely understood by more people, and, thankfully, we're beginning to see some efforts to try and combat them.

How?

If you look at the antitrust cases filed against Facebook recently, and also at Google's, what you see is how, as these companies gained control of more and more of the market, they've offered us a worse and worse deal on privacy.

A reduction in privacy is one of the harms that’s mentioned in these cases.

That is, to me, really noteworthy and important — and not at all something that was obvious.

It’s obvious in our lived experience, but the case law for antitrust led many people to think that wasn’t possible. 

What were EFF’s biggest privacy victories this past year?

Our work on the “Atlas of Surveillance” project is something I’m really proud of.

EFF has built a system tracking all the surveillance technologies that are being acquired and used by law enforcement all across the country.

You would think it would be a matter of public record, but it’s not.

It takes serious work by EFF — along with a lot of students at the University of Nevada (Reno) Journalism School, who help us with this project, as well as volunteers across the country — to try to give Americans a picture of just how much surveillance is being deployed against them by law enforcement.

It’s a big project, and it’s getting bigger every day.

It’s beautifully displayed — and it really is a huge step toward helping people see what’s going on in their backyards around surveillance. 

Anything else?

We have seen facial-recognition bans in localities across the country, where local people — and we’ve been able to assist them — have stood up and said: “Look, we don’t want the police using facial-recognition technology on us, because it’s just too dangerous for freedom of expression, freedom of association, freedom of assembly.”

On the government side, those are big victories that I’m happy about.

On the private side?

We saw Apple take a big step and basically offer you a choice, when you're using your iPhone, about whether you want the apps you use to engage in third-party tracking.

So, if you're downloading a weather app, and it's going to be selling your data or offering your data to other people, the app has to ask you whether you want it to, and you can say "no."

That’s a huge step forward, giving people both the knowledge and ability to control where their data goes.

It’s a big step we want other companies to follow.

What else is EFF working on?

We’ve continued to develop our tool, Privacy Badger, which also lets you control third-party tracking through a plug-in to your browser.

More people use some version of a tracker blocker than ever before — and that’s a powerful voice.

Browser makers are starting to take notice and build more control into the browser itself. We want to see that continue.

Mozilla with Firefox is probably the best in class, but we’re seeing Apple with Safari doing a lot of work.

Of course, Google’s business model goes in the other direction, so we’re not seeing those kinds of changes with Alphabet products — although they’re doing more too.

Chrome is doing a little bit more, but not much.

Can you discuss Proposition 24, which was approved by California voters in November?

The EFF was neutral on the amendment to the California privacy law.

We're hearing from states across the country that are starting to experiment and come up with baseline privacy laws.

So, we may see more action in the states than we see federally.

Proposition 24 is going to create a privacy agency.

We’d rather see you, the user, empowered to protect your own privacy rather than outsourcing it to an agency.

But if the agency’s smart and aggressive, it can do good things.

It was really a mixed bag. 

How so?

It has loopholes about when companies can use your data, and they were too broad for us.

It doesn't have a private right of action, so you can't sue to defend yourself.

Instead, it creates this agency, which I think is second best.

Government agencies can be kind of toothless, whereas if you sue to protect your data, it’s always there for you. 

What data-privacy implications came out of the 2020 elections? For example, some people were concerned that their party-affiliation records were public. Is that a problem?

It is public. For a lot of people, the realization that it's public is the eye-opening moment.

If people want to change that, that’s a reasonable thing.

So, yeah, we could close that down.

I like to think holistically about privacy, rather than topic by topic.

Meaning, we have this weird stovepipe of rules right now: Your video-rental records are really highly protected, but what you read online isn’t.

To me, party affiliation ought to be one of the pieces of information where you need to give consent and you need to have control.

If we think about it from the perspective of one particular piece of information and then the next particular piece of information and then the next, we’re going to die by a death of a thousand cuts for privacy.

We need holistic, baseline privacy rules that require people to get your permission if they’re going to gather your data, permission if they’re going to use it for a secondary purpose — and that give you a private right of action to get relief if they don’t.

Wednesday: 2021 outlook and a national privacy law.

Nora Macaluso is a Philadelphia writer.
