
Q&A: Latanya Sweeney, Harvard University

‘Privacy Has Often Been Polarized’

By Gaspard Le Dem 

First of three parts. 

In the world of digital privacy, few have made as profound an impact as Latanya Sweeney.

A Harvard University professor and the director of the university’s Data Privacy Lab, Sweeney was a doctoral student when she co-published a groundbreaking paper on “k-anonymity” in 1998.

The study sent shock waves through the computer science and medical communities, leading to an overhaul of federal HIPAA standards. 

But Sweeney didn’t stop there. After becoming the first African American woman to earn a doctorate in computer science from the Massachusetts Institute of Technology in 2001, she founded the Data Privacy Lab at Carnegie Mellon, leading dozens of experiments that reshaped privacy policy.

In 2013, Sweeney went on to serve as chief technologist at the Federal Trade Commission (FTC) under President Barack Obama.

Throughout her career, Sweeney has published pioneering research, including more than 100 academic papers on topics ranging from facial de-identification to voter fraud.

Most recently, she sought to restore public trust in America’s electoral system by launching “VoteFlare” ahead of the January 2021 runoff elections in Georgia.

With VoteFlare, she said, “We needed something to help rid people of their anxiety around voter registration to increase their trust in election results.”

The website alerts people to changes in their voter registration status to combat voter purges.

In the first of a three-part interview, Sweeney told Digital Privacy News that people shouldn’t need to give up their privacy to enjoy the benefits of technology.

This interview has been edited for length and clarity. 

How did VoteFlare come about?

After leaving the FTC, I could see a tidal wave of changes coming to our society through technology design.

By 2016, I began to question how technology could disrupt elections.

My students and I were the first to point out persona bots, bots that gave fake information but had human followers. 

Later, we pointed out that voter registration websites were vulnerable. There’s anxiety on both sides around voter registration.

So, our question going into 2020 was: What could we do to offer some clarity, transparency and some calmness? 

VoteFlare is like credit monitoring, but for your voter registration: You sign up, and the system automatically checks your voter registration (and, in the case of Georgia, your mail-in ballot status) to show you whether your ballot was received.

The site sends you an alert if something goes wrong or needs your attention.  

Why did you launch VoteFlare in Georgia? 

Because of the coronavirus, we weren’t able to implement VoteFlare in the 48 states.

But we were able to set up VoteFlare in time for the Georgia runoff elections — and people used it.

It’s actually quite exciting, to tell you the truth.

We chose Georgia because of the state’s history of voter purges.

In September, 200,000 voters were wrongly removed from voter rolls. That has led to a lot of distrust in Georgia.

We needed something to help rid people of their anxiety around voter registration to increase their trust in election results.

What was the feedback?

We got about 100 people in the first hour or so. We were pretty excited about it.

I don’t have an updated total, but I can say that the system performed more than 10,000 registration look-ups flawlessly.

“We really want the benefits of these new technologies, but we want them with privacy guarantees.”

The biggest user feedback seemed to be the release from anxiety that the service provided.

Voters seemed to exhale and trust that they would be alerted if something went wrong or needed their attention.

Now that you’ve tested VoteFlare, what would you do differently?

We announced too late to help most people.

We had intended to launch VoteFlare for the entire 2020 election cycle, but the pandemic set us back.

We were late getting started but delighted to stand up the service in time to help. 

We are adding more people to the team to support more upcoming elections.

States provide websites for people to go and look up their information.

But it is so much more convenient and relieves stress to have a service that just lets you know if you need to fix something with your registration. 

On another issue: You worked at the Federal Trade Commission. What was it like?

The Federal Trade Commission was the de facto police department of the internet, because so many internet companies are American.

The FTC looks out for false advertising, monopolies and practices that are unfair to consumers. It was a fantastic experience.

Working at the FTC is how I realized that privacy was just the first wave — that technology was really the new policymaker, and that the people who design technology are dictating the rules we live by. 

The chief technologist position at the FTC has been vacant since 2018. Should the Biden administration fill the role?

The Biden administration should absolutely fill the role. But I would also like to see a bureau of technologists.

Why?

The FTC is primarily a lawyer shop.

You want a bureau of technologists, because a lawyer can only understand technology as a black box — but the knobs inside that box are incredibly important for nuancing policy.

Meaning?

When you’re looking at a black box, you get a take-it-or-leave-it, yes-or-no answer.

That’s not optimal for society and technology companies, or for privacy.

It’s much better to be able to get in that box and tweak it.

“The FTC should be a leader, in terms of consumer protection.”

Ultimately, we really want the benefits of these new technologies, but we want them with privacy guarantees. And there’s no reason we can’t do that.

So, what’s holding us up?

What stops us from being able to do that is economic interests.

But it’s also that the people who could make these decisions, the regulators and the advocacy groups, don’t have the right knowledge or tools to have more nuanced discussions.

As a result, we get a draconian “yes” or “no,” when nuance would be much better.

The FTC should be a leader, in terms of consumer protection — because a consistent theme in our discussion has been this idea that privacy was just the first wave.

As the issues have expanded beyond just privacy at the FTC, there’s a real need for not only a chief technologist, but also a bureau of technologists.

Should people of color be more involved in privacy discussions at the FTC? 

Definitely — and not just at the FTC. 

There’s a life cycle to technology.

In the beginning, the person who has the most control is the technology designer, because they’re making design decisions.

At that point, a change in design is so subtle and easy, it’s almost inconsequential. 

But once the technology is set, somebody’s got to figure out how to make money from it.

That combination of the technology and the business practice becomes so solidified that by the time the product gets to regulators, like the FTC, they can’t figure out which is which.

By the time the product hits the marketplace, consumers are left with a take-it-or-leave-it proposition. 

How can we introduce change before it’s too late then?

As you move through these decision points in the product’s life cycle, it becomes harder and harder to make a small, subtle change.

But the change would have been easiest to make in the design stage.  

If you have people who are conscious of how design can affect issues of race or gender, then you’re going to catch design problems really early — because the people themselves are witness to it.

If you catch a flaw at the point where you’re putting the business package together, it might be too late.

That being said, you still want a diverse group of people in that group, too.  

Diversity and inclusion are important throughout the entire life cycle of technology, because what happens at each point in that cycle is very different. 

Will the California Privacy Rights Act (CPRA) have a measurable impact on consumer privacy?

The effort is a good one. It certainly brings a lot of attention to the issue.

Whether it will have staying power is going to depend on the Biden administration.

How?

This is one of those spaces where other states and the federal government have a lot to say about what will be the long-term outcome.

It would be nice to see tech companies move in a more proactive manner, as opposed to taking a reactive posture. 

They’re in the best spot to make a subtle design change, which could have huge privacy benefits.

“Diversity and inclusion are important throughout the entire life cycle of technology, because what happens at each point in that cycle is very different.”

But we need them to have that mindset and that disposition.

That would go much further toward making these regulations unnecessary.

That’s where we really want to get to. 

Can the California Privacy Rights Act (CPRA), which voters approved in November, inspire other states to reinforce privacy laws? 

I am hopeful.

The climate around privacy, revealed clearly by contact tracing, is in a very different place than it has been in decades.

Also, it’s one thing when you view the tech companies as young guys trying to show us what technology can do, versus multibillion-dollar companies that are controlling everything.

The way we think about them, and the trust relationships we have — as a society — have also shifted. 

Then, we have a new administration coming in. I think the tea leaves are arranging themselves for action in the privacy space that we haven’t had for a long time.

That’s encouraging, isn’t it? 

It’s encouraging and discouraging. 

It’s encouraging because it’s going to get attention.

It’s discouraging, because if it becomes a take-it-or-leave-it discussion, that’s probably not going to help us. 

We want society to enjoy the benefits, but with privacy guarantees. 

Could discussions around privacy become polarized?

Yes, of course. But privacy has often been polarized, too: “Privacy’s dead, get over it,” versus the idea of personal autonomy.

If the choices are really stark and polarized, we’re not likely to get a good outcome — because you don’t really want either side to prevail, right?

So, what I look for is the sweet spot.

What’s that?

The sweet spot requires subtlety: the ability to get under the cover of the technology, and good intentions with respect to what we want to achieve. 

I have a plot I often show in class where privacy is on one axis and utility is on the other axis.

As a society, we tend to trade one for the other.

“I think the tea leaves are arranging themselves for action in the privacy space that we haven’t had for a long time.”

But what we really want is something in that graph’s upper corner, something that has maximum utility and maximum privacy.

When we polarize them in political discourse, we rarely have that conversation. 

So, we’re swinging on a privacy-versus-utility axis? 

Yeah. And neither is optimal.

The privacy advocate might say, “Who cares about the utility?” while the person who cares about technology might say, “So much for privacy.”

That’s a losing situation, either way. 

What I care about is that sweet spot, where I can get the technology and the privacy. 

Wednesday: On k-anonymity and why federal laws aren’t protecting health data. 

Gaspard Le Dem is a Washington writer.
