Q&A: Ethan Zuckerman at UMass

What Infrastructures Bring a ‘Healthy Online Life?’

By Mukund Rathi

Ethan Zuckerman is a visiting research scholar at the Knight First Amendment Institute at Columbia University and a new faculty member at the University of Massachusetts at Amherst. 

He has started the Institute for Digital Public Infrastructure at Amherst, which advocates for treating internet platforms as public spaces and public goods, “much as public television and radio have complemented commercial broadcasting,” according to the institute’s website.

Formerly the director of the Center for Civic Media at the Massachusetts Institute of Technology, Zuckerman created the pop-up internet ad in 1997. In 2014, he apologized for unintentionally creating one of the internet's most despised forms of advertising.

But he told Digital Privacy News that this new approach to platforms would help protect privacy and help users control their data online.

What is a “digital public infrastructure”?

Broadly speaking, infrastructure is the set of tools and systems that we build not necessarily as an end in itself, but so we can build other businesses and services on top of it.

We don’t build power grids because it’s really exciting to have power grids. We build power grids so we can light up our houses and businesses.

There’s also a category of things that we might call “social infrastructure.”

These are things we build because they help keep us together in spite of our differences — such as the town hall, and, controversially at the moment, policing and public-safety systems.

Online, the question is: What are the infrastructures that lead to healthy online life?

We have a lot of things that were built for for-profit purposes, but then we use them as places to live our public lives.

It’s like holding a rally in a shopping mall rather than going there for fun.

There’s great effort around internet issues like community broadband. (There are) online areas where we need to consciously build public-focused alternatives. 

Those include social media, search engines and monetization.

I’ve also heard good arguments that we need public alternatives for video conferencing or for educational tools. 

What is “surveillant advertising”? How is it related to a digital public infrastructure?

I’m borrowing from Shoshana Zuboff’s book, “The Age of Surveillance Capitalism.”

Our notions of privacy have been upended over the last 10 years by the massive business of collecting information on people, packaging it together and reselling it.

This includes not only personal data, like who you are and where you live and what you buy, but also behavioral data, like what websites you look at.

You can find yourself getting cut out of job opportunities by virtue of incorrect data or algorithms deciding against you.

In addition, Facebook execs have openly said they set up their system to promote highly emotional, highly engaging content that users will react to and share.

If you want people to keep clicking on things, if you want people to react to them — if you want them to say, “Oh that’s terrible,” and get angry and share it — your algorithms will learn to promote certain content that is not necessarily informative or good for us as citizens. 

But isn’t data for services a fair exchange under capitalism? Alternatively, wouldn’t users just pay for services?

We aren’t being given an alternative.

There are calls to boycott Facebook — but it’s really hard to say that to someone in Myanmar, for instance, where the internet came very late to the country and pretty much everyone moved onto Facebook.

And for all of us, the local community conversations about the school board and power outages and lost dogs are happening on Facebook. 

I do think you should be able to pay to join a network and to opt out of targeted advertising and surveillance of your behavior.

As a legislator, I would require Facebook to offer a surveillance-free product, capped at two times its average revenue per user per year.

But just getting rid of surveillance for those who can afford it doesn’t actually change how the network is built.

I’m asking people to imagine a really different network: What if we built a network to deepen ties across racial or cultural divides?

You wouldn’t want the most emotional and outrageous content to go to the top of people’s feeds.

You consider Firefox and Wikipedia examples of digital public infrastructures? Why?

Wikipedia is about free human knowledge: Free in the sense that everyone can contribute and that you don’t have to pay for it.

When you start with that logic of freedom in that fashion — no one should pay for this, and we should all be involved in governing it — you get a very different community than something like Facebook.

Firefox also manages to succeed as a nonprofit in a for-profit space.

Unlike Safari or Chrome, Firefox aims to empower the user with privacy and control. Mozilla, its maker, sells default search-engine placement to generate revenue.

What is “interoperability”? How can it promote user control of data?

Interoperability measures the extent to which a platform’s infrastructure can work with others.

Google and Facebook have given us a particularly bad version of interoperability through their single sign-on systems.

You can log onto Tinder through Facebook, for example.

They run single sign-on so they can track your behavior across this network.

Alternatively, you can use an aggregator on your phone, which has your credentials and gives basic information to any service that you want.

For example, the aggregator could come up with a unique username for you and then export as much of that information as you permit.

You mentioned that policing is a controversial form of infrastructure — and have written about “sousveillance” of police. What is that?

In our modern world, many of the public spaces we move through are under surveillance: watching from above.

Sousveillance is watching from below.

Consider the Black Panthers in Oakland in the mid-to-late ’60s, who armed themselves and drove around, following Oakland Police cars.

When the Oakland Police pulled someone over, the Panthers pulled up nearby with guns drawn to observe the police making their arrests.

Implicit in this was that if you’re harassing Black or brown people, we’re going to have a problem.

The Panthers’ approach was to go out to situations where abuse might be occurring and put an eye on it.

It’s an example of counter-power.

I was, initially, pretty enthusiastic about the pervasiveness of cameras.

The police officer who killed Walter Scott (in North Charleston, S.C., in April 2015) was charged with murder, thanks to bystander Feidin Santana’s video recording.

But a massive study of police body cameras found that wearing them had no significant effect on violent incidents.

That’s because in many police departments, police know that they’re largely free of consequences.

If surveillance works for Google and Facebook, why not for people recording the police?

It’s about the concentration of power and concentration of data.

Facebook is watching you over some large chunk of the web, and there’s an even larger chunk of the web for Google.

They can turn that data into money and spend that to get more power. 

But individuals can’t do this.

Users can criticize Twitter, but they don’t have the power to force (CEO) Jack Dorsey to answer their complaints.

The mere accumulation of data is not helpful. It has to be combined with power and the ability to extract revenue.

To make an impact, we have to commit over a sustained period to building genuine alternatives.

Mukund Rathi is a Washington writer.