Q&A: STOP’s Liz O’Sullivan

Surveillance Disproportionately Affects Vulnerable Communities 

By Jeff Benson

Last year, with help from the nonprofit Urban Justice Center in New York, privacy advocates formed the Surveillance Technology Oversight Project (STOP). Their goal was to litigate against oversurveillance and push legislation that protects the rights of marginalized communities, who often are most affected by that surveillance.

STOP Technology Director Liz O’Sullivan told Digital Privacy News that government surveillance was a hallmark of the New York City Police Department (NYPD), which used it to target Muslim Americans after the 9/11 attacks and could now use it to quiet Black Lives Matter protests.

Is mass surveillance inherently discriminatory?

The Black Lives Matter movement seems to be pulling to the forefront the fact that surveillance historically has been deployed disproportionately against minority communities, poorer communities and other vulnerable segments of society.

This is not new. It goes back to the very beginning of New York, when there were laws in the (18th century) that required African Americans to carry lanterns at night to mark their presence so police could see them and keep tabs on them from afar. 

So, we see this constantly. Cameras appear on blocks that are becoming gentrified, blocks that were not historically surveilled until things started to change in the neighborhood.

To me, it’s also that these technologies don’t work that well on darker skin, so deploying them carries an inherent, heightened risk of false arrest, false accusation and wrongful incarceration.

STOP has sued the NYPD over its practice of making arrestees remove religious head coverings. How does that relate to surveillance?

We defend the rights of the most marginalized communities.

In New York, a big part of that has to do with Muslim surveillance in the wake of 9/11. The fact that in order to add somebody to a database, (you) ask them to violate their religious beliefs — that’s unacceptable.

It should be revolting in this country.

Is surveillance an extension of existing policies that target minority populations?

Surveillance to me is just … one symptom of this power imbalance, the disparity we see between the people who are building the technology, the people who are designing it and selling it and then using it — and the people it’s being used on.

It’s not a technology problem. It’s a society problem.

The technology, however, has been moving at such a quick rate that it’s outpaced laws in a really dire way.

That makes it that much easier to surveil not just from one cop car on one block but across an entire city at scale, perhaps through aerial surveillance, social-media monitoring or subpoenas of records from people’s own smart homes.

You mentioned Black Lives Matter. Is law enforcement using surveillance to target people who committed crimes during protests or to undermine the protests themselves?

It’s very hard to say definitively one way or the other. There are some examples … that hearken to what could be going on. 

During the protests, we saw U.S. Customs and Border Protection fly a Predator drone, armed with cameras rather than guns, outside its 100-mile border jurisdiction.

This was a wide-angle aerial view that allowed them to record the movements of all the people underneath and, in some cases, to see faces.

We also know there are tools being used by police departments around the country, like Clearview AI, which allows you to take a photo of a person (whose identity you don’t know) and connect it to any web content that might exist about them, whether it’s a social media post, a blog they’ve written or an article they were quoted in.

If there are photos of you online, then they are in Clearview’s database — and that means you are now at risk of being falsely arrested or falsely accused of a crime.

So, the technology to find protesters is prevalent. It’s widespread.

The question of whether they’re using tools like drones, Clearview and facial recognition to track protesters is a significant one.

The trouble is they don’t have to disclose when they’ve used them.

So, when a court case comes along with flimsy connections between how a crime happened and who they think did it, nine times out of 10 you might suspect facial recognition was involved, but we don’t know, and we can’t know.

Jeff Benson is a Nevada writer.
