Harvard’s Elizabeth Renieris: Privacy Is an Inalienable Right

By Jeff Benson

Last of two parts. (First part here.)

Worldwide, individuals remain at the whim of unfair data practices, says Elizabeth Renieris, a data protection lawyer and fellow at Harvard’s Berkman Klein Center for Internet & Society.

In this concluding part, she discusses how to tip the scales back toward individuals.

Many technologists talk about data ownership as enabling digital privacy. Yet, you’ve written that they’re at odds. Why?

The way I think about privacy and data protection (though there is a difference between the two) is through the lens of fundamental rights that are inalienable, which means they’re non-transferable. So, you can never waive those rights.

The problem with most data-ownership models, and monetization in particular, is that the type of ownership that allows you to monetize something is typically a property-style framework. So, something akin to personal (or movable) property, or sometimes intellectual property.

Those things are monetizable because they’re transferable; a property framework allows you to dispose of that data and do things with it that you could not necessarily do under other legal frameworks.

So, there’s a direct tension between the inalienability and non-transferability of human rights and the alienability and transferability of property.

What alternatives are there to treating privacy and data protection through a rights framework?

I haven’t seen any effective alternatives.

Privacy is embedded in international human rights law, in actual binding instruments [from the 1966 International Covenant on Civil and Political Rights to the E.U. General Data Protection Regulation (GDPR)].

So, it’s not just an idea; legally, privacy is enshrined in human rights law as a fundamental, inalienable right.

I hear a lot of technologists debating the legal status of privacy, but we have law on this; it’s not a blank slate. 

That aside, why is data commodification a bad idea on a practical level?

In addition to the logistical nightmare of drawing lines around data, how is an individual meant to have any way to fairly price their data, to do the complex analysis required, or to have any effective ability to bargain — given the asymmetries of information and power between individuals and the organizations harvesting data? 

Given that companies appear to have more power than individuals, do our current frameworks adequately protect privacy and data? 

If we were willing to really leverage them, we might stand a better chance. But we’ve been very conservative in their application.

If we actually leveraged the other 95% of what’s in the [GDPR], for example, instead of just focusing on consent mechanisms, it would potentially be more effective.

Another example is in the U.S., where the [Federal Trade Commission] is really not leveraging its Section 5 authority over unfair or deceptive practices.

They focus a lot on deception: “You said this in your privacy policy, you did something different, it’s deceptive.” They haven’t really focused on what’s unfair.

And if you think about something like [facial recognition] or these other very offensive data-related practices, we know inherently when they’re unfair.

That flexibility is built into the law, but it’s been underutilized by regulators.

And data ownership wouldn’t improve any of that.

There are so many additional justice-related concerns about inequality, fairness, and discrimination.

Honestly, all it would do is sanction all of the really terrible practices we have now. It wouldn’t actually change anything for the better.

Jeff Benson is a Nevada-based writer.


Links:

Jaron Lanier:
theverge.com/2019/4/9/18302076/data-monetization-control-manipulation-economy-jaron-laniers-virtual-reality-vr-vergecast

Andrew Yang: 
nytimes.com/2019/10/15/opinion/andrew-yang-privacy-internet.html

Blockchain-obsessed idealists:
wired.com/story/i-sold-my-data-for-crypto/

Elizabeth Renieris:
cyber.harvard.edu/people/elizabeth-renieris

GDPR:
gdpr.eu