Press "Enter" to skip to content

Q&A: Lorrie Cranor of Carnegie Mellon University

‘Most People Value Their Privacy a Lot’

By C.J. Thompson 

Lorrie Cranor is a longtime champion of privacy and security issues.

A professor of computer science, engineering and public policy at Carnegie Mellon University, she also serves as director of the university’s CyLab Usable Privacy and Security Laboratory and is co-director of the MSIT privacy engineering master’s program.

For more than two decades, Cranor’s research has illuminated usable — consumer-friendly — privacy and security technologies and methodologies.

Cranor, whose doctorate is from Washington University in St. Louis, served as chief technologist for the Federal Trade Commission in 2016.

She and CyLab researchers recently designed the blue “opt-out” icon now used on many websites to show consumers how to decline the sale of their data. It resulted from amendments to the California Consumer Privacy Act (CCPA) in March.

Cranor also co-wrote a paper on the surveillance of incarcerated people’s communications — and her current research focuses on usable privacy choices, secure passwords, privacy decision-making and cybertrust indicators.

She told Digital Privacy News that most people care about privacy but are not sure how or if they can protect it.

What is the difference between online privacy and online security?

Security and privacy are overlapping concepts.

Security is more about the mechanisms for protecting information. Privacy is more about what the rules are.

“We want to get companies’ privacy commitments on the record so that we can hold them to these commitments.”

A company can have great security and do a good job of keeping all the attackers out, but have poor privacy because it sells its customers’ personal information.

Or, a company can promise to keep all its customers’ information private — but, due to poor security, it has a data breach and the information gets exposed.

In 2008, you drew attention to lengthy privacy policies that consumers largely don’t read. Have things evolved on that front?

The paper I wrote with Aleecia McDonald (a Carnegie Mellon assistant professor) on “The Cost of Reading Privacy Policies” is one of my most-cited papers by academics and in the popular press.

It validates what everyone already knew: Privacy policies are long — and it isn’t reasonable to expect people to read all of them.

Even though everyone knows this, we still use privacy policies because they can be useful, as long as we don’t expect consumers to read them.

We want to get companies’ privacy commitments on the record so that we can hold them to these commitments — and also so that we have the ability to call out particularly egregious practices. 

Have any major changes been made?

I don’t think much has actually changed about privacy policies since 2008, other than the fact that they mostly have become longer due to regulatory requirements.

We need to have a better way to actually inform people — and some of the egregious practices we find in policies probably just shouldn’t be legal. 

What are “dark patterns” and what is the broad goal of companies that employ them on their websites or in software?

Dark patterns are user interfaces that nudge users to make choices that they otherwise probably wouldn’t make.

They may nudge users to accept cookies or grant permission to uses of their data — or they may even nudge users to make purchases they don’t actually want.

Sometimes it’s more than a “nudge.” It is outright deception: Users may think they don’t actually have a choice, or they may think they are making one choice when they are actually choosing something else.

How would you describe the level of public awareness on this issue? 

Among privacy researchers, there is a lot of awareness of dark patterns — but my sense is that the public is much less aware.

We are starting to see prohibitions against dark patterns in state legislation, so it may be only a matter of time before we see them at the federal level.

“Dark patterns are user interfaces that nudge users to make choices that they otherwise probably wouldn’t make.”

If states start implementing conflicting provisions that make compliance complicated, I expect it will increase interest in federal legislation.

You helped design the “opt-out” icon that lets California residents decline the sale of their data. What issues does it alleviate?

Our team from Carnegie Mellon and the University of Michigan designed the privacy-options icon to show consumers where to click to find all of a website’s privacy choices in one place.

Our research has shown that opt-out links and other privacy choices are not in any consistent place on websites — and it’s hard for people to find them.

So, if companies adopt this icon, hopefully it will raise awareness that privacy choices exist and help people find them. 

What was the process like?

We did a very fast three-month sprint to design the icon during the California attorney general’s public-comment period in the winter of 2019-20.

We came up with lots of rough icon concepts and then had design students turn them into actual icon designs.

We went through multiple rounds of testing on Amazon Mechanical Turk to refine these designs and select those that were most promising.

What challenges did you encounter?

We learned that the concepts of privacy choices and not selling personal information are hard to represent visually without words. So having words next to the icon is going to be important, at least until people learn to recognize the icon.

Some of the concepts we tested led to a lot of misconceptions. 

What concepts did the winning design communicate?

The icon we selected clearly conveyed the concept of settings or choices.

After we submitted our report to the California AG’s office, they proposed a different icon — and we did another experiment to demonstrate that their icon led to a lot of misconceptions.

“Our research has shown that opt-out links and other privacy choices are not in any consistent place on websites — and it’s hard for people to find them.”

It took a while, but eventually they came around to adopting the icon we proposed about a year after our initial report.

What are your impressions of Apple’s privacy labels? 

We’ve been doing research on app privacy labels for the past decade, so it’s great to see that Apple has implemented them.

There are certainly some concerns about them — and I would love to see some research on how well people actually understand them and can use them.

But, hopefully, Apple will refine them over time.

Are there concerns about veracity or apathy from developers, or about enforcement from Apple or the government?

Apple is going to need to do some spot-checking and automated enforcement to make sure the labels are accurate, and perhaps provide better tools that make it easier for developers to get the labels right.

It would also be great to see search tools that allow you to find apps that are similar to ones you are looking at but might have better privacy.

As people start paying more attention to these labels, it will help shed light on which apps have better or worse privacy — and our research suggests that it will influence the apps people select.

What would it take to implement a “nutrition label” for devices in a broad way?

We’ve been working on nutrition labels for IoT devices that would cover privacy and security. 

We have a solid, tested proposal for a label, but now we need companies to be interested in actually using it.

As an academic researcher, I can design it and put it out there — but there’s only so much I can do until industry buys into it or regulators require it.

Regarding IoT devices, why do you think privacy and security information is difficult to locate? Is that intentional or an oversight?

IoT manufacturers have not prioritized security and privacy.

It tends to be on the back burner until they do something that gets them into trouble.

Maybe some of them hide it on purpose, but my guess is most of them just don’t make it a priority.

How divergent are consumer priorities versus manufacturer priorities? 

Consumers and manufacturers both want good security, but they may have more divergent views on privacy.

There’s also a question of tradeoffs.

Building in good security that is also easy to use might be more expensive.

While consumers may want it, they may or may not be willing to pay a lot more for it. 

What are the most notable things you’ve learned about consumer habits and motivations regarding privacy?

Most people value their privacy a lot.

Very few people will tell you that they don’t actually care about privacy.

“Consumers and manufacturers both want good security, but they may have more divergent views on privacy.”

But, increasingly, people will say that they don’t think there’s anything they can do to protect their privacy — or even that the things that might protect their privacy are pointless, because they won’t actually work.

So, while some talk about a privacy paradox — where people say they want privacy but don’t act on it — our research suggests that it isn’t really a paradox: People often have reasons for not acting that are not inconsistent with their desire for privacy.

C.J. Thompson is a New York writer.
