
Privacy Safeguards in Apple-Google Platform Could Be Abused, Experts Say

By Jeff Benson

Google and Apple’s COVID-19 platform announced Friday may be privacy-centric — but that doesn’t mean it can’t be abused, experts tell Digital Privacy News.

The tech giants behind the world’s two largest mobile-phone operating systems, Android and iOS, said in a rare joint announcement that they would build a Bluetooth-based platform to trace coronavirus exposure.

The system would enable phones within Bluetooth range to share data and log interactions.

If someone using the system tests positive for COVID-19 and chooses to submit their diagnosis, users they’ve come into contact with will receive a notification, provided those users have also opted in.
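The matching flow described above can be sketched in simplified form. This is not the actual Apple-Google cryptographic specification (which uses HKDF and AES-based derivations); the key names and derivation below are illustrative assumptions, but the privacy-preserving shape is the same: phones broadcast short-lived anonymous identifiers, and matching happens entirely on the listener’s device.

```python
import hashlib
import secrets

def rolling_ids(daily_key: bytes, intervals: int = 144):
    """Derive short-lived broadcast identifiers from a private daily key.

    Simplified stand-in for the real derivation in the Apple-Google
    Exposure Notification spec; here we just hash key + interval number.
    """
    return [hashlib.sha256(daily_key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(intervals)]

# Each phone holds a private daily key and broadcasts rotating IDs.
alice_key = secrets.token_bytes(16)

# Bob's phone logs the identifiers it hears over Bluetooth.
heard_by_bob = set(rolling_ids(alice_key))

# If Alice tests positive and consents, only her daily key is published --
# never her name, location, or contact log.
published_diagnosis_keys = [alice_key]

# Bob's phone re-derives identifiers locally and checks for a match.
exposed = any(rid in heard_by_bob
              for key in published_diagnosis_keys
              for rid in rolling_ids(key))
print(exposed)  # True -> Bob would get an exposure notification
```

Because matching runs locally against published keys, neither company nor any server learns who met whom, which is the property the experts quoted below are evaluating.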

Apple and Google are only creating the bones; developers and public-health agencies would build the applications and work out the logistics.

Cellphone users, Apple and Google say, would have to explicitly opt into the system, and individual users’ test results would not be shared with other people or with the companies.

Still, a lot of things could go wrong.

The system’s security probably isn’t one of them, John Pacific, aka “tux,” a cryptography engineer at NuCypher, told Digital Privacy News.

“A privacy violation in this product would be extremely detrimental to (Apple and Google’s) image – especially now given the concern that everyone has (including folks’ demands for a private variant of contact tracing).”

Elizabeth Renieris, a privacy lawyer and fellow at Harvard’s Berkman Klein Center, agreed. 

“At the app-level, it ticks all of the boxes in terms of good privacy-by-design methods,” she said.

Yet Renieris and tux intimated that such systems could be letting the genie out of the bottle.

According to tux, the design is ethical so long as it’s based on “consensual interaction.” In other words, “the app only works if you agree to use it,” they said. 

But they’re concerned that growing acceptance of surveillance tools — “especially the privacy-by-design ones” — will normalize surveillance.

“Once people swallow this one, it’s easier for governments to give us other, far more dangerous pills,” they said. “What if the next one lacks a privacy-centric design?”

That slippery slope need not exist in the U.S. In fact, some privacy advocates have conceded that, especially when dealing with a pandemic, privacy should be balanced with other rights.

Still, one shouldn’t assume an opt-in system will remain that way in practice, especially with the inescapable Android and iOS systems now working in tandem.

Renieris pointed to China’s contact-tracing app, which leverages similarly ubiquitous Chinese platforms WeChat and Alipay.

“The app is now controlling much of post-quarantine life,” she told Digital Privacy News. 

“If you cannot produce a green code on your phone, you cannot get on public transportation, enter shops or hotels, go to work — or even re-enter your home in some places, among other restrictions.”

Importantly, Apple and Google already have a lot of information about us. The contact-tracing platform, though it ostensibly limits the ability of an app to collect location data, adds more potential for misuse.

Indeed, although Apple’s specifications say the system doesn’t require user location, it could still be collected if users provide consent.

“How they go about implementing that consent, behaviorally, could make a huge difference, in terms of how much location data is ultimately collected in connection with the app,” Renieris said.

Sure, Apple and Google are leading the way. But government involvement and civic engagement must keep technologists in check.

“The core issue,” Renieris noted, “is that we don’t have any clear laws in place to limit the harms or potential risks from something like this.”

Jeff Benson is a writer in Reno, Nev.


  • Apple Press Release (link)
  • Elizabeth Renieris (link)
  • The Atlantic (link)