Wanted: Your Respiratory Data, But Not Without Privacy Risks

By Asa Hiken

In through the nose, out through the mouth — and into the hands of private companies. 

This could be the new fate of your breath. 

So it appears from an incipient trend in consumer tech, in which companies are creating products that target the collection of respiratory data. 

For instance, at the all-virtual 2021 Consumer Electronics Show in January, the Industrial Technology Research Institute (ITRI) showcased a clothing textile that tracked a wearer’s respiratory rate — along with other physiological conditions.

Similarly, BioIntelliSense debuted a clip-on device that monitors COVID-19 symptoms, notably irregularities in respiration. 

Finally, AirPop introduced what it touted as the world’s first smart “air-wearable,” a face mask that supposedly can track all types of respiration-based processes — from breath count to breath cycles to the volume of air being inhaled. 

Together, these products reflect a growing commercial interest in respiratory data, at least in part encouraged by the medical needs of COVID-19.

But experts told Digital Privacy News that divulging such data heightened a user’s susceptibility to aggregation: the accumulation of separate data points, through sharing practices, into a detailed data set. 

Digital Ecosystem

Experts warned that, with the introduction of respiratory data into the digital ecosystem, these aggregated data sets could more easily face a raft of privacy violations.  

These could range from invasive advertising to re-identification of de-identified data to the production of highly accurate inferences about your most sensitive particulars — whether or not you’ve revealed them. 

“The more they know about you, the more ways they can figure out how to exploit you.”

Alaap Shah, Epstein Becker Green.

With aggregation, companies are “taking some data set and sharing it with a third party — and that third party starts to link that data with other data,” Alaap Shah, a privacy and cybersecurity lawyer with Epstein Becker Green in Washington, told Digital Privacy News. 

“The more they know about you, the more ways they can figure out how to exploit you.”  

‘Regulatory Silos’ 

According to Shah, the risks of aggregating respiratory data stem from the minimal legal restraints placed on private wellness companies. 

While the main line of defense for health information in the U.S. is the federal Health Insurance Portability and Accountability Act (HIPAA), Shah noted that the law extended only to healthcare providers and health plans — not to wearables and apps. 

The result is what Quinn Grundy, a nursing professor at the University of Toronto, called “regulatory silos,” through which private companies can legally share user data without strict liability. 

“HIPAA was never really designed for app companies,” she told Digital Privacy News. “If you’re looking into a particular device that pairs with an app, the first question should be: ‘Is that system regulated as a medical device?’” 

No Response From Firms

Digital Privacy News posed this question to ITRI, BioIntelliSense and AirPop — each of whose products syncs to a mobile app. None responded to requests for comment. 

But, according to Shah, even if the products were regulated as official medical devices, their manufacturers still could exploit loopholes in the law. 

Most notable is HIPAA’s de-identification caveat, which gives entities the option to strip out key identifiers from data, such as name and address. 

This may sound reassuring at first, Shah suggested: Your respiratory data is shared — but, theoretically, it no longer can be traced back to you.

Yet, when companies shed identifiable information, they also effectively shed the protections that HIPAA provides, he said.

“The data resulting from (this) exercise is no longer subject to HIPAA, so you’re free and clear to use that de-identified data outside of the guardrails of HIPAA’s privacy rule.”

The result, Shah concluded, is a license for companies to aggregate data freely and openly. 

“They can share it with third parties without HIPAA saying anything about it,” he told Digital Privacy News. “They can save it in databases without having all the bells and whistles around security.

“There’s a lot of risk.”

‘Algorithmic Sleuthing’  

So, what are the specific dangers of aggregating respiratory data?  

Shah explained that, depending on how many data points a person already has surrendered, aggregators could combine that information with newly available respiratory data to form a detailed online persona. 

“If you’re looking into a particular device that pairs with an app, the first question should be: ‘Is that system regulated as a medical device?’”

Quinn Grundy, University of Toronto.

This “persona” then could be exploited in many ways, from discriminatory insurance practices to online impersonation to full-fledged ransom. 

Aggregators “can triangulate data points that relate to you,” Shah said, “such that all of a sudden, they have this framework for who you are — and then anytime they get a signal that relates to any of those pieces, they lump it into the dataset.” 
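The triangulation Shah describes can be illustrated with a minimal sketch: two datasets that each look harmless on their own are joined on shared quasi-identifiers, such as a ZIP code and birth year, to produce a single, richer profile. All field names and values here are hypothetical, chosen purely for illustration.

```python
# Purely illustrative sketch of linkage-based aggregation.
# All datasets, field names and values are hypothetical.

# A "de-identified" health dataset: no names, but quasi-identifiers remain.
deidentified_health = [
    {"zip": "20001", "birth_year": 1985, "resp_rate": 22},
    {"zip": "60614", "birth_year": 1990, "resp_rate": 16},
]

# A separate marketing dataset with contact details.
marketing_profiles = [
    {"zip": "20001", "birth_year": 1985, "email": "user@example.com"},
]

def link_records(health, marketing):
    """Join records that share the same quasi-identifiers (ZIP + birth year)."""
    personas = []
    for h in health:
        for m in marketing:
            if h["zip"] == m["zip"] and h["birth_year"] == m["birth_year"]:
                # Merging the two records yields the "detailed persona":
                # contact details now tied to respiratory data.
                personas.append({**m, **h})
    return personas

print(link_records(deidentified_health, marketing_profiles))
```

Neither input dataset contains a name, yet the linked output ties an email address to a respiratory reading — which is why stripping obvious identifiers alone does not prevent re-identification.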

Toronto’s Grundy, who researches accountability in the pharmaceutical sector, noted the scale of this activity — saying it could lead to “microtargeting … invasive advertising” and other “societal and community-level harms.” 

Shah called the practice of building these “archetypes” — which also could be used to modify group behavior or be sold to third parties — “algorithmic sleuthing.”

“A really sophisticated person — maybe not even so sophisticated — could start to identify people,” he told Digital Privacy News. 

Grundy summed up the risk this way: “The respiratory data that you share in one face mask is another bread crumb that you’ve left within the digital ecosystem.” 

Asa Hiken is a Washington writer.