Health Guidelines Seek to Protect Data Not Covered by HIPAA Law

By Myrle Croasdale

The “Wild West” of unprotected personal health data may be nearing an end.

Two organizations, the eHealth Initiative (eHI) and the Center for Democracy and Technology (CDT), have proposed voluntary privacy standards to protect consumer-generated health information not covered by the Health Insurance Portability and Accountability Act (HIPAA).

“With the rise of wearable devices, wellness apps and other online services, huge amounts of information reflecting users’ health are being created and held by entities that are not bound by HIPAA regulations,” Alexandra Reeve Givens, CDT’s president and CEO, told Digital Privacy News.

“We hope this framework serves as a first step to providing greater privacy rights and protections for consumers.”

Along with setting privacy standards, the draft framework recommends a self-regulatory enforcement model to hold participating organizations accountable.

Privacy experts, however, questioned the effectiveness of such an approach. 

Public Comment Period

The proposed framework from the Washington-based organizations is open for public comment through Friday.

It was financed by a grant from the Robert Wood Johnson Foundation. A committee of health care providers, technology companies, universities and advocacy organizations also contributed input.

The eHI and CDT called the framework an interim measure that did not replace the need for federal protections for consumer-generated health data.

Central elements of the draft include transferring the privacy risk to companies collecting the information and defining exceptions for using de-identified data.

Privacy experts agreed that the companies collecting the data should own the privacy risk, but they questioned the organizations’ assumption that private data could successfully be de-identified.

“It’s quite aspirational to say that data will be de-identified,” Kayte Spector-Bagdady, a bioethicist and lawyer at the University of Michigan Medical School in Ann Arbor, told Digital Privacy News. “But once data goes out the door, data are extremely hard to control.”

HIPAA Provisions

Under HIPAA, only certain “entities” — doctors, health insurers, hospitals and their business associates — are required to protect patient data privacy, according to the Department of Health and Human Services.

No national privacy standards protect health data that is collected, held or shared by businesses not covered by federal law.

“Momentum is building for new federal privacy legislation, but currently no bills have made significant progress toward being enacted into law,” said Jennifer Covich Bordenick, eHI’s CEO. “As we wait for a comprehensive law, we can and should do more to better protect consumer privacy in the interim.” 

What Companies Would Do

The draft framework proposes that companies agreeing to follow the standards will:

  • Clearly identify the types of health information to be collected.
  • Clearly specify why health information is being collected.
  • State why health information is disclosed.
  • State whether any health information will be disclosed and name all businesses and services that will receive, license or purchase consumer health information.
  • Notify consumers when policies and practices on how their health information will be collected, disclosed or used have changed.
  • Provide consumers with a description of individual rights and a detailed list of consumer controls that a participating company has made available.

— Myrle Croasdale

Laws for the Lawless

Spector-Bagdady called the framework a definite step forward for consumer privacy.  

“First, I would say I was impressed,” she told Digital Privacy News. “These are high standards compared to what many entities are currently following.”

Greater privacy protection begins with putting the responsibility for protecting the data on the companies collecting and using it, she argued.

The proposed framework states that it “goes beyond outdated models that revolve primarily around notice and consent.

“While such laws or frameworks may have made sense in decades past, people can no longer make informed and timely decisions about all the different websites, apps and devices they use every day.

“By putting clear restrictions on the collection, use and sharing of data, the draft shifts the burden of privacy risk off of users,” the framework states.

Once consumers give their consent, the data-collecting company “must seek additional consent for any new collection, disclosure or use of consumer-health information outside the scope of any previous consumer consent,” according to the document.

Consumers also can revoke this consent at any time.

De-Identification Dilemma

The proposed structure also allows for limited sharing of data for specific research purposes, but only after it has been de-identified.

This raises a host of issues, Spector-Bagdady said.

“They can use it for research without consent if it is not identifiable,” she told Digital Privacy News. “Many privacy scholars now admit data can never be completely de-identified.

“It can be combined with other data and re-identified by bringing those data sets together.”

When de-identified health information is combined with genomic data, geolocation information or social media data, re-identification is possible, she said.

Another problem is that the standards allow third parties to access de-identified data, Spector-Bagdady said. 

“You don’t have control over how it is used in the future,” she said, adding: “That’s not the fault of the framework — but it’s a major limitation.”

Michelle De Mooy, a privacy and data ethics consultant in Washington, also expressed concerns that data transfers would put privacy at risk.

“Clearly, they are trying to find a way to protect people,” she said. But “companies can comply with data de-identification rules and still expose it to identification.

“It’s easy to re-identify people using sophisticated AI on data sets,” De Mooy said. “I don’t see that (issue) reflected in this framework.

“It would be meaningful to have that addressed.”  

Other Issues

Another shortcoming of the draft framework, De Mooy told Digital Privacy News, is that it does not address how consumer health data could be used to benefit public health.

For example, COVID-19 contact-tracing apps would need access to geolocation data and personal health information, she said.

Outlining what would be an acceptable use of private-health information during such an emergency would add value to the framework, she said.

“There could be an entire section on sharing data for public-health purposes,” De Mooy said. “It’s a missed opportunity to get into detail about that.”

Joseph Jerome, an independent data-privacy expert in the District, pointed to another gap in the framework: the absence of recommendations on data access by law enforcement.

Pacemaker data has been used in a criminal context, as has information from genetic-testing services.

Jerome said he’d like to see the framework address whether authorities should have access to data without consumer consent.

“It’s disappointing the lack of discussion of law-enforcement access to this information,” he said.

Keeping Consumer Trust

But the framework's enforcement, and how participating companies would be held accountable, will have the biggest impact, Jerome said.

To keep consumer trust, participating companies will need to be accountable.

The framework supports a self-regulatory model that could include third-party audits. The audits would allow “these agencies to focus their resources on bad actors who would not otherwise be compelled to act in pro-privacy ways,” he said.

To make self-regulation effective, Jerome said members would need to finance an “independent body that is radically transparent.”

So far, he hasn’t seen self-regulating industries accomplish this. Members who make mistakes would need more than a corrective-action plan with a bare-bones checklist audit, he said; otherwise, significant change will not occur and violations will be missed.

“It’s all pretty words on paper until you see how it will be followed up,” he told Digital Privacy News.

Given that compliance with the framework’s standards would be voluntary, a more aggressive enforcement approach likely would limit participation.

“I don’t think self-regulation is the right approach, but it’s a practical approach,” said De Mooy, the privacy and data-ethics consultant. “I appreciate that.”

How to Comment

Submit comments on the draft framework here by Sept. 25.

Read the draft framework here.

Watch the webinar recording here and the slides here.

For additional information visit here.

Myrle Croasdale is a Minnesota writer.