By Rifki Aria Nugraha
Brain-Computer Interface (BCI) technology could bring challenges to individual privacy, cybersecurity expert Pablo Ballarin Usieto told Digital Privacy News.
“If this data is not properly processed, the malicious can retrieve very valuable information about the person,” Usieto, co-founder of the Balusian cybersecurity firm in Spain, said of the technology.
It relies on devices that read a user’s brain activity and retrieve information from it.
The technology is still immature, Usieto explained, and any mishandling could lead to abuse of confidential data about an individual’s health, personal preferences and emotions.
“We have to be very careful,” he said, “because we are generating very valuable information for free — and the use they do might not be very clear at all.”
Other privacy concerns include where the headphone-like devices store the collected information, how it is secured and what companies that collect the data do with it.
Reading Brain Activity
BCI helps people with neurological dysfunction communicate with their surroundings. A BCI device translates human thoughts by reading the activities of the user’s brain.

“If this data is not properly processed, the malicious can retrieve very valuable information.”
Pablo Ballarin Usieto, Balusian cybersecurity firm, Spain.
According to Usieto, BCI technology for the mass market collects information, processing it through an app before sending results back to whoever is collecting the data for a specific purpose, whether a physician, therapist or company.
The technology is being developed for medical and nonmedical use.
Growing Privacy Concerns
Experts say BCI can optimize the functionality of the human brain.
Researchers at Facebook, for instance, are developing a tool to help users convert their brain activity into text.
Meanwhile, Neuralink Corp., a San Francisco neurotechnology company founded by Elon Musk, is experimenting with linking people and computers to help those with brain injuries.
But Tamara Bonaci, a Northeastern University professor in Boston who specializes in the security and privacy of emerging biomedical technologies, is wary of how such data can be secured effectively.
She fears BCI devices may give access not only to a user’s psychological information, but to private information like PINs and passwords, as well.
“Potential security threats that can be mounted against implanted neural devices were identified,” Bonaci co-wrote in a 2014 research paper, “thus, there is a growing need to address the potential privacy and security risks arising from the use of BCIs, in both medical and nonmedical applications.”
Bonaci did not return requests seeking comment from Digital Privacy News.
In addition, Marcello Ienca, a neuroethicist at the Swiss Federal Institute of Technology in Zürich, told Nature magazine last July that neuroscientists are carefully monitoring BCI developments to ensure privacy protection.
However, “We don’t want to be the watchdog of neuroscience or to police how neurotechnology should be developed,” Ienca stressed.
Following EU Law
Still, the privacy risks of implementing BCI can be overcome, Javier Minguez, Usieto’s research collaborator, told Digital Privacy News.
He also is co-founder and chief technology officer at Bitbrain, a neurotechnology company also based in Spain.
First, the importance of anonymity during the data-collection process cannot be overstated.
“The most-important aspect to address when acquiring the data is to comply with European GDPR, which is very strict,” Minguez said.

“The personal data always needs to be dissociated from the biomedical data.”
Javier Minguez, Bitbrain neurotechnology company, Spain.
Taking effect in May 2018, the European Union’s General Data Protection Regulation limits what organizations can do with any personal data they collect across Europe.
“There are many points that we need to address, but one of the most relevant for privacy is that the personal data always needs to be dissociated from the biomedical data,” Minguez said.
He also recommended protecting BCI data with two layers, using a single encryption scheme across both.
The first layer would be the main server, where all data is stored. To ensure anonymity, an individual’s biometric and personal data should be encrypted and housed in different folders during this stage, Minguez explained.
The second layer would be a corporate cloud server, where only the biometric data — having been encrypted in the main computer — is stored.
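For illustration only, the sketch below shows one way the dissociation Minguez describes might look in code: a biomedical reading and the accompanying personal details are split, linked only by a random pseudonym, encrypted, and written to separate folders on the main server. The record layout, folder names, the store_reading function and the use of the Python cryptography library are assumptions for this sketch, not Bitbrain’s actual implementation.

```python
# Illustrative sketch only (not Bitbrain's implementation): splitting personal
# data from biomedical data before storage, as Minguez describes.
# Record structure, folder names and key handling are assumptions.
import json
import secrets
from pathlib import Path

from cryptography.fernet import Fernet  # pip install cryptography

KEY = Fernet.generate_key()  # in practice, a key managed by a key service
cipher = Fernet(KEY)


def store_reading(personal: dict, biomedical: dict, root: Path) -> str:
    """Split one BCI reading into two encrypted records linked only by a
    random pseudonym, and write them to separate folders (the first layer)."""
    pseudonym = secrets.token_hex(16)  # contains no personal identifiers

    personal_blob = cipher.encrypt(json.dumps(personal).encode())
    biomedical_blob = cipher.encrypt(json.dumps(biomedical).encode())

    (root / "personal").mkdir(parents=True, exist_ok=True)
    (root / "biomedical").mkdir(parents=True, exist_ok=True)
    (root / "personal" / f"{pseudonym}.bin").write_bytes(personal_blob)
    (root / "biomedical" / f"{pseudonym}.bin").write_bytes(biomedical_blob)

    # Only the file in the "biomedical" folder would move to the cloud layer.
    return pseudonym
```

Under this sketch, only the files in the biomedical folder, already encrypted on the main server, would be copied to the corporate cloud, so the second layer never holds personal identifiers.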
Ienca acknowledged that BCI was not yet mature and that its uses could be unpredictable. But he remained hopeful about the technology’s potential.
“It’s very hard to predict the outcomes of that technology,” he told Digital Privacy News. “But when the tech is mature, in terms of market size or deregulation, it can be too societally entrenched to improve it.”
Rifki Aria Nugraha is a writer based in Jakarta, Indonesia.
Sources (external links):
- Institute of Electrical and Electronics Engineers (IEEE): App Stores for the Brain: Privacy and Security in Brain-Computer Interfaces