Technology is coming for your brain data

Privacy advocates like Girardi worry that as the information in our skulls becomes available to companies, that data will be vulnerable to being exploited for profit. They argue that we are hurtling toward a future without guardrails or oversight over how the technology is developed and deployed. That’s why Girardi had his brain scanned with the Emotiv device and then sued the company to have the recordings and data removed from its servers.

EEG: Now in a store near you

Girardi’s saga began in February 2022, when he purchased Emotiv’s Insight headset for nearly $500; the company describes the device as letting users “transform science fiction into reality.” (“EEG” stands for electroencephalogram, a test commonly performed by health care providers to scan the brain for disorders like epilepsy or stroke.) Online reviews rave about the Insight’s range of uses, some of which sound more dubious than others. YouTubers and tech reviewers tout its ability to optimize the mind for “flow state” and meditation, or to detect the effects of certain diets on brain activity. On Twitch, you can watch videos of gamers supposedly playing Super Mario Bros. or Halo using only an EEG headset, with tags like “mind control” in the descriptions.

These reviews paint an optimistic picture, but in reality EEG technology has been around for years and can capture only limited information from the brain. “Brain-computer interface” is a general term for technologies that connect human brain activity to computers: they collect electrical signals from the brain, learn to identify patterns in those signals, and translate the patterns into actions, such as moving a cursor up or down on a computer screen. This is what futurists mean when they say BCIs let us control computers with our minds. EEG headsets are technically BCIs, but they are far more limited than the Insight’s advocates would have us believe.
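
That pattern-to-action pipeline can be made concrete with a short sketch. The code below is a minimal illustration in Python, assuming NumPy and scikit-learn are available; it uses synthetic single-channel signals in place of a real headset, and the alpha-band feature and the two “intents” are invented for the example rather than taken from Emotiv or any particular product.

```python
# Minimal sketch of the pattern-matching idea behind a BCI, using synthetic
# signals in place of a real EEG headset. The feature choice (alpha-band
# power) and the two "intents" are illustrative assumptions, not any
# company's actual method.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
fs = 128                 # sampling rate in Hz (typical for consumer EEG)
t = np.arange(fs) / fs   # one second of samples

def synthetic_trial(intent):
    """Fake one second of single-channel EEG: 'up' trials carry more
    10 Hz (alpha-band) power than 'down' trials, plus noise."""
    alpha_amp = 2.0 if intent == "up" else 0.5
    return alpha_amp * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, fs)

def log_band_power(signal, low, high):
    """Log of the average spectral power of `signal` between low and high Hz."""
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return np.log(power[mask].mean())

# Build a small labeled dataset: one band-power feature per trial.
intents = ["up", "down"] * 50
X = np.array([[log_band_power(synthetic_trial(i), 8, 12)] for i in intents])
y = np.array([1 if i == "up" else 0 for i in intents])

# "Learn to identify patterns": fit a classifier on the features.
clf = LogisticRegression().fit(X, y)

# "Translate them into actions": map a new trial to a cursor command.
new_trial = synthetic_trial("up")
prediction = clf.predict([[log_band_power(new_trial, 8, 12)]])[0]
print("move cursor up" if prediction == 1 else "move cursor down")
```

Real systems follow the same outline (record, extract features, classify, act), but with many electrodes, richer features, and far noisier signals, which is part of why consumer EEG remains so limited.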

In fact, EEG is “in some ways the worst brain measurement (tool),” says UC Berkeley neurobiology professor Jack Gallant. Even the most advanced devices can’t detect what kinds of information the brain is processing or what conclusions a person might draw, Gallant says.

But there’s a reason investment in consumer EEG is growing: The devices are relatively cheap and readily available, and no medical license is required to use them.

In addition, AI is starting to change the rules of the game. Developers are using AI to find patterns in EEG brainwaves and draw increasingly sophisticated conclusions about individuals’ emotions and behaviors. Some researchers have been able to predict political ideology from brain data. In 2019, the Chinese government used a U.S.-made EEG headset from the company BrainCo to monitor the attention levels of 10,000 schoolchildren.

This worries experts such as Roberto Andorno, a lawyer and associate professor of bioethics and biomedical law at the University of Zurich in Switzerland.

Brain data, he says, could be used for a variety of nefarious purposes: blackmail, manipulation, suppression of political dissent. “I imagine a totalitarian government would be very interested in what people thought of it,” Andorno says.

Andorno is also concerned about threats to workers’ rights.

Companies could require their employees to wear headsets during the workday—an expansion of other methods already used to monitor employees. Andorno says there have been cases where companies have tracked their employees’ concentration levels, and even whether they might be depressed. “Why,” he says, “would a company know that?” In Australia and the United States, some transport companies have introduced an EEG device designed by SmartCap Technologies to monitor driver fatigue. Access to this information creates the risk that companies will exploit and abuse it—for example, by invading employee privacy if a health condition is revealed during brain activity monitoring.

Girardi v. Emotiv

At the time Girardi was using Emotiv’s Insight device, the company’s policy allowed users to request deletion of all of their personal data from the Emotiv platform, with one exception: EEG data.

Emotiv reserved the right to store and share EEG data with “third parties for scientific and historical research purposes.” Girardi was having none of it. In April 2022, he filed a constitutional protection claim with the Santiago Court of Appeal.

“We must never give up our rights, but we must enforce them. Human rights treaties are very clear on this,” says Natalia Monti, a member of his legal team. Brain data, she adds, “is sensitive data that deserves maximum protection.” Emotiv contested the claim, arguing that users voluntarily agree to the company’s privacy policy and that any brain data used for scientific research is anonymized.

On May 24, 2023, the Santiago Court of Appeal ordered Emotiv to immediately delete all of Girardi’s EEG data, but it also ruled that the company’s privacy policy was not unconstitutional in Chile. Girardi appealed to the Supreme Court of Chile, arguing that “there was a clear violation of the rights protecting psychological integrity,” Monti says. On August 9, 2023, the Supreme Court ruled in Girardi’s favor, holding not only that Emotiv must delete Girardi’s brain data, but also that it must “refrain from selling the Insight device in Chile until it changes its privacy policy regarding the protection of brain data.”

The Emotiv team declined requests for comment for this story.

The ruling was the first of its kind, making Chile the first country in the world to legally regulate the use of brain data. Other governments — in Argentina, Brazil, Mexico and some U.S. states — are now looking at Chile’s case as they begin to implement their own brain data laws.

According to many neuroethicists, brain data is different from other biomedical data. Once it is combined with other types of data (from smartphone apps, social media interactions, voice recordings, and biometrics), machine learning and big data techniques can make it fairly easy to identify a person, even if their brain data was previously anonymized. For example, a research team at the Mayo Clinic in Rochester, Minnesota, was able to reconstruct people’s faces from MRI scans and identify the participants using facial recognition software.
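
To see why anonymization offers only limited protection, here is a toy linkage sketch in Python (NumPy assumed). The names, the five-number “profiles,” and the nearest-neighbor matching are invented for illustration; real re-identification work combines many more data sources and far more sophisticated models.

```python
# Toy illustration of the re-identification risk: "anonymized" brain-derived
# feature vectors are matched against an identified dataset by similarity.
# All names and numbers are invented for this example.
import random
import numpy as np

rng = np.random.default_rng(1)

# An identified dataset, e.g. profiles inferred from public behavior
# (app usage, voice, biometrics): one 5-number vector per person.
people = ["alice", "bruno", "carla", "diego"]
identified = {name: rng.normal(0, 1, 5) for name in people}

# "Anonymized" brain-data-derived profiles: the same people, labels stripped,
# with a little measurement noise added, then shuffled.
anonymized = [(name, vec + rng.normal(0, 0.1, 5)) for name, vec in identified.items()]
random.shuffle(anonymized)

# Linkage attack: assign each anonymous record to its nearest identified profile.
for true_name, anon_vec in anonymized:
    distances = {name: np.linalg.norm(anon_vec - vec) for name, vec in identified.items()}
    guess = min(distances, key=distances.get)
    print(f"anonymous record re-identified as {guess} (truth: {true_name})")
```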

That said, many users believe the technology’s potential benefits outweigh such privacy risks, or they’re unaware of how companies might use their data. (Think of smartphones and social media — we all signed up before the harm became obvious.) That’s why, Andorno says, regulators should act before brain-computer interfaces like the Insight headset become a regular part of life: “The law has to prevent harm to people.”

And the clock is ticking.

It’s time to regulate the “Wild West” of brain data

According to the United Nations, private investment in BCIs increased more than 20-fold between 2010 and 2020, reaching $33.2 billion during that time. Governments are also pouring resources into developing the technology. In 2013, the Obama administration launched the BRAIN Initiative at the National Institutes of Health. Rafael Yuste, a neurobiologist at Columbia University who led the team that proposed the initiative, likened it to the Human Genome Project; its primary goal is to understand the brain in order to develop health innovations that target neurological diseases. The initiative now has 550 labs across the country and an annual budget of about $900 million to support the development of neurotechnologies of all kinds, including ultrasound stimulation and optogenetics. Similar initiatives have emerged in countries such as China, Japan, South Korea, Canada and Israel.

Yuste also founded the Neurorights Foundation, an independent nonprofit that has launched a larger movement to raise ethical issues surrounding the use of neurotechnology and advocate for protections, especially against the sale and commercial use of brain data. Neurorights advocates like Yuste aren’t particularly concerned about medical neurotechnology, since it’s already regulated by patient privacy laws in most countries, such as the Health Insurance Portability and Accountability Act (HIPAA) in the United States. But for nonmedical purposes, the world of brain data is a legal Wild West.

Several international governing bodies, including the Organization of American States and the United Nations, have issued human rights principles related to neurotechnology. In 2022, the UN Human Rights Council commissioned a report examining the human rights implications of neurotechnology. The report is due in September or October.

Advocates say regulation can’t come soon enough. The neurotechnology market is expected to be valued at more than $55 billion by 2033. Regulation will set limits on what private companies and governments can do with individuals’ brain data. Fast action, Yuste says, is crucial.

But efforts to regulate have met resistance from the private sector. At the time Girardi bought the Insight device, Emotiv’s privacy policy treated brain data as personal information. The company’s updated policy, published in June 2023, states that “EEG data itself is not personal information.”

But Yuste says it couldn’t get more personal. “The brain,” he says, “is not just another organ. It encompasses all of our cognitive abilities, every thought, every decision. . . . It is the essence of our humanity.”

Jose M. Muñoz served as an expert witness for the Supreme Court of Chile in the landmark Girardi v. Emotiv case. At the time of this story, he was a postdoctoral fellow at the Kavli Center for Ethics, Science, and the Public at the University of California, Berkeley. Follow him on X @jmmunoz_.

Laura Isaza writes about science, the environment, and sports. She has written for NPR’s All Things Considered, KALW, Atmos, The Dirtbag Diaries, and Planet Forward.

Tarini Mehta is a California-based reporter who focuses on health, housing and governance. She has previously written for The Mercury News, India Today, The Diplomat, The Hindu and The Print.