A new law in California protects consumers’ brain data. Some think it doesn’t go far enough.

On September 28, California became the second US state to officially recognize the importance of mental privacy in state law.
But some proponents of mental privacy aren’t satisfied that the law does enough to protect neural data. “While it introduces important safeguards, significant ambiguities leave room for loopholes that could undermine privacy protections, especially regarding inferences from neural data,” Marcello Ienca, an ethicist at the Technical University of Munich, posted on X.

One such ambiguity concerns the meaning of “nonneural information,” according to Nita Farahany, a futurist and legal ethicist at Duke University in Durham, North Carolina. “The bill’s language suggests that raw data [collected from a person’s brain] may be protected, but inferences or conclusions—where privacy risks are most profound—might not be,” Farahany wrote in a post on LinkedIn.

Ienca and Farahany are coauthors of a recent paper on mental privacy. In it, they and Patrick Magee, also at Duke University, argue for broadening the definition of neural data to what they call “cognitive biometrics.” This category could include physiological and behavioral information along with brain data—in other words, pretty much anything that could be picked up by biosensors and used to infer a person’s mental state.

After all, it’s not just your brain activity that gives away how you’re feeling. An uptick in heart rate might indicate excitement or stress, for example. Eye-tracking devices can give away your intentions, such as a choice you’re likely to make or a product you might opt to buy. These kinds of data are already being used to reveal information that might otherwise be extremely private. Recent research has used EEG data to predict volunteers’ sexual orientation or whether they use recreational drugs. Other studies have used eye-tracking devices to infer personality traits.

Given all that, it’s vital we get it right when it comes to protecting mental privacy. As Farahany, Ienca, and Magee put it: “By choosing whether, when, and how to share their cognitive biometric data, individuals can contribute to advancements in technology and medicine while maintaining control over their personal information.”

Read more from MIT Technology Review’s archive

Nita Farahany detailed her thoughts on tech that aims to read our minds and probe our memories in a fascinating Q&A last year. Targeted dream incubation, anyone? 

There are lots of ways that your brain data could be used against you (or potentially exonerate you). Law enforcement officials have already started asking neurotech companies for data from people’s brain implants. In one case, a person had been accused of assaulting a police officer but, as brain data proved, was just having a seizure at the time.

EEG, the technology that allows us to measure brain waves, has been around for 100 years. Neuroscientists are wondering how it might be used to read thoughts, memories, and dreams within the next 100 years.