We Need Ground Rules on How to Keep Our Brain Data Private
There’s still no technology in the world that lets you listen in on someone’s thoughts. But scientific advances are making it easier than ever to measure, interpret, and reconstruct brain activity. Add that to a growing market of wearables with mind-reading sensors, and there are more ways to map our brainwaves than ever before.
With more opportunities to track brain activity come more opportunities to mine that data. That’s not necessarily a bad thing in itself, but it raises privacy concerns: who owns brain data? fMRI is already starting to be used for lie detection, and it’s not unreasonable to expect police and other actors to use cognitive data in the future to gauge whether someone is innocent or guilty. It’s time to talk about how much control we should have over what’s in our own brains.
At the World Science Festival, neuroethicist Paul Root Wolpe stressed how important it is right now to set up ground rules to protect cognitive privacy. Wolpe believes that people should have absolute control over the information in their skulls, even when authorities have warrants for the contents of their brainwaves.
“I’m for an absolute right to cognitive privacy,” he says. “What does the right to privacy mean if it doesn’t mean the absolute right to the content of my own thoughts?”
It’s the early days, but technology that uses brainwaves is well on its way to becoming mainstream. Samsung has been prepping a mind-controlled tablet interface that uses an EEG hood since 2013, and there’s already a slew of devices that use neural-monitoring technology, from Emotiv’s high-tech research EEG headsets to Necomimi Brainwave Cat Ears.
When people use these technologies, the data footprints they leave behind will contain deeply personal information. EEG readings can serve as a unique personal identifier, for instance. With neuro-gaming and mind-controlled devices taking off, it’s time to have a discussion about the ways companies and governments can use the brain-based data these technologies generate. Sure, cat ears you move with your mind are whimsical, but if Necomimi sells your brainwave data, that’s not so cute.
Aside from commercial uses, Wolpe and others are concerned that government agencies and law enforcement will attempt to make cases against people based on what their brainwaves reveal. Right now, fMRI lie detection is still in murky legal territory, but if the technology advances, there could be scenarios like the one described in the video, where a potential terrorist is “interrogated” by reading and analyzing their brainwaves.
Brainwave-reading technology is enormously valuable: it can help us understand how our brains work and how neurological diseases develop. It’s inevitable that we will continue building these technologies. But as we develop tools that could let people data-mine minds, it’s important not to forget that fighting for a reasonable expectation of privacy is necessary.