This New AI Brain Decoder Could Be A Privacy Nightmare, Experts Say
Artificial intelligence (AI) lets researchers interpret human thoughts, and the technology is sparking privacy concerns. The new system can translate the brain activity of a person listening to a story into a continuous stream of text. It's meant to help people who can't speak, such as those debilitated by strokes, communicate. But there are concerns that the same techniques could one day be used to invade thoughts.
“The time to think about how to appropriately protect mental privacy is now,” Jennifer Miller, a professor who studies ethics and medicine at the Yale School of Medicine, told Lifewire in an email interview. “It is best to engineer privacy protections into a technology at the outset rather than as an afterthought.”
An AI View of Brain Activity
A recent study published in the journal Nature Neuroscience described a new AI system called a semantic decoder. The work gets help from a transformer model similar to the ones that power OpenAI's ChatGPT and Google's Bard.
Unlike similar systems, the new technique does not require subjects to have surgical implants. Instead, brain activity is measured with an fMRI scanner. The decoder is first trained extensively, with the participant listening to hours of podcasts inside the scanner. Afterward, when the participant listens to a new story, or simply imagines telling one, the machine can generate corresponding text from brain activity alone.
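Based on the paper's description, the decoder works roughly like a guided guessing game: a language model proposes candidate word sequences, and an "encoding" model, trained on the participant's hours of in-scanner listening, predicts the brain activity each candidate should evoke. Candidates whose predictions best match the recorded fMRI signal are kept. The sketch below illustrates that loop in simplified Python; the language_model and encoding_model objects, their methods, and the scoring details are hypothetical stand-ins, not the researchers' actual code.

```python
# Illustrative sketch only: a beam search that grows text hypotheses and keeps
# the ones whose PREDICTED brain response best matches the RECORDED response.
# All model objects and method names below are hypothetical.

import numpy as np

BEAM_WIDTH = 10

def decode_story(fmri_frames, language_model, encoding_model):
    """Decode a transcript from a sequence of fMRI volumes."""
    beams = [("", 0.0)]  # (text so far, cumulative match score)
    for frame in fmri_frames:  # one fMRI volume every couple of seconds
        candidates = []
        for text, score in beams:
            # The language model proposes plausible next words.
            for word in language_model.propose_next_words(text):
                extended = f"{text} {word}".strip()
                # The encoding model predicts the fMRI response this
                # word sequence should evoke in the listener's brain.
                predicted = encoding_model.predict_response(extended)
                # Score by cosine similarity to the recorded volume.
                match = float(
                    np.dot(predicted, frame)
                    / (np.linalg.norm(predicted) * np.linalg.norm(frame))
                )
                candidates.append((extended, score + match))
        # Keep only the best-matching hypotheses.
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:BEAM_WIDTH]
    return beams[0][0]  # the highest-scoring transcript
```

Because each fMRI volume blurs together a couple of seconds of words, a decoder built this way can only recover the gist of a thought rather than an exact transcript, which is consistent with the accuracy caveats the researchers themselves note.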
This is a real leap forward compared to what’s been done before.
“For a noninvasive method, this is a real leap forward compared to what’s been done before, which is typically single words or short sentences,” Alex Huth, an assistant professor of neuroscience and computer science at UT Austin, said in a news release. “We’re getting the model to decode continuous language for extended periods of time with complicated ideas.”
Despite the promising research, no one will be reading your thoughts anytime soon, David Bader, the director of the Institute for Data Science at the New Jersey Institute of Technology, said in an email interview. He said that, in the best case, the semantic decoder correctly outputs the gist of what you're thinking about half the time. And you can easily defeat the semantic decoder by simply thinking about different things.
“This non-invasive procedure will require you to participate fully, and the results – your semantic decoder – will not work for anyone else, so there’s no risk that your thoughts will be read without your permission,” he added. “These fMRI machines cost upwards of $3 million and weigh 20 tons and can measure the microscopic blood flow in one’s brain, so we shouldn’t expect our thoughts to be read by a passer-by, our boss at the office, or a listening government ear.”
The Ethics of Brain Reading
Some observers say it’s not too soon to start thinking about how brain decoder technology might advance and ways to safeguard human thoughts.
“These techniques pose an existential threat to something we never had to worry about – our brain privacy,” Thomas P. Keenan, a professor at the University of Calgary who studies the implications of technology adoption, said via email. “Aside from cases of torture, the contents of our mind have always been ours and ours alone.”
Keenan pointed out that the new research shows that, with fMRI and artificial intelligence, much of our thinking can be decoded and verbalized.
In many ways, we can already tell or predict what someone is thinking based on existing data sources.
“It is hard to conceptualize ‘informed consent’ when the reach of these technologies is, as of now, unknown and speculative,” he added. “Certainly, any forced use would be a form of ‘unreasonable search and seizure’ and prohibited by law in most civilized countries.”
Perhaps a whole new profession of coaches will arise to teach people, such as criminal suspects, how to throw these technologies off the scent of their real thoughts, Keenan suggested.
“In many ways, we can already tell or predict what someone is thinking based on existing data sources,” Miller said. “Privacy, if not dead, is in the ICU. Thus, it becomes important not just to focus on privacy rights but also on specific protections from misuse of personal data and information. We want to make it hard for our thoughts to be used against us in unjust, predatory, or harmful ways.”