‘Mind reading,’ restoring vision to the blind and giving hearing to the deaf could be possible: Neurosurgeon
Providing vision to the blind and hearing to the deaf could become possible with a breakthrough AI-powered surgical procedure that could even make “mind reading” a reality, a California neurosurgeon told Fox News.
Ann Johnson, a Canadian teacher who lost her ability to talk after a stroke left her paralyzed in 2005, was able to speak through a cloned version of her voice after undergoing a surgery that connected her brain to artificial intelligence. The procedure involved fixing over 250 electrodes to Johnson’s brain and connecting them to an array of computers through a port on the back of her head. The computers, in turn, translated her brain activity into English, using an AI-generated avatar that spoke on her behalf.
“AI is going to be applied across a lot of different applications, not just restoring function after paralysis,” said Dr. Eddie Chang, the chairman of neurological surgery at the University of California, San Francisco and leader of the team behind Johnson’s procedure. “I think some of the same tools are going to be allowing us to essentially help and restore vision, perhaps certain kinds of deafness.”
“We’re far from mind reading itself, but the potential is there,” he continued.
Chang led a team of scientists from his university and the University of California, Berkeley, that published a study last week demonstrating how to translate electrical brain signals into spoken language and facial expressions.
“We could be at a point where similar and related technologies will allow us to, quote-unquote, read the mind,” Chang said. The study brings researchers “one step closer to understanding how these kinds of technologies are allowing us to read and interpret signals of the brain and mind.”
Scientists at the University of Texas at Austin published research in May that also supports the possibility of translating brain activity into language. The Texas team measured participants’ brain activity while they listened to podcasts in a functional MRI scanner.
That data was fed to a computer, effectively teaching it how to interpret brain activity as streams of words. Then the participants listened to new stories in the scanner.
Using that brain activity, the computer tried to recreate the new stories and was able to closely or exactly match the intended meaning of the original phrasing about half the time, according to the study.
“Even five to 10 years ago, we didn’t have the right AI tools in order to decipher the brain activity and translate it into words,” Chang told Fox News. The process of measuring “over tens of thousands of neurons” and translating that into English is “very complex.”
“That’s why AI has been so critical to our approach because it’s been very, very powerful to take those very subtle signals and translate them to things that are useful like words,” he continued.
Johnson’s ability to communicate improved drastically while using the AI avatar, according to the study. She went from communicating roughly 15 words a minute to nearly 80, Chang said. Normal conversation speed is about 160 words per minute.
The research is “going to have profound implications for medicine and thinking about how the human brain works,” Chang told Fox News. It will “also have a lot of important ethical implications around privacy and more.”
“So this is the right time to start engaging and thinking through what we really want from this technology,” he said.
Julia Musto contributed to this report.