AI-driven neurotechnology ‘on steroids’ needs regulation, says Unesco

Neurotechnology is advancing so fast that it threatens human rights and requires global regulation, according to the UN’s scientific and cultural organisation.

Unesco will begin to develop a “universal ethical framework” for neurotechnology, which connects computers with the brain and increasingly uses artificial intelligence to analyse neural activity, at a conference of scientific and political leaders in Paris on Thursday.

“When you add AI, you are putting neurotechnology on steroids,” said Mariagrazia Squicciarini, lead author of a Unesco report on the rapid pace of innovation in neurotechnology.

Neurotechnology, including implants to diagnose and treat brain-related disorders, is beginning to improve the lives of people living with disabilities, but the increased investment in AI-based programs that can read people’s minds and store neural data has raised concerns about its use.

Gabriela Ramos, Unesco assistant director-general for social and human sciences, said: “The promise . . . may come at a high cost in terms of human rights and fundamental freedoms, if abused. Neurotechnology can affect our identity, autonomy, privacy, sentiments, behaviours and overall wellbeing.

“Developments that many thought were science fiction only a few years ago are here with us already and are poised to change the very essence of what it means to be human.”

The Unesco researchers estimate that private investments in neurotech companies such as Onward Medical and Elon Musk’s Neuralink increased more than 20-fold in the decade from 2010, reaching $7.3bn in 2020. The market for neurotech devices is projected to exceed $24bn by 2027.

The report analysed scientific publications and patents to examine the field’s swift expansion. The number of neuroscience papers rose from 57,899 in 2011 to 94,456 in 2021, while patents worldwide related to neurotech rose from 418 to 1,531 between 2010 and 2020.

One speaker at the Unesco conference is Rafael Yuste, a leading neurobiologist, director of the Neurotechnology Centre at New York’s Columbia University and an advocate of international regulation “to protect mental privacy”.

He pointed out that in four studies published within the past year, not all peer reviewed, “researchers have decoded speech and images from the brains of human volunteers, using non-invasive devices that didn’t need neurosurgery to insert”.

“All four incorporated advanced AI models to decode the brain data,” Yuste said. “The new algorithms will enable you to decode information that is highly sensitive — which makes the protection of mental privacy all the more urgent.”

Regulation is needed because “almost without exception neurotech companies in the US and Canada take complete ownership of the client’s neural data in their consumer user agreements”, he added. “We need to protect mental property — otherwise companies will start to bank brain data. They may not decode it today, but AI will enable them to decode it tomorrow.”

Yuste’s own laboratory has decoded and manipulated neural processing in the visual cortex of mice so that researchers can “implant hallucinations — make them see things that they are not really seeing. Manipulation of human brain activity in the future is something we should start discussing now. It opens the possibility of a new type of human being where part of our mental processing happens outside the body”.

Unesco’s Squicciarini said: “We are not against neurotechnology or asking for a moratorium on research, because it has huge potential to reduce the deaths and disabilities caused by neurological disorders.

“But we need a globally co-ordinated approach to regulation for neurotechnology not only in medicine but in the consumer market.”
