Stanford scientists find a new breakthrough in decoding human speech
-
Last Update: 2020-01-02
The year 2020 used to appear in the science fiction of our childhoods, and now, unexpectedly, it has really arrived. Flying cars, housekeeping robots, clothes that adjust their own temperature, emigration to Mars, consciousness transmitted by brain waves: all of these were part of fiction's "2020." Science fiction has not entirely become reality, but we are undoubtedly witnessing the birth of many new technologies, some of which have progressed even beyond our imagination.

The brain-computer interface (BCI) is one such science-fiction technology. By interpreting the signals of neuroelectrical activity, it enables information exchange between the brain and external devices. Recently, neurosurgeons and engineers from Stanford University jointly made an unexpected discovery in the BCI field, one that overturns decades of understanding of the brain and offers researchers a new way to help people who have lost the ability to speak regain it in the future.

In the 1930s, the neurosurgeon Wilder Penfield and his colleagues proposed a model of how the brain controls movement. In a long strip of the brain running across the top of the head, the motor cortex, different areas control the movements of different body parts such as the hands, legs, and face. Later research gradually complicated this simplified model; for example, the brain regions responsible for subdivisions such as the fingers and the palm were found to overlap somewhat. But in general, the "motor homunculus" model familiar from neuroscience textbooks holds that the main parts of the human body are controlled by distinct areas of the cortex.

In a clinical trial launched ten years ago, scientists studying brain-computer interfaces began implanting special sensors in the cerebral cortex of volunteers, reading the neuronal signals of specific brain areas and using algorithms to convert those signals into actions, so that these paralyzed volunteers could control computers, prosthetic limbs, and other devices by thought alone. In several of the participants, sensors were placed in an area of the motor cortex called the "hand knob," which had long been thought to be dedicated to hand and arm movements.

In 2017, Professor Krishna Shenoy, an electrical engineer at Stanford University, and Professor Jaimie Henderson, a neurosurgeon, jointly published a significant milestone of the project: using this brain-computer interface, they decoded the "move the hand and arm" signals of neurons in this brain area, enabling several paralyzed people to type quickly and accurately by thought alone.

As the team led by Professors Shenoy and Henderson continued to decode the neural signals in these volunteers' brains, an unexpected new discovery emerged. Dr. Stavisky, the first author, noted that two patients in the study were quadriplegic due to spinal cord injury but could still speak. The researchers were therefore able to observe the neural activity in the hand-related areas of the motor cortex while the volunteers spoke aloud.

"It's a classic 'we don't know what will happen' study," Professor Shenoy said, "but we said: let's try it." Dr. Stavisky and his colleagues found that after the "start talking" cue was given, the volunteers' neuronal activity changed markedly. These neurons were supposed to be active when controlling hand and arm movements, yet unexpectedly they also became active when the volunteers spoke. Moreover, when the volunteers made different sounds, the activity patterns of these neurons also differed.

The researchers gave the volunteers a list of 10 words and recorded their neural signals as they spoke each word. By analyzing the patterns of neural activity, the researchers could identify which word a volunteer was saying, with accuracies of 85% and 55% in the two volunteers.
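The article does not describe the team's actual decoding algorithm, but the basic idea of telling words apart from their neural activity patterns can be sketched with a toy classifier. Everything below is an assumption for illustration: the word list, the channel count, and the simulated firing rates are invented, and a simple nearest-centroid rule stands in for whatever method the Stanford researchers actually used.

```python
# Toy sketch of word decoding from neural activity patterns.
# All data here is simulated; this is not the study's pipeline.
import numpy as np

rng = np.random.default_rng(0)

WORDS = ["beet", "bat", "boot", "but", "bought",
         "bird", "bait", "bet", "bite", "boat"]   # hypothetical 10-word list
N_CHANNELS = 96                                   # assumed electrode count
TRIALS_PER_WORD = 20

# Each word gets a distinct mean firing-rate pattern across channels.
word_means = rng.normal(10.0, 3.0, size=(len(WORDS), N_CHANNELS))

def simulate_trial(word_idx):
    """One spoken-word trial: the word's mean pattern plus trial noise."""
    return word_means[word_idx] + rng.normal(0.0, 1.0, N_CHANNELS)

# "Training": record several trials per word and average them into
# per-word centroids, i.e. the characteristic activity patterns.
train = {w: [simulate_trial(i) for _ in range(TRIALS_PER_WORD)]
         for i, w in enumerate(WORDS)}
centroids = {w: np.mean(trials, axis=0) for w, trials in train.items()}

def decode(features):
    """Classify a trial as the word whose centroid is nearest."""
    return min(centroids, key=lambda w: np.linalg.norm(features - centroids[w]))

# Evaluate on fresh simulated trials.
n_test = 10
correct = sum(decode(simulate_trial(i)) == w
              for i, w in enumerate(WORDS) for _ in range(n_test))
accuracy = correct / (len(WORDS) * n_test)
print(f"decoding accuracy on simulated data: {accuracy:.0%}")
```

On this clean simulated data the classifier scores near 100%; real neural recordings are far noisier, which is one reason the study's reported accuracies (85% and 55%) differ between volunteers.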
Dr. Stavisky said the team next wants to record neural activity in this brain area while volunteers speak longer sentences and paragraphs, and then decode those neural signals to reproduce what the volunteers said.

Professor Edward Chang of the University of California, San Francisco, another well-known scientist in the brain-computer interface field, commented: "This paper makes me very excited. It raises the question of how exclusive the allocation of functions to specific brain areas really is. I think that's something we didn't fully appreciate before." It also means that the hand-related area of the motor cortex may offer a previously unrecognized route to restoring speech to people who cannot speak.

Based on these findings, the researchers hope to eventually build a medical device, implanted in the brain, to help people who have lost the ability to speak regain it. Asked when such a device might become reality, Professor Shenoy said: "I think we can see something in the next 10 years." We look forward to the progress and breakthroughs of the coming decade, turning science fiction into reality.
This article is an English version of an article which is originally in the Chinese language on echemi.com and is provided for information purposes only.