In a paper published on the 20th in the journal Science, a team of bioengineers at the University of Pittsburgh's Rehabilitation Neural Engineering Laboratory described a new application of brain-computer interface technology: evoking a sense of touch through brain stimulation, which makes it easier for an operator to manipulate a brain-controlled robotic arm.
By imitating the sense of touch rather than relying on vision alone, this brain-computer interface greatly improves the ability of patients with quadriplegia to manipulate objects with a brain-controlled robotic arm.
A prosthetic device controlled by a brain-computer interface measures movement-related brain activity through implanted electrodes and converts it into conscious control of a robotic arm, allowing some paralyzed users to regain functional movement.
However, such control systems have a limitation: they usually rely only on visual cues and lack the key sensory feedback needed to feel a grasped object.
To solve this problem, the researchers added an afferent channel to the brain-computer interface that simulates sensory input from the skin of the hand, forming a system that can both "read" and "write" information.
This two-way brain-computer interface reads neural activity from the brain's motor cortex to control the robotic arm. At the same time, sensors on the manipulator's "skin" record the mechanical forces it experiences and transmit them back to the somatosensory cortex through intracortical microstimulation, letting the user feel touch much as they would with natural perception.
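The "read"/"write" loop described above can be sketched in code. The following is a minimal illustration, not the Pittsburgh team's actual software: all class names, signal dimensions, and the linear decoder are hypothetical assumptions made purely to show the two directions of information flow.

```python
import numpy as np

class BidirectionalBCI:
    """Hypothetical sketch of a two-way BCI: decode motor intent ('read')
    and encode fingertip force as cortical stimulation ('write')."""

    def __init__(self, decoder_weights, stim_gain=1.0):
        self.W = decoder_weights    # maps neural firing rates -> arm velocity
        self.stim_gain = stim_gain  # scales sensed force into stimulation amplitude

    def decode_motion(self, firing_rates):
        # 'Read': convert motor-cortex activity into an arm velocity command.
        return self.W @ firing_rates

    def encode_touch(self, grip_force):
        # 'Write': convert force sensed on the hand's "skin" into a
        # stimulation amplitude delivered to somatosensory cortex.
        return self.stim_gain * float(np.clip(grip_force, 0.0, 1.0))

# One cycle of the control loop with made-up signal sizes.
rng = np.random.default_rng(0)
bci = BidirectionalBCI(decoder_weights=rng.standard_normal((3, 8)) * 0.1)

firing_rates = rng.random(8)                        # recorded motor-cortex activity
velocity_cmd = bci.decode_motion(firing_rates)      # drives the robotic arm (3-D velocity)
stim_amp = bci.encode_touch(grip_force=0.6)         # sensed force fed back as touch

print(velocity_cmd.shape)  # (3,)
print(stim_amp)            # 0.6
```

In a real system the decoder is calibrated to the user's recorded activity and the stimulation is mapped to specific electrode sites, but the essential structure is this closed loop: neural activity out, tactile feedback in.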
The study participant, Nathan Copeland, is a 28-year-old man who was paralyzed in a car accident ten years ago. He enrolled in a trial of a sensorimotor microelectrode brain-computer interface and had electrode arrays implanted. This time, with the new interface providing tactile feedback through electrical stimulation, his times on a series of upper-limb evaluations were greatly shortened.
These assessments involved moving objects of different shapes, such as pouring paper and plastic fragments from a cup on the right side of the table into an empty cup on the left. Across all the experimental tasks, compared with the same brain-computer interface without tactile feedback, the time he needed to grasp and transfer objects with the robotic arm was cut roughly in half, with the median falling from 20.9 seconds to 10.2 seconds.
Co-senior author Dr. Robert Gaunt, associate professor in the Department of Physical Medicine and Rehabilitation at the University of Pittsburgh, said: "Even when the restored sensation is limited and imperfect, people's performance improves greatly, so it is important to make the feeling more realistic. We still have a long way to go to bring this technology into people's homes, but the closer we come to reproducing the brain's normal inputs, the better this technology will be." (Science and Technology Daily)