
Brain-Machine Interface Could Assist Individuals With Paralysis

An international team of researchers has developed a wearable brain-machine interface (BMI) that could improve the quality of life for people with motor dysfunction or paralysis. It could even assist those with locked-in syndrome, a condition in which a person is conscious but unable to move or communicate.

The team was led by the lab of Woon-Hong Yeo at the Georgia Institute of Technology and included researchers from the University of Kent in the U.K. and Yonsei University in the Republic of Korea. The team combined wireless soft scalp electronics and virtual reality in a single BMI system. The system enables users to control a wheelchair or robotic arm just by imagining actions.

The new BMI was detailed in the journal Advanced Science last month.

A More Comfortable Device

Yeo is an associate professor in the George W. Woodruff School of Mechanical Engineering.

“The major advantage of this system to the user, compared to what currently exists, is that it is soft and comfortable to wear, and doesn't have any wires,” said Yeo.

BMI systems analyze brain signals and translate neural activity into commands, which is what allows users to control devices simply by imagining actions. Electroencephalography, or EEG, is the most common non-invasive method for acquiring these signals, but it typically requires a skull cap studded with many wires.

To use these devices, gels and pastes are required to maintain skin contact, and the setup is time-consuming and uncomfortable for the user. On top of that, the devices often suffer from poor signal acquisition due to material degradation and motion artifacts caused by things like grinding teeth. This noise shows up in the brain data, and researchers have to filter it out, as in the sketch below.
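For readers curious what that clean-up step can look like in practice, here is a minimal sketch in Python. It is not the team's pipeline; the sampling rate, band edges, and filter order are assumptions chosen purely for illustration.

# Illustrative only: a zero-phase band-pass filter of the kind commonly used
# to strip slow drift and high-frequency artifacts from EEG before analysis.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250.0              # assumed sampling rate (Hz)
LOW, HIGH = 8.0, 30.0   # mu/beta band, often of interest for motor imagery

def bandpass(eeg, fs=FS, low=LOW, high=HIGH):
    """Filter along the time axis; filtfilt avoids phase lag, so the
    timing of neural events is preserved."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=-1)

# Fake single-channel recording: a 12 Hz rhythm buried in broadband noise.
t = np.arange(0, 4, 1 / FS)
raw = np.sin(2 * np.pi * 12 * t) + 0.8 * np.random.randn(t.size)
clean = bandpass(raw)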

Machine Learning and Virtual Reality

The portable EEG system designed by the team improves signal acquisition by integrating imperceptible microneedle electrodes with soft wireless circuits. Beyond measuring brain signals, it is crucial for the system to determine which actions a user intends to perform. To achieve this, the team relied on a machine learning algorithm and a virtual reality component.
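The article does not spell out the team's model, but a common baseline for turning motor-imagery EEG into commands is band-power features fed to a linear classifier. The sketch below applies that baseline to synthetic data; every name, number, and label in it is a stand-in, not the published method.

# Hedged illustration: classify imagined left- vs right-hand actions from
# EEG epochs using log-variance (a standard band-power proxy) plus LDA.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

def band_power(epochs):
    """Log-variance per channel over the time axis."""
    return np.log(np.var(epochs, axis=-1))

# Synthetic stand-in data: 40 epochs x 4 channels x 500 samples,
# labeled 0 = "imagine left-hand grasp", 1 = "imagine right-hand grasp".
X = rng.standard_normal((40, 4, 500))
y = np.repeat([0, 1], 20)
X[y == 1, 0] *= 1.5  # give class 1 extra power on channel 0

clf = LinearDiscriminantAnalysis().fit(band_power(X[:30]), y[:30])
commands = {0: "move left", 1: "move right"}
for pred in clf.predict(band_power(X[30:])):
    print(commands[pred])  # each prediction maps to a device command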

Tests carried out by the team involved four human subjects; the next step is to test the system on individuals with motor impairments.

Yeo is also Director of Georgia Tech’s Center for Human-Centric Interfaces and Engineering under the Institute for Electronics and Nanotechnology, as well as a member of the Petit Institute for Bioengineering and Bioscience. 

“This is just a first demonstration, but we're thrilled with what we have seen,” said Yeo.

Back in 2019, the same team introduced a soft, wearable EEG brain-machine interface. Musa Mahmood was the lead author on both that study and the new one.

“This new brain-machine interface uses an entirely different paradigm, involving imagined motor actions, such as grasping with either hand, which frees the subject from having to look at too many stimuli,” said Mahmood.

In the new study, users demonstrated accurate control of virtual reality exercises using their thoughts, or motor imagery.

“The virtual prompts have proven to be very helpful,” Yeo said. “They speed up and improve user engagement and accuracy. And we were able to record continuous, high-quality motor imagery activity.”

Mahmood says the team will now focus on optimizing electrode placement and more advanced integration of stimulus-based EEG.

Alex McFarland is an AI journalist and writer exploring the latest developments in artificial intelligence. He has collaborated with numerous AI startups and publications worldwide.