
Scientists Develop Smart Artificial Hand Combining User Control and Automation


Scientists from Ecole Polytechnique Fédérale de Lausanne (EPFL) are working on new ways to improve the control of robotic hands, especially for amputees. They have developed a method that combines individual finger control with automation to improve grasping and manipulation. They tested this blend of neuroengineering and robotics on three amputees and seven non-amputee subjects. The results of the study were published in Nature Machine Intelligence.

This newly developed technology combines two separate fields of robotic hand control, something that had not been done before, and it advances the emerging field of shared control in neuroprosthetics.

One of the new concepts comes from neuroengineering. The intended finger movement is identified by reading the muscular activity on the amputee's stump, which is then used for individual finger control of the prosthetic hand. The other concept comes from robotics: the robotic hand can take hold of objects and maintain contact with them while grasping.

“When you hold an object in your hand, and it starts to slip, you only have a couple of milliseconds to react,” explains Aude Billard, who leads EPFL’s Learning Algorithms and Systems Laboratory. “The robotic hand has the ability to react within 400 milliseconds. Equipped with pressure sensors all along the fingers, it can react and stabilize the object before the brain can actually perceive that the object is slipping.”
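The reflex Billard describes can be illustrated with a minimal sketch: monitor fingertip pressure, flag a sudden drop as a slip, and tighten the grip before the user could consciously react. The function names, thresholds, and simulated readings below are illustrative assumptions, not EPFL's actual controller.

```python
# Hypothetical slip-stabilization reflex; thresholds and readings are
# illustrative, not the study's actual controller parameters.
def detect_slip(pressure_history, drop_threshold=0.15):
    """Flag a slip when fingertip pressure falls sharply between samples."""
    if len(pressure_history) < 2:
        return False
    prev, curr = pressure_history[-2], pressure_history[-1]
    return prev > 0 and (prev - curr) / prev > drop_threshold

def stabilize(grip_force, step=0.2, max_force=1.0):
    """Tighten the grip incrementally, capped at the hand's force limit."""
    return min(grip_force + step, max_force)

# Simulated fingertip readings: steady contact, then a sudden pressure drop.
readings = [0.80, 0.79, 0.78, 0.55]
grip = 0.5
for i in range(1, len(readings) + 1):
    if detect_slip(readings[:i]):
        grip = stabilize(grip)
print(round(grip, 2))  # → 0.7 (grip tightened after the drop)
```

In a real prosthesis this loop would run on the embedded controller at a fixed rate, which is what makes a sub-400-millisecond reaction possible.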

The process starts with the algorithm learning to decipher the user's intention, which it then translates into finger movements of the prosthetic hand. For this to happen, the amputee first trains the machine learning algorithm by performing a series of hand movements. Sensors placed on the amputee's stump detect the associated muscular activity, and the algorithm learns to connect each hand movement with its corresponding muscular pattern. Eventually, the algorithm can recognize the user's intended finger movements and control the individual fingers of the prosthetic hand.

Katie Zhuang, first author of the publication, spoke about the machine learning algorithm.

“Because muscle signals can be noisy, we need a machine learning algorithm that extracts meaningful activity from those muscles and interprets them into movements,” she said.
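The training idea described above can be sketched as a simple supervised decoder: learn one average muscle-activity pattern per practiced movement, then decode new signals by finding the closest learned pattern. The nearest-centroid classifier, the two-channel features, and the movement labels here are stand-ins, not the model used in the study.

```python
# Illustrative EMG-decoding sketch: map windows of muscle activity to
# intended finger movements. Classifier and data are hypothetical.
import numpy as np

def train(features, labels):
    """Learn one mean EMG pattern (centroid) per finger-movement class."""
    classes = sorted(set(labels))
    return {c: np.mean([f for f, l in zip(features, labels) if l == c], axis=0)
            for c in classes}

def predict(centroids, feature):
    """Decode the intended movement as the closest learned pattern."""
    return min(centroids, key=lambda c: np.linalg.norm(centroids[c] - feature))

# Toy training data: 2-channel EMG features for two practiced movements.
X = [np.array([0.9, 0.1]), np.array([0.8, 0.2]),   # "index_flex" examples
     np.array([0.1, 0.9]), np.array([0.2, 0.8])]   # "thumb_flex" examples
y = ["index_flex", "index_flex", "thumb_flex", "thumb_flex"]

model = train(X, y)
print(predict(model, np.array([0.85, 0.15])))  # → index_flex
```

Real EMG is far noisier than this toy data, which is why, as Zhuang notes, the actual system needs a machine learning algorithm that first extracts meaningful activity from the raw signals.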

The scientists then engineered the algorithm so that robotic automation kicks in when a user tries to grasp an object. When an object comes into contact with sensors on the surface of the prosthetic hand, the algorithm tells the hand to close its fingers and grasp. This new system was adapted from a previous study in which robotic arms were designed to identify the shape of objects and grasp them based solely on tactile information, without relying on visual signals.
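The contact-triggered behavior amounts to a simple rule: once any surface sensor reports contact, command every finger to close. The sensor names, finger list, and command strings below are hypothetical placeholders for whatever the real hand's control interface uses.

```python
# Hedged sketch of contact-triggered grasp automation; sensor layout and
# commands are hypothetical, not the prosthesis's actual interface.
def grasp_command(sensor_contacts,
                  fingers=("thumb", "index", "middle", "ring", "pinky")):
    """Return a close command for every finger once any sensor detects contact."""
    if any(sensor_contacts.values()):
        return {f: "close" for f in fingers}
    return {f: "hold" for f in fingers}

# An object touches the palm sensor: the whole hand closes to grasp it.
commands = grasp_command({"palm": True, "fingertip_index": False})
print(commands["index"])  # → close
```

In the shared-control scheme, this automatic closing runs alongside the EMG decoder, so the user supplies the intent while the hand handles the fine timing of the grasp.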

There are still challenges ahead before this technology can be used effectively and become a commercially viable option for amputees seeking prosthetic hands. However, it is a significant step forward for the field and will continue to push the merging of humans and robotics. For now, the algorithm is still being tested on a robot.

“Our shared approach to control robotic hands could be used in several neuroprosthetic applications such as bionic hand prostheses and brain-to-machine interfaces, increasing the clinical impact and usability of these devices,” says Silvestro Micera, EPFL’s Bertarelli Foundation Chair in Translational Neuroengineering, and Professor of Bioelectronics at Scuola Superiore Sant’Anna.


Alex McFarland is a Brazil-based writer who covers the latest developments in artificial intelligence & blockchain. He has worked with top AI companies and publications across the globe.