
AI Brings New Potential for Prosthetics with 3D-Printed Hand


A new 3D-printed prosthetic hand paired with AI has been developed by the Biological Systems Engineering Lab at Hiroshima University in Japan. The technology could dramatically change the way prosthetics work, and it marks another step toward combining the physical human body with artificial intelligence.

The 3D-printed prosthetic hand has been paired with a computer interface to create the lightest, cheapest model yet, and the one most responsive to motion intent. Earlier prosthetic hands were typically made from metal, which made them both heavier and more expensive. The new system works through a neural network trained to recognize certain combined signals, which the engineers on the project have named "muscle synergies."
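
To make the idea of decoding "muscle synergies" concrete, here is a minimal sketch of how a classifier might map combined multi-channel muscle signals to intended grasps. The channel count, class count, synthetic data, and the simple softmax model are all assumptions for illustration; this is not the Hiroshima lab's actual network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 8 surface-EMG channels, 3 grasp classes.
# Each class is assumed to produce a characteristic "muscle synergy,"
# i.e. a fixed pattern of combined channel activations plus noise.
N_CHANNELS, N_CLASSES, N_SAMPLES = 8, 3, 300
synergies = rng.uniform(0.2, 1.0, size=(N_CLASSES, N_CHANNELS))

labels = rng.integers(0, N_CLASSES, size=N_SAMPLES)
X = synergies[labels] + 0.05 * rng.standard_normal((N_SAMPLES, N_CHANNELS))

# Minimal softmax classifier trained by gradient descent.
W = np.zeros((N_CHANNELS, N_CLASSES))
b = np.zeros(N_CLASSES)
Y = np.eye(N_CLASSES)[labels]          # one-hot targets

for _ in range(500):
    logits = X @ W + b
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    grad = (p - Y) / N_SAMPLES          # cross-entropy gradient
    W -= 1.0 * X.T @ grad
    b -= 1.0 * grad.sum(axis=0)

accuracy = (np.argmax(X @ W + b, axis=1) == labels).mean()
print(f"training accuracy: {accuracy:.2f}")
```

Because each class's synergy pattern is well separated from the others, even this tiny linear model separates the intents cleanly on the synthetic data.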

The prosthetic hand has five independent fingers capable of complex movements. Compared to previous models, the fingers have a greater range of motion and can move simultaneously. These developments make it possible to use the hand for tasks such as holding bottles and pens. Whenever the user wants to move the hand or fingers in a certain way, they only have to imagine it. Professor Toshio Tsuji of the Graduate School of Engineering at Hiroshima University explained how a user moves the 3D-printed hand.

“The patient just thinks about the motion of the hand and then the robot automatically moves. The robot is like a part of his body. You can control the robot as you want. We will combine the human body and machine like one living body.”

The 3D-printed hand works by using electrodes in the prosthetic to measure electrical signals that travel from nerves through the skin, comparable to the way an ECG measures heart rate. The measured signals are sent to a computer within five milliseconds, at which point the computer recognizes the desired movement and sends the corresponding signal back to the hand.
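
The measure-then-decode step described above can be sketched as follows: rectify the raw electrode signal, track a smoothed amplitude envelope, and trigger a motion when the envelope crosses a threshold. This rectify-and-smooth approach is a standard EMG-processing idea, not the lab's published pipeline; the numbers and the threshold decision are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
ALPHA = 0.1                      # smoothing factor for the EMG envelope

def envelope(signal, alpha=ALPHA):
    """Rectify raw EMG and track a smoothed amplitude envelope."""
    env, out = 0.0, []
    for s in signal:
        env = (1 - alpha) * env + alpha * abs(s)   # exponential smoothing
        out.append(env)
    return np.array(out)

# Synthetic recording: quiet muscle for 500 samples, then activation.
raw = rng.standard_normal(1000) * np.where(np.arange(1000) > 500, 1.0, 0.1)
env = envelope(raw)

# A simple intent decision: "close hand" once the envelope crosses a threshold.
THRESHOLD = 0.3
intent = env > THRESHOLD
print("hand closes at sample", int(np.argmax(intent)))
```

With a smoothing factor of 0.1 the envelope reacts within a handful of samples of the activation, which is the kind of responsiveness a millisecond-scale control loop depends on.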

A neural network, which the team has named the Cybernetic Interface, helps the computer learn the different complex movements. It can differentiate between the five fingers so that each can move individually. Professor Tsuji also spoke about this aspect of the new technology.
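
The idea of learning simple basic motions and combining them into complicated ones can be illustrated with a small sketch: per-finger flex/extend commands composed into named hand poses. The pose names and finger commands here are invented for illustration and are not taken from the Cybernetic Interface itself.

```python
# Hypothetical illustration of composing simple per-finger motions into
# more complicated hand poses. All names below are invented for the sketch.

FINGERS = ["thumb", "index", "middle", "ring", "little"]

def pose(flexed):
    """Map a set of flexed fingers to a per-finger command dictionary."""
    return {f: ("flex" if f in flexed else "extend") for f in FINGERS}

# Complex motions expressed as combinations of the basic flex/extend motions.
GRASPS = {
    "pinch":  pose({"thumb", "index"}),
    "tripod": pose({"thumb", "index", "middle"}),
    "power":  pose(set(FINGERS)),
    "open":   pose(set()),
}

print(GRASPS["pinch"])
```

Because each complex grasp is just a combination of the same five basic motions, adding a new grasp only requires naming a new combination rather than training the fingers from scratch.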

“This is one of the distinctive features of this project. The machine can learn simple basic motions and then combine and then produce complicated motions.”

The technology was tested with seven people, one of whom was an amputee who had been wearing a prosthesis for 17 years. The participants performed daily tasks, achieving a 95% accuracy rate for single simple motions and a 93% rate for complex movements. The prosthetics used in this test were trained on only five different movements with each finger; there could be many more complex movements in the future. With just these five trained movements, the amputee participant was able to pick up and put down items such as bottles and notebooks.

There are numerous possibilities for this technology. It could lower costs while providing highly functional prosthetic hands to amputee patients. Some challenges remain, such as muscle fatigue and the software's capacity to recognize many complex movements.

This work was completed by the Hiroshima University Biological Systems Engineering Lab along with patients from the Robot Rehabilitation Center at the Hyogo Institute of Assistive Technology in Kobe. The company Kinki Gishi created the socket used on the arm of the amputee patient.


Alex McFarland is a historian and journalist covering the newest developments in artificial intelligence.