
Researchers Bring Sense of Touch to Robotic Finger

Researchers at Columbia Engineering have given a newly developed robotic finger a sense of touch that can localize contact with extremely high precision over large, multicurved surfaces. The development brings robots one step closer to human-like touch.

Matei Ciocarlie, an associate professor in the departments of mechanical engineering and computer science, led the research in collaboration with Electrical Engineering Professor Ioannis (John) Kymissis.

“There has long been a gap between stand-alone tactile sensors and fully integrated tactile fingers — tactile sensing is still far from ubiquitous in robotic manipulation,” says Ciocarlie. “In this paper, we have demonstrated a multicurved robotic finger with accurate touch localization and normal force detection over complex 3D surfaces.”

Current methods for integrating touch sensors into robot fingers face several challenges: multicurved surfaces are difficult to cover, wire counts are high, and the sensors are hard to fit into small fingertips, all of which prevents their use in dexterous hands. The Columbia Engineering team got around these challenges with a new approach: overlapping signals from light emitters and receivers embedded in a transparent waveguide layer that covers the functional areas of the finger.

By measuring light transport between every emitter and receiver, the team obtained a signal data set that changes in response to deformation of the finger under touch. Useful information, such as contact location and applied normal force, was then extracted from the data using data-driven deep learning methods, with no need for analytical models.

Through this method, the research team developed a fully integrated, sensorized robot finger with a low wire count. It was built using accessible manufacturing methods and can be easily integrated into dexterous hands.

The study was published online in IEEE/ASME Transactions on Mechatronics.

The first part of the project was using light to sense touch. Underneath the “skin” of the finger lies a layer of transparent silicone, into which the team shined light from more than 30 LEDs. The finger also has over 30 photodiodes that measure how the light bounces around inside it. As soon as the finger comes into contact with something, its skin deforms and light shifts around in the transparent layer underneath. By measuring how much light travels from every LED to every diode, the researchers obtain about 1,000 signals, each containing information about the contact that was made.
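To make the sensing scheme concrete, here is a minimal sketch of how such a raw signal vector might be assembled: light one emitter at a time and sample every receiver. The exact counts and the hardware functions (set_led, read_photodiode) are illustrative assumptions, not the authors' actual interface.

# A minimal sketch, assuming 32 LEDs and 30 photodiodes (the article
# says only "more than 30" of each). set_led and read_photodiode are
# hypothetical stand-ins for the real hardware interface.
import numpy as np

NUM_LEDS = 32         # "more than 30 LEDs" -- exact count assumed
NUM_PHOTODIODES = 30  # "over 30 photodiodes" -- exact count assumed

def set_led(i, on):
    """Hypothetical driver for emitter i; a no-op in this sketch."""
    pass

def read_photodiode(j):
    """Hypothetical sampler for receiver j; returns a simulated
    intensity so the sketch runs without hardware."""
    return float(np.random.rand())

def capture_frame():
    """Measure light transport from every LED to every photodiode."""
    frame = np.zeros((NUM_LEDS, NUM_PHOTODIODES))
    for i in range(NUM_LEDS):
        set_led(i, True)
        for j in range(NUM_PHOTODIODES):
            frame[i, j] = read_photodiode(j)
        set_led(i, False)
    return frame.ravel()  # 32 x 30 = 960 signals, i.e. about 1,000

signals = capture_frame()
print(signals.shape)  # (960,)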

“The human finger provides incredibly rich contact information — more than 400 tiny touch sensors in every square centimeter of skin!” says Ciocarlie. “That was the model that pushed us to try and get as much data as possible from our finger. It was critical to be sure all contacts on all sides of the finger were covered — we essentially built a tactile robot finger with no blind spots.”

The second part of the project was designing the system so that its data could be processed by machine learning algorithms. The data is extremely complex and cannot be interpreted by humans directly, but current machine learning techniques can learn to extract specific information: where the finger is being touched, what is touching it, and how much force is being applied.
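As a rough illustration of what such a mapping could look like, the sketch below feeds the ~1,000-signal vector through a small fully connected network that outputs a contact location and a normal force. The layer sizes and output encoding are assumptions; this is not the authors' actual architecture.

# A minimal sketch, not the paper's architecture: a small fully
# connected network mapping the 960 light-transport signals to a
# 3D contact location plus a scalar normal force.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(960, 256),  # ~1,000 emitter-receiver signals in
    nn.ReLU(),
    nn.Linear(256, 128),
    nn.ReLU(),
    nn.Linear(128, 4),    # out: contact (x, y, z) + normal force
)

frame = torch.randn(1, 960)  # one simulated signal frame
prediction = model(frame)    # shape: (1, 4)
location, force = prediction[0, :3], prediction[0, 3]

In practice, a network like this would be trained on frames labeled with ground-truth contact positions and forces, for example collected with a calibrated probe.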

“Our results show that a deep neural network can extract this information with very high accuracy,” says Kymissis. “Our device is truly a tactile finger designed from the very beginning to be used in conjunction with AI algorithms.”

The team also designed the finger so that it can be used on robotic hands. Although it collects nearly 1,000 signals, the finger requires only a single 14-wire cable to connect it to the hand, and no complex off-board electronics are needed for it to function.
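A back-of-the-envelope comparison suggests why this matters: wiring each emitter and receiver out of the finger individually would take dozens of conductors, while scanning and serializing the measurements on the finger itself lets a thin cable suffice. In the numbers below, only the 14-wire figure comes from the article; the rest are illustrative assumptions.

# Rough wire-count comparison; component counts are assumed.
num_leds, num_photodiodes = 32, 30

signals_per_frame = num_leds * num_photodiodes  # 960, "nearly 1,000"
naive_wires = num_leds + num_photodiodes        # 62 dedicated lines
cable_wires = 14                                # the reported cable

print(signals_per_frame, naive_wires, cable_wires)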

The team is currently integrating the fingers into two dexterous hands, which they plan to use to demonstrate dexterous manipulation abilities.

“Dexterous robotic manipulation is needed now in fields such as manufacturing and logistics, and is one of the technologies that, in the longer term, are needed to enable personal robotic assistance in other areas, such as healthcare or service domains,” says Ciocarlie.


Alex McFarland is an AI journalist and writer exploring the latest developments in artificial intelligence. He has collaborated with numerous AI startups and publications worldwide.