
Scientists Bring Extreme Fingertip Sensitivity to Robots


A team of scientists at the Max Planck Institute for Intelligent Systems (MPI-IS) has introduced a robust soft haptic sensor that relies on computer vision and a deep neural network to estimate where objects come into contact with the sensor. It can also estimate how large the applied forces are.

The new research, which was published in Nature Machine Intelligence, will help robots sense their environment as accurately as humans and animals.

Thumb-Shaped Sensor With Skeleton

The sensor is shaped like a thumb and made of a soft shell built around a lightweight skeleton. The skeleton stabilizes the soft finger tissue in much the same way bones do, while the shell is made of an elastomer mixed with reflective aluminum flakes, giving it an opaque grayish color that prevents external light from entering. Inside the finger sits a 160-degree fish-eye camera that records the colorful images illuminated by LEDs.

The appearance of the color pattern inside the sensor changes depending on the object touching its shell. The camera continuously records these images and feeds the data to the deep neural network.

The algorithm detects even small changes of light in each pixel, and within a fraction of a second the machine-learning model maps where the finger is in contact with an object. It also determines the magnitude and direction of the applied forces.
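As a rough illustration of how such a mapping could work, here is a minimal sketch in PyTorch of a small convolutional network that regresses a contact position and a force vector from a single camera frame. This is only an assumption for illustration, not the authors' published architecture; the class name ContactForceNet, the layer sizes, and the image resolution are all hypothetical.

# Minimal sketch (assumption, not the authors' published architecture):
# a small convolutional network mapping one camera frame from inside the
# sensor to a contact position (x, y, z) and a 3-D force vector (Fx, Fy, Fz).
import torch
import torch.nn as nn

class ContactForceNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64, 128), nn.ReLU(),
            nn.Linear(128, 6),  # 3 values for contact position, 3 for force
        )

    def forward(self, image):
        # image: batch of camera frames, shape (N, 3, H, W)
        out = self.head(self.features(image))
        contact_position, force_vector = out[:, :3], out[:, 3:]
        return contact_position, force_vector

# Example: run one dummy frame through the network
net = ContactForceNet()
frame = torch.rand(1, 3, 224, 224)
position, force = net(frame)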

Georg Martius is a Max Planck Research Group Leader at MPI-IS and head of the Autonomous Learning Group.

“We achieved this excellent sensing performance through the innovative mechanical design of the shell, the tailored imaging system inside, automatic data collection, and cutting-edge deep learning,” Martius says.

Huanbo Sun is Martius’ Ph.D. student.

“Our unique hybrid structure of a soft shell enclosing a stiff skeleton ensures high sensitivity and robustness. Our camera can detect even the slightest deformations of the surface from one single image,” Sun says.

According to Katherine J. Kuchenbecker, Director of the Haptic Intelligence Department at MPI-IS, the new sensors will prove extremely useful.

“Previous soft haptic sensors had only small sensing areas, were delicate and difficult to make, and often could not feel forces parallel to the skin, which are essential for robotic manipulation like holding a glass of water or sliding a coin along a table,” says Kuchenbecker.

Video: Fingertip Sensitivity for Robots - a publication in Nature Machine Intelligence

Teaching the Sensor to Learn

To teach the sensor, Sun developed a testbed that generates the training data the machine-learning model needs to learn the correlation between changes in raw image pixels and the applied forces. Probing the sensor all around its surface produced around 200,000 measurements, and the model was trained in one day.
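A minimal training sketch along those lines, again an assumption rather than the authors' actual pipeline, would fit a simple regressor to (frame, position-and-force) pairs with a plain regression loss. The random tensors below merely stand in for the roughly 200,000 real testbed measurements, and the stand-in model could be swapped for the hypothetical ContactForceNet sketched earlier.

# Minimal training sketch (assumption: the real testbed data and model differ).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Stand-ins for the ~200,000 testbed measurements: random tensors play the
# role of recorded camera frames and the probed position/force labels.
frames = torch.rand(256, 3, 64, 64)   # recorded frames (illustrative size)
labels = torch.rand(256, 6)           # per frame: (x, y, z, Fx, Fy, Fz)
loader = DataLoader(TensorDataset(frames, labels), batch_size=32, shuffle=True)

# Simple stand-in regressor from raw pixels to contact position and force.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 6),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Fit the mapping from image pixels to the applied forces and contact point.
for epoch in range(3):
    for batch_frames, batch_labels in loader:
        prediction = model(batch_frames)
        loss = loss_fn(prediction, batch_labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()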

“The hardware and software design we present in our work can be transferred to a wide variety of robot parts with different shapes and precision requirements. The machine-learning architecture, training, and inference process are all general and can be applied to many other sensor designs,” Huanbo Sun says.
