

Robots Use AI to ‘Feel’ Pain and Self-Repair



Image Credit: NTU Singapore

Robots are one step closer to resembling living beings. Scientists from Nanyang Technological University, Singapore (NTU Singapore) have created an AI system that allows robots to recognize pain and self-repair.

The newly developed system relies on AI-enabled sensor nodes, which detect ‘pain’ and then respond to it. Pain is registered when pressure is applied by an outside physical force. The other major part of the system is self-repair: when the damage is a minor ‘injury,’ the robot can repair it without any human intervention.

The research was published in August in the journal Nature Communications.

Most of the world’s current robots perceive their immediate surroundings through a network of sensors. However, these sensors do not process information themselves; instead, they send it to a central processing unit, where all the learning takes place. This architecture requires extensive wiring and results in longer response times.

Besides longer response times, these robots are often easily damaged and require a lot of maintenance and repair. 

The New System

In the new system developed by the scientists, the AI is embedded into the network of sensor nodes, which are connected to multiple small, less powerful processing units. This setup allows learning to take place locally, reducing both the wiring required and the response time, which is five to ten times shorter than in conventional robots.
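To make the idea of local processing concrete, here is a minimal illustrative sketch (not the actual NTU system; the threshold, node IDs, and event format are hypothetical): each sensor node filters its own pressure readings and forwards only meaningful ‘pain’ events, rather than streaming every raw reading to a central processor.

```python
# Illustrative sketch of distributed sensing: each node does its own
# processing and reports only events, cutting traffic to any central hub.
# The threshold and event format are assumptions for illustration.

PAIN_THRESHOLD = 0.7  # assumed normalized pressure level


class SensorNode:
    """A sensor node with its own small local processing unit."""

    def __init__(self, node_id: int):
        self.node_id = node_id

    def process(self, pressure: float):
        """Return a pain event only when pressure exceeds the threshold."""
        if pressure > PAIN_THRESHOLD:
            return {"node": self.node_id, "pain": pressure}
        return None  # nothing to report, so no traffic leaves the node


nodes = [SensorNode(i) for i in range(4)]
readings = [0.1, 0.9, 0.3, 0.8]
events = [e for n, r in zip(nodes, readings) if (e := n.process(r))]
print(events)  # only two events instead of four raw readings
```

Because filtering happens at the node, the communication load scales with the number of notable events rather than the number of sensors, which is the intuition behind the reduced wiring and response time described above.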

The self-repair capability comes from the introduction of a self-healing ion gel material into the system. This allows the robots to recover their mechanical functions when damaged, without human help.

Associate Professor Arindam Basu, co-lead author of the study, is from the School of Electrical & Electronic Engineering.

“For robots to work together with humans one day, one concern is how to ensure they will interact safely with us. For that reason, scientists around the world have been finding ways to bring a sense of awareness to robots, such as being able to ‘feel’ pain, to react to it, and to withstand harsh operating conditions. However, the complexity of putting together the multitude of sensors required and the resultant fragility of such a system is a major barrier for widespread adoption.”

According to Basu, who is also a neuromorphic computing expert, “Our work has demonstrated the feasibility of a robotic system that is capable of processing information efficiently with minimal wiring and circuits. By reducing the number of electronic components required, our system should become affordable and scalable. This will help accelerate the adoption of a new generation of robots in the marketplace.” 

Teaching the Robot to Feel Pain

In order to teach the robot how to feel pain, the team relied on memtransistors, ‘brain-like’ electronic devices capable of both memory and information processing, which act as artificial pain receptors and synapses.
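The key property of such a device is that it processes a stimulus and remembers past stimuli in the same component. The toy model below is a hypothetical software analogy, not the memtransistor physics: a receptor whose painful events lower its own threshold, so its memory of past ‘injuries’ changes how it responds to future ones.

```python
# Toy 'pain receptor' with memory (hypothetical analogy, loosely inspired
# by the memtransistor idea: state storage and processing in one unit).

class PainReceptor:
    def __init__(self, threshold: float = 0.5):
        self.threshold = threshold  # stimulus level that registers as pain

    def stimulate(self, intensity: float) -> bool:
        """Return True if the stimulus registers as pain.

        Painful events sensitize the receptor: the threshold drops,
        so subsequent weaker stimuli also register (a memory effect).
        """
        painful = intensity >= self.threshold
        if painful:
            self.threshold = max(0.1, self.threshold * 0.8)
        return painful


receptor = PainReceptor()
print(receptor.stimulate(0.45))  # False: below the initial 0.5 threshold
print(receptor.stimulate(0.60))  # True: registers as pain, sensitizes
print(receptor.stimulate(0.45))  # True: threshold has dropped to 0.4
```

The third call illustrates the memory effect: the same 0.45 stimulus that was ignored at first now registers, because the receptor’s state was changed by the earlier painful event.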

The study demonstrated how the robot could keep responding to pressure even after it had been damaged. Following an ‘injury,’ such as a cut, the robot loses mechanical function. That is when the self-healing ion gel kicks in and causes the robot to heal the ‘wound,’ basically stitching it together. 

Rohit Abraham John is first author of the study and a Research Fellow at the School of Materials Science & Engineering at NTU.

“The self-healing properties of these novel devices help the robotic system to repeatedly stitch itself together when ‘injured’ with a cut or scratch, even at room temperature,” John says. “This mimics how our biological system works, much like the way human skin heals on its own after a cut.” 

“In our tests, our robot can ‘survive’ and respond to unintentional mechanical damage arising from minor injuries such as scratches and bumps, while continuing to work effectively. If such a system were used with robots in real world settings, it could contribute to savings in maintenance.”

According to Associate Professor Nripan Mathews, co-lead author from the School of Materials Science & Engineering at NTU, “Conventional robots carry out tasks in a structured programmable manner, but ours can perceive their environment, learning and adapting behaviour accordingly. Most researchers focus on making more and more sensitive sensors, but do not focus on the challenges of how they can make decisions effectively. Such research is necessary for the next generation of robots to interact effectively with humans.”

“In this work, our team has taken an approach that is off-the-beaten path, by applying new learning materials, devices and fabrication methods for robots to mimic the human neuro-biological functions. While still at a prototype stage, our findings have laid down important frameworks for the field, pointing the way forward for researchers to tackle these challenges.”

The research team will now work with industry partners and government research labs to further advance the system.


Alex McFarland is an AI journalist and writer exploring the latest developments in artificial intelligence. He has collaborated with numerous AI startups and publications worldwide.