
Researchers Create Legged Robot That Can Hike Difficult Terrain


Image: Takahiro Miki

A team of researchers at ETH Zurich has developed a new approach that enables a legged robot to move quickly over complex terrain. The robot, named ANYmal, relies on machine learning to combine visual perception of the environment with a sense of touch.

The quadrupedal robot hiked 120 vertical meters in 31 minutes, four minutes faster than the estimated duration for human hikers, and without a single misstep.

Brand New Technology 

The technology that enables ANYmal to combine visual perception with its sense of touch is brand new.

The team was led by Marco Hutter, and the research was published in the journal Science Robotics.

“The robot has learned to combine visual perception of its environment with proprioception — its sense of touch — based on direct leg contact. This allows it to tackle rough terrain faster, more efficiently and, above all, more robustly,” Hutter says.

The team says that the robot will eventually be deployable anywhere that is too dangerous for humans or impassable for other types of robots.

Humans and animals also combine visual perception of their environment with the sense of touch from their legs and hands, which enables them to handle difficult terrain. Previously developed legged robots have only been able to do this to a limited extent.

Takahiro Miki is a doctoral student and lead author of the study. 

“The reason is that the information about the immediate environment recorded by laser sensors and cameras is often incomplete and ambiguous,” said Miki. 

“That’s why robots like ANYmal have to be able to decide for themselves when to trust the visual perception of their environment and move forward briskly, and when it is better to proceed cautiously and with small steps,” Miki continued. “And that’s the big challenge.”


Training the Neural Network

The new technology includes a controller based on a neural network, which enables ANYmal to combine external and proprioceptive perception for the first time. The scientists first exposed the system to numerous obstacles and sources of error in a virtual training camp, which allowed the network to learn how to overcome obstacles in the best way. It also learned when to rely on environmental data and when to ignore it. 
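The idea of deliberately corrupting sensor data during simulated training can be illustrated with a small sketch. The corruption modes and parameters below are assumptions chosen for illustration; they are not taken from the paper's actual training setup.

```python
import numpy as np

rng = np.random.default_rng(42)

def corrupt_height_map(height_map, noise_std=0.05, dropout_prob=0.2):
    """Return a noisy copy of a simulated terrain height map.

    Gaussian noise mimics sensor error; dropout mimics occluded or
    missing readings (e.g. snow or tall grass) by zeroing samples.
    """
    noisy = height_map + rng.normal(0.0, noise_std, size=height_map.shape)
    mask = rng.random(height_map.shape) >= dropout_prob
    return np.where(mask, noisy, 0.0)

# A simulated ramp the policy must cross despite imperfect perception.
true_heights = np.linspace(0.0, 0.3, num=10)
observed = corrupt_height_map(true_heights)
```

Training against many such corrupted views is what lets the network learn which parts of the environmental data to trust.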

“With this training, the robot is able to master the most difficult natural terrain without having seen it before,” says Hutter.

The robot can carry out this process even when the sensor data on the immediate environment is ambiguous or vague, at which point ANYmal relies on its proprioception. This allows it to combine the speed and efficiency of external sensing with the safety of proprioceptive sensing. 
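The fallback behavior described above can be sketched as a simple gated fusion of the two sensing streams. All names, shapes, and the gating heuristic here are illustrative assumptions, not the ANYmal controller's actual architecture.

```python
import numpy as np

def fuse_observations(extero, proprio, gate):
    """Blend exteroceptive features into the observation, scaled by a gate in [0, 1]."""
    return np.concatenate([gate * extero, proprio])

def reliability_gate(noise_estimate):
    """Toy heuristic: trust vision less as estimated sensor noise grows."""
    return 1.0 / (1.0 + noise_estimate)

rng = np.random.default_rng(0)
extero = rng.normal(size=8)    # e.g. terrain height samples around the feet
proprio = rng.normal(size=12)  # e.g. joint positions and contact forces

# Clear view: gate near 1, vision contributes fully.
obs_clear = fuse_observations(extero, proprio, reliability_gate(0.05))

# Fog or soft ground: gate shrinks, policy leans on proprioception.
obs_noisy = fuse_observations(extero, proprio, reliability_gate(4.0))
```

When the gate approaches zero, the exteroceptive features vanish from the observation and the policy effectively falls back on proprioception alone, which is the safe, small-step mode of movement.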

Alex McFarland is a Brazil-based writer who covers the latest developments in artificial intelligence & blockchain. He has worked with top AI companies and publications across the globe.