

AI Uses Visual Appearance to Estimate Distances for Drones


Image: Guido de Croon, TU Delft

A new optical flow-based learning process developed by a team of researchers at TU Delft and the Westphalian University of Applied Sciences enables robots to estimate distances from the visual appearance of objects in view. These appearance cues include shape, color, and texture.

This AI-based learning strategy can improve the navigation of small flying drones.

The article, “Enhancing optical-flow-based control by learning visual appearance cues for flying robots,” was published last month in Nature Machine Intelligence.

Robots vs. Insects

For small flying robots to achieve the level of autonomy we see in large self-driving vehicles, they will need the kind of intelligence found in flying insects, which calls for highly efficient AI systems.

One of the greatest challenges facing this technology is that the small flying robots currently on the market cannot carry enough sensors and processing power onboard.

In the natural world, insects rely on ‘optical flow’: the way objects appear to move across an insect’s field of view. Optical flow is what enables them to land on flowers and evade predators. What is surprising is how simple this cue is, given the complex tasks it supports.

Guido de Croon is a professor of Bio-inspired Micro Air Vehicles and first author of the article. 

“Our work on optical flow control started from enthusiasm about the elegant, simple strategies employed by flying insects,” he said. “However, developing the control methods to actually implement these strategies in flying robots turned out to be far from trivial. For example, our flying robots would not actually land, but they started to oscillate, continuously going up and down, just above the landing surface.”


Optical Flow

There are two main limitations to optical flow. First, it conflates distance and velocity: it tells the robot only the ratio of its speed to the distance of objects in view, not either quantity on its own. Second, optical flow is very small in the direction the drone is moving, which has implications for obstacle avoidance. In other words, the robot has the most difficulty detecting exactly the objects it is moving towards.
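To make both limitations concrete, here is a minimal sketch using a generic pinhole-camera model (an assumption for illustration, not the paper’s setup). For pure forward motion toward a flat surface, the flow at an image point scales with the ratio velocity / distance, so a fast drone far from a wall and a slow drone close to one see identical flow, and the flow vanishes exactly in the direction of travel.

```python
import numpy as np

def radial_flow(velocity, distance, x, y):
    """Flow vector at normalized image point (x, y) for forward motion
    at `velocity` toward a fronto-parallel surface at `distance`."""
    scale = velocity / distance  # flow divergence: the ratio is all you get
    return scale * x, scale * y

# A small grid of image coordinates (normalized, focal length = 1).
xs, ys = np.meshgrid(np.linspace(-0.5, 0.5, 5), np.linspace(-0.5, 0.5, 5))

fast_far = radial_flow(velocity=2.0, distance=10.0, x=xs, y=ys)
slow_near = radial_flow(velocity=0.2, distance=1.0, x=xs, y=ys)

# Limitation 1: the two situations produce exactly the same flow field,
# so velocity and distance cannot be recovered separately.
assert np.allclose(fast_far, slow_near)

# Limitation 2: flow vanishes at the focus of expansion (x = y = 0),
# i.e. precisely in the direction the drone is flying.
print(radial_flow(2.0, 10.0, x=0.0, y=0.0))  # -> (0.0, 0.0)
```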

“We realized that both problems of optical flow would disappear if the robots were able to interpret not only optical flow, but also the visual appearance of objects in their environment,” said Guido de Croon. “This would allow robots to see distances to objects in the scene similarly to how we humans can estimate distances in a still picture. The only question was: How can a robot learn to see distances like that?”

In the new approach developed by the researchers, the robots rely on oscillations to learn what objects in their environment look like at different distances. For example, during landing a drone can learn how fine the texture of grass appears depending on its height.
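The sketch below illustrates this self-supervised idea in miniature, under invented stand-ins: while the drone moves, flow divergence D = v/z yields a distance label z = v/D, which is paired with an appearance cue. A crude gradient-energy texture feature and a linear regressor stand in for whatever features and model the real system uses.

```python
import numpy as np

rng = np.random.default_rng(0)

def distance_from_oscillation(divergence, vertical_velocity):
    """Self-supervised label: while the drone moves, divergence D = v / z,
    so the distance z = v / D (undefined when hovering)."""
    return vertical_velocity / divergence

def texture_coarseness(patch):
    """Crude appearance cue: mean absolute pixel gradient. Textures show
    stronger gradients (finer detail) the closer the camera gets."""
    return np.abs(np.diff(patch, axis=0)).mean() + np.abs(np.diff(patch, axis=1)).mean()

def synthetic_ground_patch(distance):
    """Toy image model: ground texture smoothed more with distance."""
    base = rng.random((32, 32))
    k = max(1, int(distance))  # farther away -> blurrier texture
    return np.convolve(base.ravel(), np.ones(k) / k, mode="same").reshape(32, 32)

# 1. During an oscillating descent, pair motion-derived distance labels
#    with the appearance of the ground at that moment.
features, labels = [], []
for true_z in np.linspace(1.0, 8.0, 40):
    v = 0.5                          # known vertical speed during oscillation
    divergence = v / true_z          # what the flow sensor would read
    labels.append(distance_from_oscillation(divergence, v))
    features.append(texture_coarseness(synthetic_ground_patch(true_z)))

# 2. Fit a simple appearance -> distance regressor (a linear stand-in).
coeffs = np.polyfit(features, labels, deg=1)

# 3. The drone can now estimate height from appearance alone, even while
#    hovering, when optical flow carries no distance information at all.
test_patch = synthetic_ground_patch(4.0)
print("estimated height (m):", np.polyval(coeffs, texture_coarseness(test_patch)))
```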

Christophe De Wagter is a researcher at TU Delft and co-author of the article. 

“Learning to see distance by means of visual appearance led to much faster, smoother landings than we achieved before,” he said. “Moreover, for obstacle avoidance, the robots were now also able to see obstacles in the flight direction very clearly. This not only improved obstacle detection performance, but also allowed our robots to speed up.”

The new development has implications for flying robots with limited resources, and it is especially useful for those operating in confined environments.


Alex McFarland is a tech writer who covers the latest developments in artificial intelligence. He has worked with AI startups and publications across the globe.