
Robotics

Researchers Find That Hearing Can Improve Robot Perception


Today's robots mostly rely on vision and touch to operate in their environments, but this could change with new research coming out of Carnegie Mellon University (CMU). A group of researchers found that robot perception could improve through hearing. 

The research was the first large-scale study of the interactions between sound and robotic action, according to the researchers at CMU’s Robotics Institute. The group found that sound can help robots differentiate between objects, determine what type of action caused a sound, and predict the physical properties of unfamiliar objects.

Lerrel Pinto, who earned his Ph.D. in robotics at CMU, will join the faculty of New York University this fall.

“A lot of preliminary work in other fields indicated that sound could be useful, but it wasn't clear how useful it would be in robotics,” said Pinto.

Pinto and the group of researchers found that performance was fairly high: using sound, a robot could successfully classify objects 76 percent of the time.

Because the results were promising, the group will now explore other options, such as equipping robots with instrumented canes, which would enable them to identify objects by tapping on them.

The group’s findings were presented in July at the virtual Robotics: Science and Systems conference. The team also included Abhinav Gupta, associate professor of robotics, and Dhiraj Gandhi, a research scientist at Facebook Artificial Intelligence Research’s Pittsburgh Lab.

Tilt-Bot in Action

The Study and Dataset

To conduct the study, the researchers created a large dataset by simultaneously recording video and audio of 60 common objects, including toy blocks, hand tools, apples, shoes, and tennis balls, as the objects rolled and crashed around a tray.

The dataset, cataloging roughly 15,000 interactions, has since been released.
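To give a rough sense of what classifying objects from sound involves, here is a minimal sketch of one common approach: summarizing each audio clip as a mel-spectrogram feature vector and training an off-the-shelf classifier. This is an illustrative assumption rather than the CMU team's actual pipeline; the directory layout (clips/<label>/*.wav) and model choice below are hypothetical.

```python
# Minimal sketch of classifying objects from impact sounds.
# Illustrative only: the file layout (clips/<label>/*.wav) and the model
# choice are assumptions, not the pipeline used in the CMU study.
import glob
import os

import librosa
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def clip_features(path, sr=16000):
    """Load an audio clip and summarize it as a fixed-length mel-spectrogram vector."""
    audio, _ = librosa.load(path, sr=sr)
    mel = librosa.feature.melspectrogram(y=audio, sr=sr, n_mels=64)
    log_mel = librosa.power_to_db(mel)
    # Average over time so every clip maps to a 64-dimensional feature vector.
    return log_mel.mean(axis=1)

# Assumed layout: one folder per object class, each containing .wav clips.
features, labels = [], []
for path in glob.glob("clips/*/*.wav"):
    features.append(clip_features(path))
    labels.append(os.path.basename(os.path.dirname(path)))

X = np.stack(features)
y = np.array(labels)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

The study's 76 percent figure comes from a far more careful evaluation over the full Tilt-Bot dataset; the sketch only shows the general shape of the task.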

Tilt-Bot 

The interactions were captured by what the team calls Tilt-Bot, a square tray attached to a Sawyer robot arm. The Sawyer arm was also used to push objects along a surface to collect other types of data.

Researchers have explored how sound could benefit intelligent robots for a while, but what is new here is the scale of the dataset.

One of the new findings was that a robot could use what it had learned about the sounds of one set of objects to predict the physical properties of another set of objects it had never seen.

“I think what was really exciting was that when it failed, it would fail on things you expect it to fail on,” Pinto said. For instance, a robot couldn't use sound to tell the difference between a red block and a green block. “But if it was a different object, such as a block versus a cup, it could figure that out.”
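As a simplified illustration of what predicting a physical property of an unseen object from sound might look like, the same kind of audio features could be fed to a regressor trained on objects the robot has heard and then evaluated on one it has not. Everything below, including the mass values, file layout, and model choice, is a hypothetical sketch rather than the method used in the study.

```python
# Illustrative sketch: regressing a physical property (here, a hypothetical
# mass in grams) from impact-sound features, then testing on an object the
# model has never heard. All names, paths, and values are assumptions.
import glob

import librosa
import numpy as np
from sklearn.linear_model import Ridge

def clip_features(path, sr=16000):
    """Summarize an audio clip as a fixed-length log-mel feature vector."""
    audio, _ = librosa.load(path, sr=sr)
    mel = librosa.feature.melspectrogram(y=audio, sr=sr, n_mels=64)
    return librosa.power_to_db(mel).mean(axis=1)

# Hypothetical per-object mass labels, keyed by folder name.
object_mass = {"toy_block": 45.0, "apple": 150.0, "shoe": 300.0, "tennis_ball": 58.0}
train_objects = ["toy_block", "apple", "shoe"]  # objects heard during training
test_objects = ["tennis_ball"]                  # held out entirely

def build(objects):
    X, y = [], []
    for name in objects:
        for path in glob.glob(f"clips/{name}/*.wav"):
            X.append(clip_features(path))
            y.append(object_mass[name])
    return np.stack(X), np.array(y)

X_train, y_train = build(train_objects)
X_test, y_test = build(test_objects)

reg = Ridge(alpha=1.0).fit(X_train, y_train)
print("predicted mass of the unseen object:", float(reg.predict(X_test).mean()))
```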

The research was supported by the Defense Advanced Research Projects Agency and the Office of Naval Research.

 

Alex McFarland is an AI journalist and writer exploring the latest developments in artificial intelligence. He has collaborated with numerous AI startups and publications worldwide.