Researchers at the Massachusetts Institute of Technology have developed a fully integrated robotic arm that fuses visual data from a camera with radio frequency (RF) information from an antenna to locate and retrieve lost objects, even when those objects are buried or out of view.
The system is called RFusion, and the prototype relies on RFID tags, which are inexpensive, battery-less tags that reflect signals sent by an antenna. RF signals can travel through most surfaces, so RFusion can locate a tagged item even if it is buried.
The robotic arm uses machine learning to automatically zero in on the object’s exact location. It can then move aside the items piled on top of it, grasp the object, and verify that it has picked up the right one. Because the camera, antenna, robotic arm, and AI are fully integrated, the system can operate in any environment without extensive setup.
According to the researchers, RFusion could be used for applications like sorting through piles to fulfill orders in a warehouse, or to identify and install components in an auto manufacturing plant.
Fadel Adib is the senior author and an associate professor in the Department of Electrical Engineering and Computer Science. Adib is also director of the Signal Kinetics group in the MIT Media Lab.
“This idea of being able to find items in a chaotic world is an open problem that we’ve been working on for a few years. Having robots that are able to search for things under a pile is a growing need in industry today. Right now, you can think of this as a Roomba on steroids, but in the near term, this could have a lot of applications in manufacturing and warehouse environments,” said Adib.
Other co-authors of the research include research assistant Tara Boroushaki, the lead author; electrical engineering and computer science graduate student Isaac Perper; research associate Mergen Nachin; and Alberto Rodriguez, the Class of 1957 Associate Professor in the Department of Mechanical Engineering.
The research is set to be presented at the Association for Computing Machinery Conference on Embedded Networked Sensor Systems next month.
Training the System
The researchers used reinforcement learning to train a neural network that can optimize the robot’s trajectory to an object.
“This is also how our brain learns. We get rewarded from our teachers, from our parents, from a computer game, etc. The same thing happens in reinforcement learning. We let the agent make mistakes or do something right and then we punish or reward the network. This is how the network learns something that is really hard for it to model,” Boroushaki explains.
The network was rewarded when it limited both the number of moves needed to localize the item and the distance traveled to pick it up.
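As a rough illustration of the reward structure described above, a per-episode reward might penalize both the number of localization moves and the distance traveled, with a bonus for a successful pick. The function name, weights, and bonus below are hypothetical, chosen for illustration; the article does not specify RFusion’s actual reward.

```python
def episode_reward(num_moves, distance_traveled, success,
                   move_penalty=1.0, distance_penalty=0.5, success_bonus=10.0):
    """Toy reward: fewer moves and shorter travel yield a higher reward.

    All weights here are illustrative assumptions, not values from the
    RFusion paper.
    """
    reward = -move_penalty * num_moves - distance_penalty * distance_traveled
    if success:
        reward += success_bonus
    return reward

# A retrieval that needs fewer moves and less travel scores higher,
# which is the behavior the training process reinforces.
efficient = episode_reward(num_moves=3, distance_traveled=1.2, success=True)
wasteful = episode_reward(num_moves=8, distance_traveled=4.0, success=True)
```

In reinforcement-learning terms, maximizing a reward like this pushes the learned policy toward trajectories that localize and grasp the object in as few, short moves as possible.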
The researchers tested the system in several environments, including one in which a keychain was buried in a box full of clutter and another in which a remote control was hidden under a pile of items on a couch.
To avoid overwhelming the system, they summarized the RF measurements and limited the visual data to the area directly in front of the robot. This approach yielded a 96 percent success rate when retrieving objects fully hidden under a pile.
“Sometimes, if you only rely on RF measurements, there is going to be an outlier, and if you rely only on vision, there is sometimes going to be a mistake from the camera. But if you combine them, they are going to correct each other. That is what made the system so robust,” Boroushaki says.
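One simple way to see why two noisy sensors can correct each other is inverse-variance weighting of their position estimates: the less certain sensor contributes less, so a single outlier reading gets pulled back toward the other sensor. This is a generic sensor-fusion sketch, not RFusion’s actual algorithm, and all names and numbers are illustrative.

```python
def fuse_estimates(rf_pos, rf_var, cam_pos, cam_var):
    """Inverse-variance weighted fusion of two scalar position estimates.

    Each estimate is weighted by the reciprocal of its variance, so the
    noisier measurement has less influence on the fused result.
    """
    w_rf = 1.0 / rf_var
    w_cam = 1.0 / cam_var
    return (w_rf * rf_pos + w_cam * cam_pos) / (w_rf + w_cam)

# An RF outlier with high variance barely shifts the fused estimate,
# which stays close to the more confident camera reading.
fused = fuse_estimates(rf_pos=5.0, rf_var=4.0, cam_pos=1.0, cam_var=0.25)
```

This captures the intuition in the quote above: when one modality produces an outlier, combining it with the other modality keeps the overall estimate robust.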
Matthew S. Reynolds is a CoMotion Presidential Innovation Fellow and an associate professor of electrical and computer engineering at the University of Washington.
“Every year, billions of RFID tags are used to identify objects in today’s complex supply chains, including clothing and lots of other consumer goods. The RFusion approach points the way to autonomous robots that can dig through a pile of mixed items and sort them out using the data stored in the RFID tags, much more efficiently than having to inspect each item individually, especially when the items look similar to a computer vision system,” says Reynolds. “The RFusion approach is a great step forward for robotics operating in complex supply chains where identifying and ‘picking’ the right item quickly and accurately is the key to getting orders fulfilled on time and keeping demanding customers happy.”
The researchers now plan to increase the speed of the system so that it can move smoothly.