

WiFi Helps Robots Navigate Indoor Environments



Engineers at the University of California San Diego have developed a low-cost, low-power technology that helps robots map their indoor environments. The system helps robots navigate even in low lighting or in spaces with no recognizable landmarks or features. 

The team of researchers belongs to the Wireless Communication Sensing and Networking Group, which is led by UC San Diego electrical and computer engineering professor Dinesh Bharadia. The work will be presented at the 2022 International Conference on Robotics and Automation (ICRA) in Philadelphia, which runs from May 23 to 27. 

The research was published in IEEE Robotics and Automation.

A Brand New Approach

The newly developed technology uses sensors that rely on WiFi signals to help the robot map its environment and track its path. It represents a brand new approach to indoor robot navigation, distinct from previous systems that depend on optical sensors such as cameras and LiDARs.

The “WiFi” sensors use radio frequency signals rather than light or visual cues to see, which lets them work in environments where cameras and LiDARs struggle: spaces with low or changing light, or repetitive layouts such as long corridors. 

WiFi helps robots navigate indoors

Alternative to LiDARs

Because WiFi hardware is cheap and ubiquitous, the technology is an economical alternative to LiDARs, which are expensive and require a lot of power. 

“We are surrounded by wireless signals almost everywhere we go. The beauty of this work is that we can use these everyday signals to do indoor localization and mapping with robots,” said Bharadia.

Aditya Arun is an electrical and computer engineering Ph.D. student in Bharadia’s lab and first author of the study. 

The researchers built the prototype system with off-the-shelf hardware. It consists of a robot equipped with WiFi sensors built from commercially available WiFi transceivers. These WiFi sensors transmit and receive wireless signals to and from WiFi access points in the environment, and this communication is what enables the robot to map its location and direction of movement. 

Roshan Ayyalasomayajula is also an electrical and computer engineering Ph.D. student in Bharadia’s lab, as well as a co-author of the study. 

“This two-way communication is already happening between mobile devices like your phone and WiFi access points all the time — it's just not telling you where you are,” said Ayyalasomayajula. “Our technology piggybacks on that communication to do localization and mapping in an unknown environment.”

At first, the WiFi sensors know neither the robot’s location nor where the WiFi access points are in the environment. As the robot moves, the sensors ping the access points and listen for their replies, which are then used as landmarks. 

Every incoming and outgoing wireless signal carries its own unique physical information that can be used to work out where the robot and the access points are in relation to each other. Algorithms running on the system extract this information and perform those calculations. As the sensors gather more measurements, the system steadily refines its estimate of where the robot is and where it is heading. 
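The article does not spell out which physical signal properties the algorithms use; one common choice in WiFi localization is the bearing (angle of arrival) of each signal. As a simplified illustration of the landmark idea, the sketch below triangulates a robot's 2D position from bearings measured toward access points at known positions. The function name and setup are hypothetical, and this is far simpler than the real system, where the access point positions are themselves estimated as part of mapping.

```python
import math

def locate_from_bearings(aps, bearings):
    """Least-squares 2D position estimate from bearings to known landmarks.

    aps      -- list of (px, py) access point positions (landmarks)
    bearings -- measured angle (radians, global frame) from the robot
                toward each corresponding access point
    Needs at least two non-parallel bearings.
    """
    # Each AP must lie on the ray from the robot (x, y) along bearing th:
    #   (px - x) * sin(th) - (py - y) * cos(th) = 0
    # which rearranges to the linear equation
    #   sin(th) * x - cos(th) * y = sin(th) * px - cos(th) * py.
    # Stack one equation per AP and solve via the 2x2 normal equations.
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (px, py), th in zip(aps, bearings):
        s, c = math.sin(th), math.cos(th)
        r = s * px - c * py
        a11 += s * s
        a12 += -s * c
        a22 += c * c
        b1 += s * r
        b2 += -c * r
    det = a11 * a22 - a12 * a12
    x = (a22 * b1 - a12 * b2) / det
    y = (a11 * b2 - a12 * b1) / det
    return x, y


# Toy example: three access points seen from a robot at (1, 2).
aps = [(4.0, 2.0), (1.0, 6.0), (5.0, 5.0)]
bearings = [math.atan2(py - 2.0, px - 1.0) for px, py in aps]
print(locate_from_bearings(aps, bearings))  # recovers approximately (1.0, 2.0)
```

With noise-free bearings the least-squares solution recovers the true position exactly; with real, noisy angle estimates, adding more access points and more measurements over time averages the error down, which matches the article's description of the sensors improving as they pick up more information.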

The technology was tested on a floor of an office building, where several access points were placed around the space. A robot was then equipped with the WiFi sensors, along with a camera and a LiDAR that took measurements for comparison. The team drove the robot around the floor several times, turning corners and traversing long, narrow corridors through both brightly and dimly lit spaces. 

The tests demonstrated that the accuracy of localization and mapping provided by the WiFi sensors was on par with that of the commercial camera and LiDAR sensors. 

“We can use WiFi signals, which are essentially free, to do robust and reliable sensing in visually challenging environments,” said Arun. “WiFi sensing could potentially replace expensive LiDARs and complement other low cost sensors such as cameras in these scenarios.”

The team will now work to combine WiFi sensors and cameras to develop an even more complete mapping technology.

Alex McFarland is a Brazil-based writer who covers the latest developments in artificial intelligence. He has worked with top AI companies and publications across the globe.