International robotics researchers have developed an Urban Search And Rescue (USAR) system that uses a ‘steerable’ live cockroach to transport a miniaturized machine learning-enabled infrared survivor detection system through otherwise unnavigable disaster scenarios.
The hybrid rescue system features a machine learning model for human detection, trained on infrared (IR) images, which drives the mobile IR detection system. The system can operate autonomously as necessary, reporting located survivors back to a base operating station.
Limited Local Resources
The machine learning framework must operate on extraordinarily slim resources: only 191.8kB of static RAM and 1988kB of Flash memory remain available for the detection system once the device's core functions are accounted for, and the same power budget must also supply the electrical stimulus for the insect.
The three functional blocks of the cockroach’s backpack, pictured above, include wireless stimulation, a primary controller unit and peripheral components, with the IR-derived machine learning system and the navigation functionality embedded into the primary controller unit. The rig’s circuit has been split into several components in order to better fit the geometry of the cockroach.
The Madagascar cockroach (among the largest species in the world, with a maximum load capacity of 15g) is controlled by small electrical shocks that steer it in one direction or another, implemented by four electrodes implanted into the creature's antennae and into its cerci (paired appendages at the rear of the abdomen). The electrodes are secured with beeswax.
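The steering scheme described above can be sketched as a simple command-to-pulse mapping. This is an illustrative assumption, not the published stimulation protocol: the channel names, pulse durations and voltages below are hypothetical, chosen only to show how antennal and cercal stimulation might be dispatched.

```python
from dataclasses import dataclass

@dataclass
class StimulusPulse:
    channel: str        # which implanted electrode pair to drive
    duration_ms: int    # pulse length (illustrative value)
    amplitude_v: float  # stimulation voltage (illustrative value)

def steer(command: str) -> StimulusPulse:
    """Translate a high-level steering command into a single pulse.
    Stimulating one antenna makes the insect turn away from that side;
    stimulating the cerci elicits a forward escape response."""
    mapping = {
        "left": StimulusPulse("right_antenna", 50, 1.5),
        "right": StimulusPulse("left_antenna", 50, 1.5),
        "forward": StimulusPulse("cerci", 30, 1.0),
    }
    return mapping[command]
```

In a real controller the pulse parameters would be tuned per insect, since stimulus response varies between individuals.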
Adding AI To Insect Search And Rescue
The new initiative develops prior work from UC Berkeley and Singapore’s Nanyang Technological University, which first conceived of using steerable beetles in USAR scenarios.
Though beetles have the added ability of flight, their load capacity is correspondingly reduced, lowering the potential capabilities of onboard technologies and pushing power consumption demands to a critical level, particularly where it is necessary to run a machine learning algorithm.
The integrated navigation system directs the motion of the cockroach, guiding the 'biobot' to a predetermined destination without any prior knowledge of the intervening obstacles. For the most part, the cockroach's own formidable navigational instincts solve the problems of arriving at an otherwise unreachable location.
The onboard infrared system captures images at 1Hz, operating successfully in dark areas, and wirelessly reporting any located survivors to a command center in real time. To conserve energy, the image analysis system only begins operating if triggered by a positive infrared reading.
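The trigger-gated power-saving logic can be illustrated with a minimal sketch. The threshold value, frame format (nested lists of pixel intensities) and `classify` callback are assumptions for illustration; they are not drawn from the published implementation.

```python
# Assumed 8-bit thermal intensity cutoff for the cheap pre-check.
IR_TRIGGER_THRESHOLD = 200

def frame_has_heat(frame):
    """Cheap pre-check: is any pixel warm enough to warrant inference?"""
    return any(px >= IR_TRIGGER_THRESHOLD for row in frame for px in row)

def process_frame(frame, classify):
    """Run the expensive classifier only on frames that pass the
    infrared trigger; otherwise stay idle and conserve energy."""
    if not frame_has_heat(frame):
        return None
    return classify(frame)
```

The point of the two-stage design is that the linear pixel scan costs far less energy than a full model invocation, so most frames never reach the classifier.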
The model occupies a mere 18.3kB of Flash memory and 52.2kB of static RAM, achieving a computation time of 95 milliseconds. Reasonable processing time is essential in a USAR scenario, since distant infrared signatures could otherwise be missed in the processing interval as the cockroach changes direction and traverses terrain.
The system also features sensors to monitor temperature, humidity and CO2, in order to report local conditions for a possible rescue attempt, and to allow the control center to steer the creature away from any situation that would endanger it.
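The safety check implied above, comparing sensor readings against survivability limits before steering the insect onward, can be sketched as follows. All threshold values here are illustrative assumptions, not figures from the study.

```python
# Hypothetical safe operating bands for the insect (assumed values).
SAFE_LIMITS = {
    "temperature_c": (5.0, 40.0),
    "humidity_pct": (20.0, 95.0),
    "co2_ppm": (0.0, 5000.0),
}

def is_safe(readings):
    """Return True only if every reported reading falls inside its
    assumed safe band; the control center would steer the creature
    away from any location that fails this check."""
    return all(
        lo <= readings[key] <= hi
        for key, (lo, hi) in SAFE_LIMITS.items()
    )
```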
Testing In Simulated Terrains
The system was tested in a simulated disaster scenario (image above), with effective obstacle negotiation, except in certain particularly challenging configurations involving vertiginous climbs, since the cockroach is at a weight disadvantage due to the attached equipment.
The navigation architecture initially had a tendency to get stuck at insurmountable obstacles, situations where even the cockroach could proceed no further, and the researchers subsequently developed a predictive feedback navigation system to improve performance in the face of tall obstacles. The system achieved a 100% success rate in environments with no or low obstacles, and an improved, though not perfect, success rate with tall obstacles once the predictive feedback was in place.
Where failure did occur, the researchers conclude that it could be remedied by allowing the run more time, though this has obvious implications in a time-critical USAR scenario.
The onboard infrared camera has modest specifications, operating at 32×32 pixels with a 90-degree field of view. The images, when triggered, are passed through a median denoising filter.
The system achieves an 87% success rate in distinguishing human subjects from other types of thermal signature, rising to 90% at a proximity of between 0.5m and 1.5m.
Due to energy and chip-size constraints, the initial study does not feature an onboard localization system, and thus it’s not possible to track the position of the cockroach in real time. The researchers suggest that dead reckoning could be implemented as a power-saving solution, with low-energy location signals relayed back to the control center, in future implementations.
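The dead-reckoning approach the researchers suggest amounts to integrating heading and speed over time to estimate position without dedicated localization hardware. The sketch below assumes heading and speed samples are somehow available (e.g. from the stimulation history and an estimated gait speed); both sources are assumptions, not part of the published system.

```python
import math

def dead_reckon(start, steps):
    """Integrate (heading_rad, speed_m_s, dt_s) samples from a known
    start position to estimate the current (x, y) position in metres.
    Errors accumulate over time, so periodic correction from relayed
    low-energy location signals would still be needed."""
    x, y = start
    for heading, speed, dt in steps:
        x += speed * dt * math.cos(heading)
        y += speed * dt * math.sin(heading)
    return x, y
```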
Insects As Search And Rescue Operators
The last ten years have brought a spate of research projects seeking to utilize the resilience and navigational power of insects to create hybrid or pure robotic systems for search and rescue scenarios. Besides the 2016 beetle-related work which precedes this latest initiative, there have been a number of attempts to recreate insectoid capabilities in purely robotic form.
These include a 2019 research project from UoC which offered a mechanically simple robot based on the principles of a cockroach, one of the first projects of its kind to address the extreme fragility of robotic insects.