

Scientists Use Robot to Understand Ant Communication

A team of scientists at the University of Bristol has developed a small robot that helps them understand how ants teach one another. The robot was built to mimic the behavior of rock ants, which rely on one-to-one tuition. 

This one-to-one tuition is what allows an ant that discovers a better nest to teach another individual ant the route to get there. 

The team’s findings were published in the Journal of Experimental Biology.

Understanding “Teaching” Ants

This new knowledge opens up many possibilities: the key elements of teaching among these ants are now largely understood, to the point that the teaching ant can be replaced by a machine.

One of the main aspects of this teaching process involves one ant slowly leading another along a route to the new nest. The following ant learns the route well enough that it can return home and lead yet another ant to the new nest, and the process continues one ant at a time.

Nigel Franks is a professor at Bristol’s School of Biological Sciences. 

“Teaching is so important in our own lives that we spend a great deal of time either instructing others or being taught ourselves,” Prof. Franks says. “This should cause us to wonder whether teaching actually occurs among nonhuman animals. And, in fact, the first case in which teaching was demonstrated rigorously in any other animal was in an ant.” 

The team set out to better understand this teaching, believing that if they could replace the teacher, they would largely understand all of the main elements of the process. 

Constructing and Testing the Bots

To achieve this, the researchers constructed a large arena separating the ants’ old nest, which was purposely made low quality, from a new and improved nest. To direct the robot along either straight or wavy routes, the team placed a gantry on top of the arena that could move back and forth, with a small sliding robot attached to it. They then attached attractive scent glands from a worker ant to the robot, giving it the pheromones of an ant teacher.

“We waited for an ant to leave the old nest and put the robot pin, adorned with attractive pheromones, directly ahead of it,” Prof. Franks said. “The pinhead was programmed to move towards the new nest either on a straight path or on a beautifully sinuous one. We had to allow for the robot to be interrupted in its journey, by us, so that we could wait for the following ant to catch up after it had looked around to learn landmarks.” 

“When the follower ant had been led by the robot to the new nest, we allowed it to examine the new nest and then, in its own time, begin its homeward journey. We then used the gantry to automatically track the path of the returning ant,” he continued.
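The stop-and-wait procedure Prof. Franks describes, in which the leader pauses while the follower looks around to learn landmarks and then resumes, can be sketched as a simple control loop. The code below is an illustrative simulation only: the function name, speeds, and gap threshold are assumptions for the sketch, not the team’s actual robot software.

```python
def tandem_run(route, leader_speed=2.0, follower_speed=1.0, max_gap=3.0):
    """Simulate a tandem run along 1-D waypoints (hypothetical sketch).

    The 'robot' leader advances toward each waypoint, but pauses
    whenever the follower falls more than `max_gap` behind, mirroring
    the interruptions described in the experiment.
    """
    leader = follower = route[0]
    path = [leader]
    for waypoint in route[1:]:
        while leader < waypoint:
            if leader - follower > max_gap:
                # Leader waits while the follower (scanning landmarks)
                # catches up; only the follower moves this step.
                follower = min(follower + follower_speed, leader)
            else:
                leader = min(leader + leader_speed, waypoint)
                follower = min(follower + follower_speed, leader)
            path.append(leader)
    # Let the follower finish its approach to the new nest.
    while follower < leader:
        follower = min(follower + follower_speed, leader)
    return leader, follower, path
```

A "straight" run would use two waypoints, e.g. `tandem_run([0.0, 10.0])`, while a winding route would add intermediate waypoints; either way the pause-and-resume rule keeps the pair together.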

The team discovered that the robot successfully taught the route to the apprentice ants, and the ants knew how to get back to the old nest whether they took a winding or straight path. 

“A straight path might be quicker but a winding path would provide more time in which the following ant could better learn landmarks so that it could find its way home as efficiently as if it had been on a straight path,” Prof. Franks continued.

“Crucially, we could compare the performance of the ants that the robot had taught with ones that we carried to the site of the new nest and that had not had an opportunity to learn the route. The taught ants found their way home much more quickly and successfully.”

The team also included Jacob Podesta, now a PhD student at York, and Edward Jarvis, a former Master’s student in Professor Franks’ lab, both of whom worked on the project as undergraduates. Dr. Alan Workley and Dr. Ana Sendova-Franks also took part in the study.

Alex McFarland is a Brazil-based writer who covers the latest developments in artificial intelligence and blockchain. He has worked with top AI companies and publications across the globe.