Researchers Develop Amphibious Artificial Vision System - Unite.AI

Artificial Intelligence

Researchers Develop Amphibious Artificial Vision System


Artificial vision systems are used across industries for a wide range of applications, such as autonomous vehicles, object detection, and smart cameras. These systems are often inspired by biological organisms, but current artificial vision systems face several limitations. For one, they are usually unable to image both land and underwater environments. They are also typically limited to a hemispherical field-of-view (FOV).

Novel Artificial Vision System

A team of researchers from Korea and the United States set out to overcome these limitations by designing a novel artificial vision system with omnidirectional imaging capability that works in both aquatic and terrestrial environments.

The study was published in the journal Nature Electronics.

Professor Young Min Song from Gwangju Institute of Science and Technology in Korea was involved with the work. 

“Research in bio-inspired vision often results in a novel development that did not exist before. This, in turn, enables a deeper understanding of nature and ensures that the developed imaging device is both structurally and functionally effective,” says Prof. Song. 

Inspired by Nature

The team drew inspiration from the fiddler crab, a semi-terrestrial crab species with amphibious imaging ability and a 360-degree FOV. The crab owes these features to the ellipsoidal stalks that carry its compound eyes, which enable panoramic imaging, and to flat corneas with a graded refractive index profile, which enable amphibious imaging.

The researchers developed a vision system with an array of flat micro-lenses with a graded refractive index profile, integrated with a flexible silicon photodiode array. The assembly was then mounted onto a spherical structure.

The graded refractive index and the flat surface of the micro-lens were optimized to offset the defocusing effects caused by changes in the external medium. This can sound complex, but the team says it can be thought of as making light rays traveling through different media focus at the same spot.
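To see why an ordinary lens defocuses when it moves between air and water, consider the thin-lens (lensmaker's) approximation. The sketch below is a rough illustration with invented lens parameters, not the authors' graded-index design: a lens that focuses at about 10 mm in air focuses roughly four times farther away in water, which is the kind of focal shift the crab-inspired flat, graded-index lens is built to cancel.

```python
def focal_length(n_lens, n_medium, r1_mm, r2_mm):
    """Thin-lens focal length (mm) for a lens of refractive index
    n_lens immersed in a medium of index n_medium, via the
    lensmaker's equation: 1/f = (n_lens/n_medium - 1)(1/R1 - 1/R2)."""
    power = (n_lens / n_medium - 1.0) * (1.0 / r1_mm - 1.0 / r2_mm)
    return 1.0 / power

# Hypothetical biconvex lens: n = 1.5, 10 mm radius on both surfaces
f_air = focal_length(1.5, 1.000, 10.0, -10.0)    # in air (n ~ 1.000)
f_water = focal_length(1.5, 1.333, 10.0, -10.0)  # in water (n ~ 1.333)

print(f"focal length in air:   {f_air:.1f} mm")    # ~10 mm
print(f"focal length in water: {f_water:.1f} mm")  # ~40 mm
```

Because the focal plane moves by tens of millimeters between the two media, a fixed sensor behind a conventional lens cannot stay in focus in both; a graded-index profile with a flat surface is one way to make the effective focal length insensitive to the surrounding medium.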

Testing the System

The team then set out to test the system’s capabilities. They performed optical simulations and imaging demonstrations in air and water, and carried out amphibious imaging by immersing the device halfway in water. The images produced by the system were clear, and the team demonstrated a panoramic visual field of 300 degrees horizontally and 160 degrees vertically in both air and water. The spherical mount measured just 2 centimeters in diameter, making the system compact and portable.

“Our vision system could pave the way for 360° omnidirectional cameras with applications in virtual or augmented reality or an all-weather vision for autonomous vehicles,” says Prof. Song.

Alex McFarland is an AI journalist and writer exploring the latest developments in artificial intelligence. He has collaborated with numerous AI startups and publications worldwide.