One of the greatest challenges facing autonomous vehicles is navigating in bad weather, which limits their deployment in snowy cities like Detroit and Chicago. The vehicles rely on sensor data to detect obstacles and stay on the correct side of the road, and snow degrades that data.
In two new papers presented at SPIE Defense + Commercial Sensing 2021, researchers from Michigan Technological University discussed new solutions for snowy driving scenarios with autonomous vehicles.
Autonomous vehicles span a wide range of capability, from blind-spot monitoring and braking assistance to self-driving modes that can be switched on and off. The most advanced vehicles can operate entirely on their own.
Because the technology is still in its infancy in many ways, automakers and research universities are continuously working to improve the technology and its algorithms. When accidents do occur, they are often the result of a misjudgment by the car’s AI or of human error.
Human drivers rely on their own sensors: the eyes perceive the scene, while the inner ear senses balance and movement. The brain acts as a processor, helping us understand our environment. Together these let us drive in all scenarios, even new ones, because the brain can generalize from past experience.
To mimic human vision, autonomous vehicles typically use two gimbal-mounted cameras that scan the scene and perceive depth through stereo vision, while an inertial measurement unit tracks balance and motion. Computers, on the other hand, can only react to scenarios they have previously encountered or been programmed to recognize.
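The depth-from-stereo idea mentioned above can be sketched with the standard stereo geometry relation, depth = focal length × baseline / disparity. The rig parameters below are illustrative values, not figures from the study:

```python
# Hypothetical stereo rig parameters (illustrative, not from the article).
FOCAL_LENGTH_PX = 700.0   # camera focal length, in pixels
BASELINE_M = 0.54         # separation between the two cameras, in meters

def depth_from_disparity(disparity_px: float) -> float:
    """Depth (meters) of a point whose image shifts by `disparity_px`
    pixels between the left and right cameras.

    Stereo geometry: Z = f * B / d. Larger disparity means the point
    is closer to the cameras.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

# A feature shifted 20 px between the two images:
print(depth_from_disparity(20.0))  # 18.9 (meters)
```

Snow complicates exactly this step: flakes produce spurious matches between the two images, so disparity, and therefore depth, becomes unreliable.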
Autonomous vehicles rely on task-specific artificial intelligence algorithms, which require input from multiple sensors such as fisheye cameras, infrared cameras, radar, and light detection and ranging (lidar).
Nathir Rawashdeh is an assistant professor of computing in Michigan Tech’s College of Computing and one of the lead authors of the study.
“Every sensor has limitations, and every sensor covers another one’s back,” Rawashdeh said. “Sensor fusion uses multiple sensors of different modalities to understand a scene. You cannot exhaustively program for every detail when the inputs have difficult patterns. That’s why we need AI.”
The study’s collaborators included Nader Abu-Alrub, doctoral student in electrical and computer engineering, and Jeremy Bos, assistant professor of electrical and computer engineering. Other collaborators included master’s degree students and graduates from Bos’ lab: Akhil Kurup, Derek Chopp, and Zach Jeffreies.
Autonomous sensors and self-driving algorithms are developed almost exclusively in sunny, clear conditions. Bos’ lab began collecting data from a Michigan Tech autonomous vehicle in heavy snow, gathering more than 1,000 frames of lidar, radar, and image data from snowy roads in Germany and Norway.
According to Bos, sensor detection is hard because snow comes in many forms, which makes it important to pre-process the data and ensure accurate labeling.
“All snow is not created equal,” Bos said. “AI is like a chef — if you have good ingredients, there will be an excellent meal. Give the AI learning network dirty sensor data and you’ll get a bad result.”
Other major challenges include low-quality data, dirt, and snow buildup on the sensors. Even once the sensors are cleared, they do not always agree on detected obstacles: each sensor reaches its own conclusion, and getting them to share their risk assessments and learn from each other is difficult. The team wants the sensors to reach a collective conclusion through sensor fusion.
“Rather than strictly voting, by using sensor fusion we will come up with a new estimate,” Bos said.
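One simple way to "come up with a new estimate" instead of voting is an inverse-variance weighted average: each sensor contributes in proportion to its confidence, so the fused value is a compromise rather than any single sensor's answer. The sensors and noise figures below are illustrative assumptions, not the team's actual fusion method:

```python
def fuse_estimates(estimates):
    """Fuse per-sensor estimates of the same quantity into one value.

    Each entry is (value, variance). Instead of picking one sensor's
    answer by vote, we take an inverse-variance weighted average, so
    confident sensors (low variance) pull the fused value toward them.
    """
    total_weight = sum(1.0 / var for _, var in estimates)
    fused = sum(val / var for val, var in estimates) / total_weight
    fused_variance = 1.0 / total_weight
    return fused, fused_variance

# Hypothetical distance-to-obstacle readings (meters): the camera is
# degraded by snow, the lidar is cluttered, the radar sees through it.
readings = [(12.0, 4.0),    # camera: 12 m, high uncertainty
            (10.5, 1.0),    # lidar: 10.5 m
            (10.0, 0.25)]   # radar: 10 m, most confident
fused, var = fuse_estimates(readings)
```

Here the fused distance lands near 10.2 m, close to the confident radar reading but nudged by the others, and the fused variance is smaller than any single sensor's, which is the payoff of fusing rather than voting.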
Autonomous vehicle perception will continue to learn and improve in bad weather, and approaches like sensor fusion could lead the way for autonomous vehicles on snowy roads.