Driving in snow is a team effort for AI sensors
No one likes driving in a snowstorm, including autonomous vehicles. To make self-driving cars safer on snow-covered roads, engineers look at the problem from the car’s perspective.
A major challenge for fully autonomous vehicles is navigating bad weather. Snow in particular confounds the data from crucial sensors that help a vehicle gauge depth, find obstacles and stay on the correct side of the yellow line, assuming it is visible. Averaging more than 200 inches of snow every winter, Michigan’s Keweenaw Peninsula is the perfect place to push autonomous vehicle technology to its limits. In two papers presented at SPIE Defense + Commercial Sensing 2021, researchers from Michigan Technological University discuss solutions for snowy driving scenarios that could help bring autonomous driving options to snowy cities like Chicago, Detroit, Minneapolis and Toronto.
Much like the weather at times, autonomy is not a sunny or snowy yes-or-no designation. Autonomous vehicles cover a spectrum of levels, from cars already on the market with blind spot warnings or brake assistance, to vehicles that can switch in and out of self-driving modes, to others that can navigate entirely by themselves. Major automakers and research universities are still refining self-driving technology and algorithms. Occasionally accidents occur, caused either by a misjudgment of the vehicle’s artificial intelligence (AI) or by a human driver’s misuse of self-driving features.
Humans also come equipped with sensors: our scanning eyes, our sense of balance and motion, and the processing power of our brain help us understand our surroundings. These seemingly basic inputs allow us to drive in virtually every scenario, even ones that are new to us, because the human brain is good at generalizing from novel experiences. In autonomous vehicles, two cameras mounted on gimbals scan and perceive depth using stereo vision to mimic human vision, while balance and motion can be gauged using an inertial measurement unit. But computers can only react to scenarios they have encountered before or been programmed to recognize.
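The geometry behind that stereo depth perception is simple to sketch. As a rough illustration (not drawn from the Michigan Tech papers), a rectified stereo pair gives depth as focal length times baseline divided by disparity; the focal length, baseline and pixel values below are placeholders:

```python
# Minimal sketch of depth-from-disparity for a calibrated stereo pair.
# The focal length, baseline and disparity values are illustrative,
# not taken from any particular vehicle.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth Z = f * B / d for a rectified stereo camera pair.

    focal_px:     focal length in pixels
    baseline_m:   distance between the two cameras in meters
    disparity_px: horizontal pixel offset of the same point in both images
    """
    if disparity_px <= 0:
        raise ValueError("zero disparity: point at infinity or bad match")
    return focal_px * baseline_m / disparity_px

# A feature shifted 25 pixels between cameras 0.5 m apart, with an
# 800-pixel focal length, sits roughly 16 m ahead.
print(depth_from_disparity(800.0, 0.5, 25.0))  # 16.0
```

The farther away an object is, the smaller its disparity, which is one reason stereo depth estimates degrade at long range and in low-texture scenes like fresh snow.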
Since artificial brains aren’t here yet, task-specific artificial intelligence (AI) algorithms must take the wheel – meaning autonomous vehicles need to rely on multiple sensors. Fisheye cameras widen the view while other cameras act much like the human eye. Infrared picks up thermal signatures. Radar can see through fog and rain. Light detection and ranging (lidar) pierces the darkness and weaves a neon tapestry of laser beam threads.
“Each sensor has limits, and each sensor covers the back of another,” said Nathir Rawashdeh, assistant professor of computer science at Michigan Tech’s College of Computing and one of the study’s lead investigators. He works on bringing the sensors’ data together through an AI process called sensor fusion.
“Sensor fusion uses multiple sensors of different modalities to understand a scene,” he said. “You can’t program exhaustively for every detail when the inputs have difficult patterns. That’s why we need AI.”
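One common way to realize that idea in practice is feature-level fusion: instead of hand-writing rules for each modality, features from several sensors are concatenated and a model learns the mapping from labeled examples. The sketch below is a generic illustration of that pattern, with made-up feature sizes and random stand-in data – the papers’ actual models are not described here:

```python
# Hedged sketch of feature-level sensor fusion: concatenate per-sensor
# features and let a learned model, rather than hand-written rules, decide.
# Feature dimensions and the random training data are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

n_frames = 200
camera_feat = rng.normal(size=(n_frames, 16))  # e.g., image descriptors
lidar_feat = rng.normal(size=(n_frames, 8))    # e.g., point-cloud statistics
radar_feat = rng.normal(size=(n_frames, 4))    # e.g., return intensities

X = np.hstack([camera_feat, lidar_feat, radar_feat])  # fused feature vector
y = rng.integers(0, 2, size=n_frames)                 # 1 = obstacle, 0 = clear

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(model.predict_proba(X[:1]))  # fused obstacle probability for one frame
```

The point of the pattern is exactly Rawashdeh’s: no one writes the rule “if the lidar returns are sparse and the camera is washed out, it is probably snowing” – the model infers such interactions from labeled data.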
Rawashdeh’s collaborators at Michigan Tech include Nader Abu-Alrub, his doctoral student in electrical and computer engineering, and Jeremy Bos, assistant professor of electrical and computer engineering, along with master’s students and graduates from Bos’s lab: Akhil Kurup, Derek Chopp and Zach Jeffries. Bos explains that lidar, infrared and other sensors on their own are like the hammer in an old adage. “To a hammer, everything looks like a nail,” said Bos. “Well, if you have a screwdriver and a rivet gun, you have more options.”
Most autonomous sensors and self-driving algorithms are developed in sunny, clear landscapes. Knowing that the rest of the world is not like Arizona or Southern California, Bos’s lab began collecting local data from a Michigan Tech autonomous vehicle (driven safely by a human) during heavy snowfall. Rawashdeh’s team, including Abu-Alrub, combed through more than 1,000 frames of lidar, radar and image data from snow-covered roads in Germany and Norway to start teaching their AI program what snow looks like and how to see past it.
“Not all snow is created equal,” said Bos, noting that the variety of snow makes it challenging for sensors to detect. Rawashdeh added that preprocessing the data and ensuring accurate labeling is an important step in ensuring accuracy and safety: “AI is like a chef – if you have good ingredients, there will be an excellent meal,” he said. “Give the AI learning network dirty sensor data and you’ll get a bad result.”
Poor-quality data is one problem; actual dirt is another. Much like road grime, snow buildup on the sensors is a solvable but bothersome problem. Once the view is clear, autonomous vehicle sensors still do not always agree about detecting obstacles. Bos mentioned a great example of discovering a deer while cleaning up locally gathered data. Lidar said the blob was nothing (30% chance of an obstacle), the camera saw it as a sleepy human at the wheel (50% chance), and the infrared sensor shouted WHOA (90% sure that was a deer).
Getting the sensors and their risk assessments to talk to and learn from each other is like the Indian parable of the three blind men who find an elephant: each touches a different part of the creature – its ear, trunk and leg – and comes to a different conclusion about what kind of animal it is. Using sensor fusion, Rawashdeh and Bos want the autonomous sensors to collectively figure out the answer – be it elephant, deer or snowbank. As Bos puts it, “Rather than strictly voting, by using sensor fusion we will come up with a new estimate.”
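One simple way to picture that “new estimate” is decision-level fusion: treat each sensor’s confidence as independent evidence and combine them in log-odds space rather than taking a strict vote. The sketch below applies that textbook combination to the deer example’s 30/50/90 percent figures; it is an illustration, not the researchers’ actual method, and a real system would also weight each sensor by its reliability in the current conditions:

```python
# Hedged sketch of decision-level fusion: combine per-sensor confidence
# scores as independent evidence in log-odds space (a naive-Bayes-style
# combination) instead of a strict majority vote. The 0.3 / 0.5 / 0.9
# values mirror the deer example above; a real system would also weight
# each sensor by its learned reliability in the current weather.
import math

def fuse_confidences(probs: list[float]) -> float:
    """Sum the log-odds of independent detections, then map back to a probability."""
    log_odds = sum(math.log(p / (1.0 - p)) for p in probs)
    return 1.0 / (1.0 + math.exp(-log_odds))

lidar, camera, infrared = 0.30, 0.50, 0.90
print(round(fuse_confidences([lidar, camera, infrared]), 2))  # ~0.79: obstacle likely
```

Fused this way, the three readings come out to roughly a 79% chance of an obstacle: the infrared sensor’s strong signal outweighs the lidar’s shrug instead of being outvoted by it.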
While navigating a Keweenaw snowstorm is still out of reach for autonomous vehicles, their sensors can keep getting better at learning about inclement weather and, with advances such as sensor fusion, will one day be able to drive safely on snow-covered roads.