Cleantech Market Intelligence
Weather Still Poses a Hurdle for ADAS and Autonomous Vehicles
Prior to the 20th century, travelers had to directly contend with the vagaries of weather as they made their way from place to place. The invention of the automobile helped put a layer of abstraction between humans and the environment, and we can now sit in relative comfort and look out through windshields that are continuously wiped as we make our way through rain, sleet, and snow. As engineers develop the coming generations of cars that rely on high-tech sensors to observe the world and drive themselves, those sensors now have to deal directly with the elements that humans once faced.
Getting a car to reliably drive itself around a Las Vegas parking lot or the streets of Silicon Valley is by no means a trivial task, but engineers from most of the top automotive OEMs and suppliers have managed to work out the fundamentals. Navigant Research’s Autonomous Vehicles report projects that there will be almost 85 million autonomous-capable vehicles on the road globally by 2035. However, before we can all tear up our driver’s licenses, engineers will have to figure out how to do the same thing in wintry Detroit and monsoon-soaked Mumbai.
In recent months, I’ve evaluated many new vehicles with the latest and greatest advanced driver assist systems (ADAS), including lane departure prevention, forward collision warning with automatic braking, adaptive cruise control, and blind spot monitors. These functions are the building blocks of autonomous vehicles. Engineers are now fusing signals from the ADAS sensors with high-definition maps and new LIDAR sensors to create a comprehensive image of the world around the car. On top of this sensor fusion image, they are building control algorithms that decide when to accelerate, brake, and steer.
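The layered architecture described above can be sketched in a few lines of Python. This is a toy illustration, not any automaker's actual software: the field names, thresholds, and decision rules are all hypothetical, and real systems fuse probabilistic estimates at far higher rates.

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    """One synchronized snapshot from the car's sensors (fields are hypothetical)."""
    camera_lane_offset_m: float   # lateral offset from lane center (camera)
    radar_gap_m: float            # distance to lead vehicle (radar)
    lidar_obstacle_m: float       # nearest obstacle seen in the LIDAR point cloud
    map_speed_limit_mps: float    # speed limit pulled from the HD map

def fuse(frame):
    """Combine independent measurements into one world model, taking the
    more conservative (closer) obstacle estimate from radar and LIDAR."""
    return {
        "lane_offset_m": frame.camera_lane_offset_m,
        "clear_ahead_m": min(frame.radar_gap_m, frame.lidar_obstacle_m),
        "speed_limit_mps": frame.map_speed_limit_mps,
    }

def control(world, speed_mps):
    """Toy decision layer built on top of the fused world model."""
    if world["clear_ahead_m"] < 10.0:
        return "brake"
    if speed_mps < world["speed_limit_mps"] and world["clear_ahead_m"] > 50.0:
        return "accelerate"
    return "hold"
```

The point of the fusion step is that the controller never reasons about raw sensor data; it sees only the merged world model, so a disagreement between radar and LIDAR resolves to the safer estimate before any driving decision is made.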
Unfortunately, if the sensors can’t reliably see the world through the elements, the algorithms simply shut down. Such was the case recently while driving a 2016 Hyundai Sonata hybrid during a snowy morning rush hour. The radar-based adaptive cruise control does a wonderful job of tracking the vehicle ahead as speeds fluctuate, reducing the workload on the driver. However, after about 15 miles, the radar sensor became so caked with slush that a warning appeared in the instrument panel saying the system was disengaging and the sensor should be cleaned. Similarly, the camera on a Volvo XC90 was unable to see the lane markers on a dark, rainy morning and disengaged the lane departure warning system.
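That fail-safe behavior is by design: rather than act on degraded data, the system hands control back to the driver. A minimal sketch of the idea, with an entirely hypothetical signal-to-noise threshold standing in for whatever blockage detection Hyundai actually uses:

```python
def radar_healthy(snr_db, min_snr_db=10.0):
    """Crude blockage check: slush on the radome attenuates returns,
    dropping the signal-to-noise ratio below a usable floor.
    The 10 dB threshold is illustrative, not a real specification."""
    return snr_db >= min_snr_db

def adaptive_cruise_step(snr_db, engaged):
    """Fail gracefully: if the radar goes blind while engaged, disengage
    and tell the driver, instead of tracking phantom targets."""
    if engaged and not radar_healthy(snr_db):
        return False, "ACC disengaged: clean radar sensor"
    return engaged, None
```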
The Road Ahead
In January 2016, Ford became the first automaker to test autonomous vehicles in snowy conditions at the Mcity test track in Ann Arbor, Michigan. With snow covering the roads, features like lane markings and even curbs were not visible. The previous summer, Ford had scanned the track with the LIDAR sensors on its autonomous Fusion vehicles to produce a high-definition 3D map that included all of the physical features and landmarks. Ford used this map to locate the Fusion in the snow by matching landmarks such as signs and buildings, even without seeing the road surface.
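The underlying idea, sometimes called map matching, is that the car scores candidate positions by how well the landmarks it currently sees line up with the landmarks stored in the map. The sketch below is a deliberately simplified stand-in for Ford's production localization, with made-up landmark coordinates; real systems match dense point clouds probabilistically, not three points by brute force.

```python
import math

# Hypothetical HD-map landmarks (sign posts, building corners) in world coordinates.
MAP_LANDMARKS = [(0.0, 0.0), (40.0, 5.0), (80.0, -3.0)]

def localize(observed, candidate_poses):
    """Return the candidate (x, y) pose whose predicted landmark positions
    best match what the LIDAR actually sees.
    `observed` holds landmark positions relative to the vehicle."""
    def score(pose):
        px, py = pose
        total_error = 0.0
        for ox, oy in observed:
            # Where this observed landmark would sit in world coordinates
            # if the vehicle were at `pose`.
            wx, wy = px + ox, py + oy
            # Distance to the nearest mapped landmark.
            total_error += min(math.hypot(wx - mx, wy - my)
                               for mx, my in MAP_LANDMARKS)
        return total_error
    return min(candidate_poses, key=score)
```

Because the score depends only on fixed vertical landmarks, a snow-covered road surface costs the matcher nothing, which is exactly why Ford's approach works when lane markings disappear.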
Delphi has been demonstrating an autonomous Audi Q5 that replaces the spinning rooftop LIDAR sensors with solid-state sensors from Quanergy Systems mounted in the corners of the vehicle. Quanergy, Valeo, and other sensor suppliers are developing sophisticated digital filtering algorithms to help the sensors see better through falling rain and snow. While all of these approaches are helping to make autonomous control more robust, the vision, radar, and LIDAR sensors still need to be kept clear in order to provide this capability. In time, these problems will probably be overcome, but it will take a lot more effort.