
New Modes of Seeing the World Coming to Automated Vehicles

Sam Abuelsamid
Nov 27, 2019


For the past 15 years, red-green-blue (RGB) cameras, radar, and lidar have been the holy trinity of sensors enabling automated vehicles to see the world. That perception suite is starting to evolve as an array of new technologies matures and engineers seek to build more resilient automation. Thermal imaging, near-infrared sensing, and high-definition radar are coming to market over the next several years. These technologies will enable virtual drivers to better understand the world around the vehicle.

How to See

Most engineers agree that multiple, orthogonal sensing modes will be key to enabling software to interpret the environment. Exceptions include Tesla and a number of startups that aim to cut the cost of automated driving by using fewer sensors and, with them, less sensor data to process. Two-dimensional images from RGB cameras that aren’t so different from those in our smartphones can be used to classify objects. But even the most advanced machine learning systems still make errors that a child could spot, and RGB cameras have far less dynamic range and resolution than the human eye.
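The value of orthogonal sensing can be sketched with a toy probability model: if each sensor is treated as an independent observer, a detection that is marginal on any one channel becomes confident once several channels agree. This is an illustration of the principle only, not any vendor’s fusion algorithm, and the probabilities are made up.

```python
def fused_confidence(per_sensor_probs):
    """Combine per-sensor detection probabilities, treating each sensor as
    an independent observer: p_fused = 1 - prod(1 - p_i).
    A toy sketch of why orthogonal modes help, not a production fuser."""
    p_all_miss = 1.0
    for p in per_sensor_probs.values():
        p_all_miss *= (1.0 - p)
    return 1.0 - p_all_miss

# A marginal camera detection becomes a confident track once radar
# and lidar each contribute independent evidence.
conf = fused_confidence({"camera": 0.6, "radar": 0.5, "lidar": 0.7})
```

The independence assumption is exactly why engineers want *orthogonal* modes: sensors that fail in the same conditions (for example, two cameras in fog) would violate it and gain little.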

Lidar and radar can provide far more accurate measurements of the speed, distance, and orientation of objects around the vehicle. Accurate orientation is crucial for prediction algorithms that forecast where other road users will go in the next several seconds, before the automation plans a path. But lidar remains costly, and current radars have severe limitations in distinguishing individual targets.
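The simplest form of the forecasting step described above is a constant-velocity rollout: given a tracked object’s position and velocity (which radar and lidar measure directly), extrapolate its position over the planning horizon. The sketch below is a minimal, hypothetical version; real predictors use far richer motion models.

```python
from dataclasses import dataclass

@dataclass
class Track:
    # State of one tracked road user, as a radar/lidar tracker might report it
    x: float   # position east (m)
    y: float   # position north (m)
    vx: float  # velocity east (m/s)
    vy: float  # velocity north (m/s)

def predict(track, horizon_s, dt=0.5):
    """Constant-velocity forecast of future positions over the horizon,
    sampled every dt seconds. A toy stand-in for a prediction module."""
    steps = int(horizon_s / dt)
    return [(track.x + track.vx * dt * k, track.y + track.vy * dt * k)
            for k in range(1, steps + 1)]

# A cyclist 10 m to the left, moving forward at 4 m/s
path = predict(Track(x=0.0, y=10.0, vx=4.0, vy=0.0), horizon_s=3.0)
```

Note that this model is only as good as the measured velocity and heading, which is why accurate orientation from the sensors matters so much.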

Developing Tech

Thermal imaging sensors from ADASKY and FLIR Systems are expected to reach production vehicles by 2021. These sensors can measure temperature gradients accurately even in complete darkness. Using many of the same image recognition algorithms already developed for RGB cameras, thermal systems can use the temperature data to more reliably recognize living targets such as pedestrians and animals.
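One way the temperature channel helps classification can be sketched as a simple gate: detections whose surface temperature falls in a typical mammalian band are strong candidates for living targets. The thresholds below are illustrative assumptions for this sketch, not values published by ADASKY or FLIR.

```python
def likely_living(surface_temps_c, body_band=(28.0, 40.0)):
    """Flag thermal detections whose surface temperature falls in an
    assumed mammalian body-temperature band (degrees Celsius).
    A toy gate to show how the extra channel aids classification."""
    lo, hi = body_band
    return [lo <= t <= hi for t in surface_temps_c]

# A pedestrian, a mailbox on a cold night, and a dog
flags = likely_living([36.5, 12.0, 31.0])
```

In practice this signal would be combined with shape-based recognition rather than used alone, since sun-warmed surfaces can also land in the band.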

Meanwhile, HD imaging radars from Magna, Arbe Robotics, and others create thousands of virtual signals across the field of view for an almost lidar-like point cloud. This cloud measures instantaneous distance and velocity across the field of view. This can overcome current radar’s inability to distinguish a parked vehicle from stationary roadside objects while maintaining the ability to see through fog, rain, and snow. Near-infrared is another mode that can see through bad weather and poor light. TriEye demonstrates this with its short-wave infrared camera, and Ouster does with its 850 nm lidar. Both of these can generate images that can also use existing image classification algorithms.
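A key ingredient in making sense of a radar point cloud is ego-motion compensation: a stationary object appears to close on the radar at the projection of the vehicle’s own speed onto the line of sight, so subtracting that projection leaves the target’s true radial speed. The sketch below assumes the ego vehicle drives along the +x axis and uses a made-up 0.5 m/s threshold; it is an illustration, not any supplier’s pipeline.

```python
import math

def ego_compensated_speed(x, y, v_radial, ego_speed):
    """Remove the ego vehicle's contribution from a radar point's measured
    radial velocity. Assumes the ego vehicle moves along +x at ego_speed;
    negative v_radial means the target is closing."""
    r = math.hypot(x, y)
    expected_from_ego = -ego_speed * (x / r)  # stationary-world prediction
    return v_radial - expected_from_ego

points = [
    {"x": 30.0, "y": 0.0, "v": -15.0},  # dead ahead, closing at ego speed
    {"x": 20.0, "y": 5.0, "v": -5.0},   # closing slower than a fixed object would
]
ego_speed = 15.0
moving = [p for p in points
          if abs(ego_compensated_speed(p["x"], p["y"], p["v"], ego_speed)) > 0.5]
```

With per-point velocity at high angular resolution, this test can separate a parked car from the guardrail beside it, which is exactly what conventional radar struggles to do.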

Combining multiple sensor types into a single package allows simpler installation and improved performance. AEye does this with its blended RGB camera and lidar system. AEye uses the object classification to guide a variable scan pattern for its laser that enables increased point density on objects of interest.
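The variable-scan idea can be sketched as a budget-allocation problem: the lidar has a fixed number of points per frame, and camera-derived priority scores decide how many land on each region. The region names, scores, and budget below are invented for illustration; AEye has not published this interface.

```python
def allocate_scan_points(regions, total_points=10000):
    """Split a lidar's per-frame point budget across regions of interest
    in proportion to priority scores from a camera-based classifier.
    A toy version of camera-guided (foveated) scanning."""
    total_priority = sum(r["priority"] for r in regions)
    return {r["name"]: int(total_points * r["priority"] / total_priority)
            for r in regions}

regions = [{"name": "pedestrian", "priority": 5.0},
           {"name": "background", "priority": 1.0}]
budget = allocate_scan_points(regions)
```

The payoff is point density where it matters: a pedestrian region flagged by the camera gets several times the point density of empty background, improving range and shape estimates on the objects the planner actually cares about.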

Improving Existing Systems

Add the extended situational awareness provided by HD maps and expanded connectivity, and in the coming years perception systems for automated driving will be far more robust, and operable in a wider range of conditions, than cameras alone allow. However, some of these technologies will arrive well before highly automated vehicles are widespread. One near-term driver for the new sensors is the desire to improve the driver assist systems that are already being deployed. ADASKY’s thermal imager and Continental’s flash lidar will be used in pedestrian-detection automatic braking systems by 2021. The way toward making vehicles safer will include more sensor types, not fewer. Fortunately, the scale that comes from widespread adoption in driver assists will help bring down costs for full automation later.