Navigant Research Blog

Volvo Pioneers Autonomous Vehicles

— March 17, 2015

Volvo has long sold cars that are considered among the safest in the world. Since the 1940s, Volvo has been at the forefront of introducing innovations that include laminated safety glass, crush zones, three-point seatbelts, and more recently, pedestrian detection with automatic braking. As Volvo prepares to launch its first all-new production vehicle since being acquired by China’s Geely Group, the company has announced plans to test highly automated vehicles on public roads near its headquarters in Gothenburg, Sweden.

Self-Driving Cars a Reality

Self-driving vehicles from automakers, suppliers, and technology companies have become commonplace recently on Silicon Valley roads. However, all of those vehicles are under the control of the engineers trying to refine the complex control software required to make them work reliably. Beginning in 2017, Volvo plans to put a fleet of 100 autopilot-equipped XC90 SUVs into the hands of regular Swedish drivers.

Reiterating its oft-stated goal of achieving sustainable mobility and a crash-free future, Volvo has worked to make the autopilot system it is building into the XC90 robust enough to let ordinary drivers give up complete control.

“Making this complex system 99% reliable is not good enough, you need to get much closer to 100% before you can let self-driving cars mix with other road users in real-life traffic,” Erik Coelingh, technical specialist at Volvo, told me. With that in mind, and recognizing the limitations of current sensor technology, Volvo will equip the XC90 with a combined array of radar, lidar, ultrasonic, and camera sensors.

Sensor Array on Autonomous Volvo XC90

(Source: Volvo)

Coelingh acknowledges that there are some fundamental problems that cannot be overcome. For example, lidar sensors cannot see through fog or rain, and cameras cannot see lane markers that are obscured by snow. In addition to using multiple sensor types, Volvo is taking care in packaging the sensors to minimize the risk of obstruction from the elements, such as snow and salt buildup.
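
To illustrate why that redundancy matters, here is a minimal sketch of how detections from several sensor types might be cross-checked, only trusting an object confirmed by at least two healthy sensors. The class names, thresholds, and voting rule are my own assumptions, not Volvo's actual software.

    # Hypothetical sketch of sensor redundancy: an object is only confirmed
    # when at least two independent, currently healthy sensor types agree.
    # Names and thresholds are illustrative, not Volvo's actual software.

    from dataclasses import dataclass

    @dataclass
    class Detection:
        sensor: str        # "radar", "lidar", "camera", or "ultrasonic"
        obj_id: int        # identifier of the tracked object
        confidence: float  # 0.0 .. 1.0 from the sensor's own processing

    def confirmed_objects(detections, degraded_sensors, min_sensors=2, min_conf=0.5):
        """Return object ids seen by enough healthy sensors with enough confidence."""
        votes = {}
        for d in detections:
            if d.sensor in degraded_sensors:      # e.g. lidar blinded by fog
                continue
            if d.confidence < min_conf:
                continue
            votes.setdefault(d.obj_id, set()).add(d.sensor)
        return {obj for obj, sensors in votes.items() if len(sensors) >= min_sensors}

    # Example: the camera barely sees a pedestrian through snow glare, but
    # radar and lidar both report it, so the object is still confirmed.
    dets = [
        Detection("radar", obj_id=7, confidence=0.9),
        Detection("lidar", obj_id=7, confidence=0.8),
        Detection("camera", obj_id=7, confidence=0.2),
    ]
    print(confirmed_objects(dets, degraded_sensors=set()))  # {7}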

The goal is to allow drivers to spend time on secondary tasks without constantly monitoring the system. The vehicles will be able to execute automatic lane changes and to enter and exit a limited-access highway. Soft degradation of the system will extend the time between when the driver is alerted and when they must take over. If the driver does not take control in a timely manner, the vehicle will attempt to pull over and come to a safe stop.
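
That handover behavior can be thought of as a simple state machine: degrade, alert the driver, and initiate a safe stop if the driver does not respond. The sketch below is a hypothetical illustration of that logic; the state names and the 15-second grace period are my assumptions, not Volvo's published design.

    # Hypothetical sketch of the takeover logic described above: on degradation
    # the driver is alerted, and if no response arrives within a grace period
    # the vehicle initiates a safe stop. States and timings are illustrative.

    from enum import Enum, auto

    class Mode(Enum):
        AUTOPILOT = auto()
        TAKEOVER_REQUEST = auto()
        SAFE_STOP = auto()
        MANUAL = auto()

    class FallbackSupervisor:
        def __init__(self, grace_period_s=15.0):
            self.mode = Mode.AUTOPILOT
            self.grace_period_s = grace_period_s
            self.request_time = None

        def update(self, now_s, system_degraded, driver_has_control):
            if driver_has_control:
                self.mode = Mode.MANUAL
            elif self.mode == Mode.AUTOPILOT and system_degraded:
                self.mode = Mode.TAKEOVER_REQUEST      # alert the driver
                self.request_time = now_s
            elif self.mode == Mode.TAKEOVER_REQUEST:
                if now_s - self.request_time > self.grace_period_s:
                    self.mode = Mode.SAFE_STOP         # pull over and stop
            return self.mode

    # Example: degradation at t=0, no driver response, safe stop after 15 s.
    sup = FallbackSupervisor()
    print(sup.update(0.0, system_degraded=True, driver_has_control=False))   # TAKEOVER_REQUEST
    print(sup.update(20.0, system_degraded=True, driver_has_control=False))  # SAFE_STOP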

Fully Autonomous vs. Self-Driving

Despite all of that, there is an important distinction between vehicles that are capable of fully autonomous operation and those that are entirely self-driving. The Volvo falls into the former category, with the ability to handle the driving when conditions permit, while reverting to human control in many scenarios. Google’s prototype pod car, which was designed without a steering wheel or pedals, is in the latter category. For the foreseeable future, driverless vehicles are likely to remain restricted to closed environments where they don’t need to interact with traditional vehicles.

As detailed in Navigant Research’s report, Autonomous Vehicles, 40% of new vehicles will have some form of automated driving capability by 2030. The bulk of those are likely to be similar in concept to what Volvo will be testing on Swedish roads in 2017. Although consumer surveys have indicated strong interest in autonomous vehicles, it’s too early to tell how much of that interest will be retained as consumers become aware of the real-world limitations of autonomous technology. Volvo’s test program in Sweden might give the first real feedback on this topic.

 

Cloud Connections Bolster In-Vehicle Systems

— January 26, 2015

With the average transaction price of a new vehicle in the United States hitting nearly $35,000 at the end of 2014, drivers can be grateful that the cars they purchase are also more durable and reliable than ever before. The average age of the more than 200 million vehicles on the road in the United States today is now nearly 11.5 years. However, that longevity has a big potential downside: as computing and communications technology marches on to improve safety, efficiency, and reliability, many of those existing cars will be incapable of participating in these advances. Luckily, cloud computing could come to the rescue.

According to Navigant Research’s report, Autonomous Vehicles, full-function self-driving vehicles aren’t expected to be available in significant volumes until late in the 2020s. Until the fully self-driving car arrives, we’ll have a steady stream of incremental improvements in advanced driver assistance systems. Thanks to increasing connectivity in vehicles, we’re also less likely to be stuck with the capability that was built in when the vehicle rolled off the assembly line.

No Car Left Behind

General Motors (GM) and Audi are among the manufacturers that are already building 4G LTE radios into many of their new vehicles.  When this capability is combined with advanced new microprocessors from companies like NVIDIA and Qualcomm, vehicles will be able to leverage cloud computing infrastructure to get smarter as they age, rather than being left behind.

At the 2015 Consumer Electronics Show in Las Vegas, NVIDIA unveiled a new-generation 256-core processor, the Tegra X1, along with electronic control units powered by this advanced chip. One of the problems that driver assistance and autonomous systems have to solve is recognizing and distinguishing the objects detected by all of the sensors on new vehicles. The human brain is remarkably adept at distinguishing the nuances between an animal and a pedestrian or an ambulance and a delivery van.

Detection before Failure

This sort of image recognition is far more difficult for a computer, so the Tegra X1 is designed to collect image data from its 12 camera inputs and transmit it back to data centers where it can be aggregated with information from other vehicles. By combining data from many vehicles, the object recognition can be dramatically improved, and updated image libraries can be fed back to vehicles for improved onboard sensing, even without changing hardware.
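
From the vehicle side, that feedback loop might look something like the sketch below: the onboard unit periodically asks a data center whether a newer recognition model is available and swaps it in. The endpoint URL, file layout, and version scheme here are hypothetical assumptions, not NVIDIA's or any automaker's actual interface.

    # Minimal, hypothetical sketch of the cloud feedback loop described above:
    # the onboard unit checks a data center for a newer object-recognition
    # model and installs it without any hardware change. The URL, file layout,
    # and version numbers are illustrative assumptions.

    import json
    import urllib.request
    from pathlib import Path

    MODEL_DIR = Path("/var/adas/models")   # assumed local storage location
    MANIFEST_URL = "https://example-datacenter.invalid/models/manifest.json"

    def current_version():
        version_file = MODEL_DIR / "VERSION"
        return int(version_file.read_text().strip()) if version_file.exists() else 0

    def check_for_update():
        """Download a newer recognition model if the data center offers one."""
        with urllib.request.urlopen(MANIFEST_URL, timeout=10) as resp:
            manifest = json.load(resp)        # e.g. {"version": 42, "url": "..."}
        if int(manifest["version"]) <= current_version():
            return False                      # already up to date
        with urllib.request.urlopen(manifest["url"], timeout=60) as resp:
            (MODEL_DIR / "recognition.model").write_bytes(resp.read())
        (MODEL_DIR / "VERSION").write_text(str(manifest["version"]))
        return True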

GM is also harnessing the power of the cloud to provide drivers with predictive diagnostic information for their vehicles using OnStar.  Available for more than a decade, OnStar provides subscribers with vehicle health reports when faults are detected.  Now, by monitoring critical systems such as the battery, starter, and fuel pump and sending this information back to the cloud, OnStar is able to detect subtle changes in performance that have previously been shown to be precursors to component failures.  The OnStar Driver Assurance system can then notify the driver so that an impending problem can be corrected before the driver is left stranded on the side of the road.  This predictive diagnostic system will be available on several of GM’s 2016 model year vehicles.
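
As a simplified illustration of the idea, the sketch below watches one slowly drifting health signal, the battery voltage during engine cranking, and warns once its rolling average falls below a level assumed to precede failure. The signal choice, threshold, and window size are mine; OnStar's actual models are far more sophisticated.

    # Simplified, hypothetical illustration of predictive diagnostics: watch a
    # slowly drifting health signal (battery voltage during engine cranking)
    # and warn before it reaches the level that typically precedes a failure.
    # The threshold and window size are illustrative, not GM's actual values.

    from collections import deque

    class BatteryHealthMonitor:
        WARN_VOLTS = 9.0   # assumed: a cranking average below this precedes failure
        WINDOW = 10        # average over the last 10 engine starts

        def __init__(self):
            self.recent = deque(maxlen=self.WINDOW)

        def record_start(self, cranking_volts):
            """Record the minimum voltage seen during one engine start."""
            self.recent.append(cranking_volts)
            if len(self.recent) == self.WINDOW:
                avg = sum(self.recent) / self.WINDOW
                if avg < self.WARN_VOLTS:
                    return "Battery degrading: service recommended before failure"
            return None

    monitor = BatteryHealthMonitor()
    for volts in [9.4, 9.3, 9.2, 9.1, 9.0, 8.9, 8.8, 8.7, 8.6, 8.5]:
        alert = monitor.record_start(volts)
    print(alert)  # warning fires once the rolling average drops below 9.0 V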

As automakers roll out new infotainment interfaces, such as Apple CarPlay and Google’s Android Auto, drivers will also benefit from improved voice recognition that leverages massive data centers run by these technology companies. More robust and reliable voice control will help reduce driver frustration and keep their attention on the road, at least until the car can take over completely.

 

How Will Self-Driving Vehicles Find Their Way?

— January 8, 2015

Google continues to push the technology for its autonomous vehicle, but some recent articles in the media have focused more on the detailed mapping required than on any of the other technologies needed to bring such vehicles into production. Google is not the only company interested in this angle. Nokia’s HERE subsidiary is also putting a lot of effort into making high-definition maps that combine detail about roadways with information about traffic flow.

Google has decided that its vehicle must have a detailed map of the roads it will travel on, accurate to a few centimeters, with knowledge of the exact location and height of the curbs, not just the lane markings. Recognizing that these vehicles must also cope with construction and temporary obstacles, HERE is exploring the idea of storing the digital map data in the cloud and updating it on a continual basis.

A Perfect Map

The concept relies on huge amounts of data being constantly uploaded and downloaded to the cloud so that all vehicles always have a highly accurate digital map of their surroundings to rely on.  While this is clearly one potential solution for the future of autonomous vehicles, it’s a concept thought up by two large companies that have already invested heavily in scanning and mapping technology.  It’s natural to find solutions that match the tools already available, and all the better if the solution requires a tool upgrade.
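
From the vehicle's point of view, that concept might look roughly like the sketch below: keep a local cache of map tiles around the current position and refresh any tile whose cloud version is newer, for example after a construction-zone update. The tile scheme, keys, and cloud service here are hypothetical, not HERE's or Google's actual interfaces.

    # Rough, hypothetical sketch of cloud-backed HD map tiles: the vehicle
    # caches the tiles around its position and refreshes any tile whose cloud
    # version is newer. Tile size and the cloud interface are assumptions.

    def tile_key(lat, lon, tile_deg=0.01):
        """Quantize a position into a map tile roughly 1 km on a side."""
        return (round(lat / tile_deg), round(lon / tile_deg))

    class HDMapCache:
        def __init__(self, cloud):
            self.cloud = cloud      # object with latest_version(key) and fetch(key)
            self.tiles = {}         # key -> (version, tile_data)

        def refresh_around(self, lat, lon):
            """Make sure the tiles surrounding the vehicle are up to date."""
            cx, cy = tile_key(lat, lon)
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    key = (cx + dx, cy + dy)
                    latest = self.cloud.latest_version(key)
                    cached = self.tiles.get(key)
                    if cached is None or cached[0] < latest:
                        self.tiles[key] = (latest, self.cloud.fetch(key))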

In a previous blog, I wrote about self-driving vehicle developments in China.  It seems to me that putting more intelligence into each vehicle, to deal with real-time traffic issues, is a more practical option than requiring a highly accurate database of all the world’s roads that is updated minute by minute.  Existing digital maps can be used to provide direction just as they do for human drivers today, and powerful, intelligent sensors can monitor the local traffic and obstacles in real time.  I suspect this is how the major automakers are moving forward with autonomous vehicle technology – and why nobody has yet jumped on Google’s offer of a partnership.

 

Sensor Technology Not Yet Ready for Self-Driving Cars

— December 30, 2014

According to Navigant Research’s report, Autonomous Vehicles, some limited self-driving vehicles will arrive by 2020, but widespread adoption of full-function autonomous vehicles won’t happen until at least the late 2020s.  Over the next 15 years, manufacturers are expected to continue making incremental progress with more capable assist systems and semi-autonomous systems, such as the Super Cruise system recently announced by General Motors and the Advanced Highway Driving Assist from Toyota.

Many of the vehicles sold today already contain most of the essential building blocks to enable them to operate autonomously.  However, a new study from AAA indicates those pieces are not yet all that reliable or consistent.  Based on that, it’s reasonable to deduce that we cannot yet trust those systems for full automation, and drivers must remain fully engaged in vehicle operation.

AAA recently conducted a series of tests of blind spot monitoring and lane departure warning systems and found that the performance can vary widely among different vehicles and under different conditions.  “AAA’s tests found that these systems are a great asset to drivers, but there is a learning curve,” says John Nielsen, AAA’s managing director of Automotive Engineering.

Enhancement, Not Replacement

Automakers have always marketed these features as assist systems meant to augment rather than replace the control of an attentive driver.  For example, the lane-keeping assist that is part of many such systems can automatically provide some correction to help prevent the vehicle from drifting out of a lane.  However, these systems typically also monitor the driver using sensors in the steering wheel or torque feedback in the steering column to prevent hands-off driving.
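
In practice, that hands-on check usually amounts to watching for a small amount of driver torque on the steering wheel and escalating if none is sensed for a while. The sketch below is a hypothetical version of that logic; the torque threshold and timings are illustrative assumptions, not any automaker's calibration.

    # Hypothetical sketch of hands-on-wheel monitoring for lane-keeping assist:
    # if no driver torque is sensed for several seconds, warn the driver, and
    # disengage the assist shortly after that. Values are illustrative.

    TORQUE_THRESHOLD_NM = 0.3   # assumed minimum torque that counts as "hands on"
    WARN_AFTER_S = 10.0
    DISENGAGE_AFTER_S = 15.0

    def hands_off_action(seconds_since_torque):
        if seconds_since_torque >= DISENGAGE_AFTER_S:
            return "disengage_assist"
        if seconds_since_torque >= WARN_AFTER_S:
            return "warn_driver"
        return "none"

    def update(now_s, last_torque_time_s, steering_torque_nm):
        """Return the current action and the updated time of last detected torque."""
        if abs(steering_torque_nm) >= TORQUE_THRESHOLD_NM:
            last_torque_time_s = now_s
        return hands_off_action(now_s - last_torque_time_s), last_torque_time_s

    # Example: no torque for 12 s -> warning; for 16 s -> assist disengages.
    print(update(12.0, 0.0, 0.0)[0])  # warn_driver
    print(update(16.0, 0.0, 0.0)[0])  # disengage_assist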

Advanced Driver Assist Sensors: Ford Fusion

(Source: Ford Motor Co.)

Similarly, blind spot monitoring sensors don’t negate the need to check mirrors regularly while driving. The sensitivity and field of view of each system depend on where the manufacturer has positioned the sensors behind the rear bumper cover. As a result, each of the vehicles tested by AAA detected vehicles or cyclists in the adjacent lanes at different points.

My own experience driving a wide variety of vehicles equipped with both types of assists has been as spotty as the AAA results indicate. Lane departure systems use digital cameras and sophisticated image processing algorithms to look for lane markers painted on the pavement, and the systems currently on the market only function at speeds above roughly 35 to 40 mph. The problem arises when the lane markers aren’t clearly visible or don’t exist at all. On a rural road or residential street with no markings, you’re completely on your own.
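
Put another way, the warning is gated on speed and on how confident the camera is that it can actually see lane markings; below the speed threshold, or when the markings are too faint, the system silently does nothing. Here is a hypothetical version of that gating, with illustrative thresholds rather than any manufacturer's real values:

    # Hypothetical sketch of the gating described above: lane departure
    # warnings are only issued above a minimum speed and when the camera
    # reports enough confidence in the detected lane markings.

    MIN_SPEED_MPH = 40           # assumed activation speed (systems vary, ~35-40 mph)
    MIN_MARKER_CONFIDENCE = 0.6  # assumed minimum camera confidence in the markings

    def lane_departure_warning(speed_mph, marker_confidence, offset_from_center_m,
                               lane_half_width_m=1.8):
        """Return True if a warning should be issued for drifting out of the lane."""
        if speed_mph < MIN_SPEED_MPH:
            return False                 # system inactive at low speed
        if marker_confidence < MIN_MARKER_CONFIDENCE:
            return False                 # markings faded, snow-covered, or absent
        return abs(offset_from_center_m) > lane_half_width_m

    # On an unmarked rural road the camera confidence stays low, so no warning
    # is ever produced, exactly the "you're on your own" case described above.
    print(lane_departure_warning(55, marker_confidence=0.1, offset_from_center_m=2.0))  # False
    print(lane_departure_warning(55, marker_confidence=0.9, offset_from_center_m=2.0))  # True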

Alerts and Alarms

Another problem is the lack of consistency in how different manufacturers present alerts to drivers. Vehicles can use audible, visual, or haptic alerts, or a combination of these. Sometimes the systems are overly sensitive and trigger so many alerts that drivers are tempted to disable them to avoid the annoyance, defeating the purpose. Other times, the monitors don’t provide an alert until it’s too late to be useful.
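
One common way to strike that balance is to require a condition to persist briefly before alerting and then to suppress repeats for a short cool-down so the driver isn't nagged. The sketch below is a generic illustration of that kind of debouncing; the intervals are assumptions, not values from any tested vehicle.

    # Hypothetical sketch of alert debouncing: a condition must persist for a
    # short time before the driver is alerted, and repeat alerts are suppressed
    # during a cool-down period. Both intervals are illustrative assumptions.

    class AlertDebouncer:
        def __init__(self, persist_s=0.5, cooldown_s=5.0):
            self.persist_s = persist_s
            self.cooldown_s = cooldown_s
            self.condition_since = None
            self.last_alert = None

        def update(self, now_s, condition_active):
            if not condition_active:
                self.condition_since = None
                return False
            if self.condition_since is None:
                self.condition_since = now_s
            persisted = now_s - self.condition_since >= self.persist_s
            cooled_down = (self.last_alert is None
                           or now_s - self.last_alert >= self.cooldown_s)
            if persisted and cooled_down:
                self.last_alert = now_s
                return True              # issue one audible/visual/haptic alert
            return False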

Sensing systems will need to be robust enough to provide accurate warnings or control inputs under all driving conditions, and designers will have to develop human-machine interfaces that provide information to drivers without being overwhelming.

 
