Navigant Research Blog

Questions Aplenty About Interacting with Automated Vehicles

— March 31, 2017

Over the past 130 years, the interface between human and machine has become relatively standardized. We have steering wheels, pedals, seats, mirrors, and other major controls in roughly the same locations no matter what brand or type of vehicle we use. We've made adaptations, such as additional hand controls for those who have physical disabilities, but overall, the experience is consistent. But when those controls are eliminated in automated vehicles (AVs), as Ford will do in the AV it intends to produce in 2021, designers have opportunities to rethink vehicle cabins. And those opportunities raise a few questions about interacting with AVs.

User Experience

The more you know, the more you realize how much you don’t know. As we accelerate toward an era where humans are no longer in direct control of the vehicles we move around in, it’s clear that making a car drive itself is only the beginning of the task at hand. A panel at the recent Automotive Megatrends Autonomous Car conference in Detroit examined some of the questions around the user experience (UX) with automation.

A crucial aspect of the human-machine interface is the seating and what we see as we move through the world. Over the last several years, automakers and suppliers have revealed a number of fascinating concepts for cars of the future, such as the Mercedes-Benz F015, Nissan IDS, and BMW HoloActive Touch.

One of the seemingly more appealing ideas about not having to drive is that vehicle occupants could be repositioned so they can interface with each other instead of the vehicle. However, a driving force behind the design of modern vehicles is the need to protect occupants in the event of a crash. They must be properly positioned in order for airbags to provide protection. While AVs are likely to cause far fewer crashes, they will still have to coexist with the more than 1.2 billion vehicles on the road today and for decades to come. That means that unless we ban human-driven vehicles, AVs still have to conform to the same safety standards and seat rotation will be limited to small angles.

Then there is the whole issue of motion sickness. Many people experience physical symptoms when there is a disconnect between what their eyes see and their body feels during motion. If we go from driving to watching or reading during our commutes, this could become a design issue.

Voice Recognition Systems

Another question regarding interacting with AVs: How will we let self-driving vehicles know where we want to go? For all the attention that devices like Amazon's Echo have received in the past couple of years, voice recognition systems remain frustrating to use. Companies like Google and Nuance have made huge strides in improving the reliability of these systems when they are connected to the cloud, but even the most advanced machine learning systems continue to struggle with natural language semantics and accents. There is an enormous difference between recognizing individual spoken words and understanding the meaning imparted by stringing a series of words together.

Humans are remarkably adaptable, and we will likely adjust our own speech patterns to the limitations of the machines before the machines can reliably understand us. Or, if the technology can't make our lives less frustrating, we may simply reject it.

 

Alexa Steals the Show at CES 2017

— January 16, 2017

Voice activation took center stage at CES 2017, with Amazon and Alexa as the leading stars. I spent several days at the annual geek fest, and the device kept coming up in multiple conversations with industry players.

Alexa, the voice service behind the Amazon Echo, is not new. The $180 device was first available exclusively to Amazon Prime members in November 2014. Since then, the device (along with its smaller clone, the Dot) and its cloud-based data service have been made available to anyone, steadily gaining a solid foothold in the smart home-Internet of Things (IoT) market. It has become a surprise hit, and vendors across the spectrum now clamor to include Alexa functionality in their devices. Companies like LG, Whirlpool, Samsung, Mattel, Lenovo, GE, and Ford have (or will soon have) products with Alexa voice technology.

From an energy standpoint, Alexa has already made inroads with smart thermostat makers to work directly with their products. For some months now, Alexa has enabled users to simply say a command, and a Wi-Fi-connected thermostat will alter the temperature to a new setting on a Nest, ecobee3, Honeywell Lyric, or Sensi product.
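The plumbing behind such a command can be sketched roughly as follows. In Amazon's Smart Home Skill API, a spoken request like "set the temperature to 68" arrives at the vendor's cloud code as a SetTargetTemperature directive; the handler below is a minimal illustrative sketch of that flow, where `set_thermostat` stands in for a vendor-specific thermostat call and the device names are hypothetical.

```python
# Minimal sketch of a smart home skill handler that maps an Alexa
# SetTargetTemperature directive to a thermostat command.
# set_thermostat is a hypothetical stand-in for a vendor API call.

def set_thermostat(device_id, setpoint, scale):
    # Hypothetical vendor call; here we just echo the new state.
    return {"device": device_id, "setpoint": setpoint, "scale": scale}

def handle_directive(directive):
    header = directive["header"]
    if (header["namespace"] == "Alexa.ThermostatController"
            and header["name"] == "SetTargetTemperature"):
        target = directive["payload"]["targetSetpoint"]
        return set_thermostat(directive["endpoint"]["endpointId"],
                              target["value"], target["scale"])
    raise ValueError("unsupported directive")

# Example: "Alexa, set the living room to 68 degrees"
result = handle_directive({
    "header": {"namespace": "Alexa.ThermostatController",
               "name": "SetTargetTemperature"},
    "endpoint": {"endpointId": "living-room-thermostat"},
    "payload": {"targetSetpoint": {"value": 68, "scale": "FAHRENHEIT"}},
})
```

The thermostat vendors listed above each expose their own cloud API behind this kind of handler; the directive routing shown here is the common pattern.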

Waiting in the Wings

Even though Amazon’s Alexa is the clear leader of the voice activation trend, Alphabet’s Google Home device was waiting in the wings at CES to carve out its share of the market. The Home device has only been available to consumers since its launch in November 2016, but a number of vendors I spoke with already have products that can work with Home or are planning to add Home integration to their products in the near future.

While Amazon and now Alphabet are competing head-to-head for voice activation in the home, conspicuously absent in the space at CES were Apple and Microsoft—though that could soon change if rumors about Apple are true. Rumblings out of Cupertino indicate Apple is developing its own competitor to Alexa. Microsoft has its own voice-activated assistant engine called Cortana, but it is still unclear what the software giant’s strategy is in this part of the market and whether it wants to join a competitive hardware-cloud battle where it would likely start out as the number four player.

Other Connected Things: Mostly Incremental Gains

I saw mostly incremental advancements for smart thermostats, smart appliances, and numerous connected-smart lighting products on the show floor, which is not meant as a criticism. As manufacturers hone their skills, I would expect to see steady energy efficiency gains among these products as more sensors and data analytics combine to improve energy consumption. This kind of effort is difficult to achieve and takes time to develop.

Nonetheless, there was one notable product in terms of energy efficiency called LaDouche from French startup Solable. LaDouche is a residential water heater, and it was named as a CES 2017 Innovation Awards honoree for its heat exchange capability, which ostensibly can lower an electric hot-water bill by up to 80%. That is impressive (if it can be verified).

Voice Technology as Transformative

The 2017 CES was a showcase for voice technology as a transformative trend, and one that Navigant Research has pointed out as a key new input for the IoT and computing in general. This was CES’ 50th anniversary event, and the show remains one of the few places where transformative technology gets a megaphone and where one gets a glimpse of what potentially lies ahead in coming years—maybe even in the next 50. Flying cars—are you with me?

 
