Over the past 130 years, the interface between human and machine has become relatively standardized. We have steering wheels, pedals, seats, mirrors, and other major controls in roughly the same location no matter what brand or type of vehicle we use. We've adapted vehicles with additional hand controls for those who have physical disabilities, but overall, the experience is consistent. But when those controls are eliminated in automated vehicles (AVs), as Ford will do in the AV it intends to produce in 2021, designers have opportunities to rethink vehicle cabins. And those opportunities raise a few questions about interacting with AVs.
The more you know, the more you realize how much you don’t know. As we accelerate toward an era where humans are no longer in direct control of the vehicles we move around in, it’s clear that making a car drive itself is only the beginning of the task at hand. A panel at the recent Automotive Megatrends Autonomous Car conference in Detroit examined some of the questions around the user experience (UX) with automation.
A crucial aspect of the human-machine interface is the seating and what we see as we move through the world. Over the last several years, automakers and suppliers have revealed a number of fascinating concepts for cars of the future, such as the Mercedes-Benz F015, the Nissan IDS, and the BMW HoloActive Touch.
One of the seemingly more appealing ideas about not having to drive is that vehicle occupants could be repositioned so they can interact with each other instead of the vehicle. However, a driving force behind the design of modern vehicles is the need to protect occupants in the event of a crash. They must be properly positioned for airbags to provide protection. While AVs are likely to cause far fewer crashes, they will still have to coexist with the more than 1.2 billion vehicles on the road today and for decades to come. That means that unless we ban human-driven vehicles, AVs will still have to conform to the same safety standards, and seat rotation will be limited to small angles.
Then there is the whole issue of motion sickness. Many people experience physical symptoms when there is a disconnect between what their eyes see and what their bodies feel during motion. If we go from driving to watching or reading during our commutes, this could become a design issue.
Voice Recognition Systems
Another question regarding interacting with AVs: How will we let self-driving vehicles know where we want to go? For all the attention that devices like Amazon's Echo have received in the past couple of years, voice recognition systems remain frustrating to use. Companies like Google and Nuance have made huge strides in improving the reliability of these systems when they are connected to the cloud, but even the most advanced machine learning systems continue to struggle with natural language semantics and accents. There is an enormous difference between recognizing individual spoken words and understanding the meaning that is imparted by stringing a series of words together.
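That gap between recognizing words and grasping meaning can be illustrated with a toy sketch. Everything here is hypothetical (the function names, keyword rules, and destinations are invented for illustration, not drawn from any real AV or voice assistant); it assumes a perfect speech-to-text stage and shows how a naive keyword matcher still misses intent:

```python
# Toy illustration: perfect word recognition does not imply understanding.
# All names and rules here are hypothetical, not from any real system.

def recognize_words(transcript: str) -> list[str]:
    """Stand-in for a speech-to-text stage: assume every spoken word
    is transcribed correctly."""
    return transcript.lower().split()

def parse_intent(words: list[str]) -> str:
    """Naive keyword matcher: maps any known place-word to a destination."""
    known_places = {"home", "work", "airport"}
    for word in words:
        if word in known_places:
            return f"navigate_to:{word}"
    return "unknown_intent"

# Every individual word is "recognized", yet meaning is still missed:
print(parse_intent(recognize_words("Take me home")))       # navigate_to:home
print(parse_intent(recognize_words("Anywhere but home")))  # navigate_to:home (wrong!)
print(parse_intent(recognize_words("Back to where we started")))  # unknown_intent
```

The second utterance is transcribed flawlessly, yet keyword matching inverts the rider's meaning; the third is perfectly ordinary English that the matcher cannot handle at all. Real systems replace the keyword matcher with statistical language understanding, but the underlying challenge is the same.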
Humans are remarkably adaptable, and we will likely adjust our own speech patterns to the limitations of the machines before the machines can reliably understand us. Or, if the technology can't make our lives less frustrating, we may simply reject it.
Tags: Automated Vehicles, Electric Vehicles, Transportation Efficiencies, Voice Recognition