Robots, But Not as We Know Them, Jim!

Louis Bedigian explores what robots should, and shouldn’t, look like in an automotive HMI future.

The human-machine interface (HMI) is set to become central to the cars of tomorrow. Whether visual, such as displays and other graphics, or aural, such as voice-controlled assistants like Amazon’s Alexa, Apple’s Siri, Microsoft’s Cortana and Google’s Assistant, consumers appear to want an experience that is advanced yet inviting. However, research shows that if an interface is too human-like, it can make users uneasy. Yet should the vehicle appear too robotic, driverless cars might be less appealing or, at the very least, less immersive. It’s a balancing act, to be certain, and one that will require years of refinement as automobiles become more autonomous.

“One of the biggest issues the automotive industry faces is how to convey a sense of trust to the driver that the vehicle is making the right decisions,” said Sanjay Dhawan, president of Harman’s connected services division. “Interaction methods must show the intelligence behind the machine, so drivers understand the choices being made by the vehicle. It also needs to provide the driver with a sense of control while in autonomous mode, offering micro adjustments like position in the lane or passing vehicles.”

In addition, Dhawan expects cars to feature a growing number of driver-facing monitoring solutions, such as cameras that detect eye gaze, emotion and even cognitive workload. He believes this will help create HMIs that are simpler, more intuitive and more implicit in their interaction, because the car will understand the state of the driver and passengers and tailor the interface accordingly. “For example, the driver monitoring system will be able to detect the driver’s emotional state and adjust the music/system interface,” said Dhawan. “Or understand if lowered attention and engagement require control taken by the vehicle to a semi-autonomous state.”
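To make that idea concrete, the sketch below shows one way a monitored driver state could be mapped to interface adjustments. It is a minimal illustration only: the signal names, thresholds and the adapt_hmi function are invented for this article and do not reflect Harman’s, or any supplier’s, actual software.

    from dataclasses import dataclass

    # Hypothetical signals a driver-monitoring camera might report.
    # All names and thresholds here are illustrative, not a real API.
    @dataclass
    class DriverState:
        eyes_on_road: bool
        emotion: str           # e.g. "calm" or "stressed"
        cognitive_load: float  # 0.0 (idle) to 1.0 (overloaded)

    def adapt_hmi(state: DriverState) -> dict:
        """Map a monitored driver state to HMI adjustments (sketch only)."""
        actions = {}
        if state.emotion == "stressed":
            # Soften the soundtrack and declutter the display.
            actions["music"] = "calming_playlist"
            actions["display"] = "minimal_mode"
        if state.cognitive_load > 0.8 or not state.eyes_on_road:
            # Lowered attention: offer to shift to a semi-autonomous state.
            actions["autonomy"] = "offer_semi_autonomous_mode"
        return actions

    print(adapt_hmi(DriverState(eyes_on_road=False,
                                emotion="stressed",
                                cognitive_load=0.9)))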

Brian L. Gallagher, CEO of Andromeda Interfaces, a company that designs HMI display solutions, thinks it’s important to ease consumers gradually into the robotic cars of tomorrow. He explained how eliminating the pain points of traffic could demonstrate the potential of autonomous technology in a reassuring, low-stakes way.

“I think one of the biggest burdens that everyone has is dealing with traffic,” said Gallagher. “It’s something you have to deal with, especially in major cities, so why not allow the vehicle to take over control? It takes over a problem that nobody wants to experience. Once the traffic clears, the user can take over. Then there would be a human symbiotic relationship with the machine. That’s really where you can start solidifying those relationships and build on that to get to full autonomy.”
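Read as a system design, Gallagher’s scenario boils down to a simple mode-handover rule: the vehicle takes over in congestion and hands back once the traffic clears and the driver is ready. The toy state machine below expresses that rule; the mode names and signals are hypothetical, and a production system would gate any handover on far more extensive safety checks.

    from enum import Enum, auto

    class DrivingMode(Enum):
        HUMAN = auto()          # the person is driving
        TRAFFIC_PILOT = auto()  # the vehicle drives through congestion

    def next_mode(mode: DrivingMode, in_traffic_jam: bool,
                  driver_ready: bool) -> DrivingMode:
        """Toy handover rule for the traffic scenario described above."""
        if mode is DrivingMode.HUMAN and in_traffic_jam:
            # Offer to take over the problem nobody wants to deal with.
            return DrivingMode.TRAFFIC_PILOT
        if mode is DrivingMode.TRAFFIC_PILOT and not in_traffic_jam and driver_ready:
            # Traffic has cleared and the driver is ready: hand back control.
            return DrivingMode.HUMAN
        return mode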

Getting personal

Annie Lien, an independent consultant and advisor on autonomous vehicles, expects self-driving cars to be built with likable personalities that improve communication with passengers. This would make vehicles feel a little more human, creating the perception of an actual driver in the front seat.

“When the car becomes more and more of a true smart robot, I think people are going to enjoy the experience more if it has a personality to it,” said Lien. “Especially if you’re talking about ride-sharing and whatnot. I think that’s going to come into play – that a lot of HMI designers are going to try and make some sort of personality. Maybe it will be customizable – a choice in personalities. I think people would love it if it was funny and had humor.”

In current vehicles, onboard assistants simply aren’t personable and lack insight into the driver’s environment. Dhawan said that will change in the future as voice-controlled assistants gain more knowledge about driver behavior, the vehicle itself and its surroundings. This will allow the vehicle to “make better implicit decisions versus the driver explicitly prompting the vehicle to take action,” he said.

“Voice personal assistants will also improve with more sophisticated voice-synthesis engines like Google’s WaveNet,” Dhawan added. “Interfaces will sound more natural with improved colloquial speech, including the ability to mimic pauses or the stress and intonations in speech. Advanced customized voice-synthesis engines will be able to adapt to the driver’s dialect or environment. This will allow the driver and passengers to seamlessly interact with the system as if it were another passenger in the vehicle.”

The new cockpit

Mark C. Boyadjis, global connected car lead at IHS Markit, foresees two cockpit paradigms for HMI – one more traditional, the other a radical departure. He said each will be defined by the car’s level of automation.

“If you have Level 4 or 5, and it’s never in another mode, then a complete departure from the interiors that we know is plausible and possible,” said Boyadjis. “Large displays, no steering wheels, no pedals. Maybe some seatbelts, things like that, depending upon the legal framework but, ultimately, that’s what we classify as a non-traditional cockpit. What’s interesting is, while it’s getting 99% of the buzz, it’s probably going to be 1% of vehicle production for the foreseeable future.”

On the flip side, many consumers have already experienced the evolution of the cockpit for human-driven cars. “It is getting more and more developed,” Boyadjis added. “There are larger displays, really interesting things with speech and gesture recognition and driver monitoring. Ultimately, it’s still the vehicle interior that we’re all used to as a driving public.”

