In-Car Assistants in a UX World

The sustained evolution and improvement of user experience (UX) is a key objective in the development of next-generation in-car assistants.

So how are companies approaching the improvement of UX and the use of artificial intelligence (AI) in the development of next-generation in-car assistants, and how should they consider the ‘human element’ when evaluating UX performance and creating more humanized in-car interactions?

Intuitive driving experience

A growing number of car companies around the world now view improvements in UX as a key objective in the development of next-generation in-vehicle assistants and are actively exploring and deploying a range of innovative systems. One interesting recent example is the Hey Mercedes voice recognition system, available in Mercedes-Benz cars equipped with MBUX (short for Mercedes-Benz User Experience). As Georg Walthart, spokesman for technology issues at the carmaker, explains, the integration of what he describes as a natural, conversational, AI-powered voice interface in its vehicles allows the driving experience to become even more intuitive.

“The customer does not have to learn specific commands. The system understands almost all sentences concerning infotainment and vehicle operation. For example, ‘Will the sun be shining tomorrow in Miami?’ is now as easily understood as ‘Do I need sunglasses tomorrow in Miami?’” he says. “It is not the human who has to adapt to the machine, but the other way around. Indirect speech is also recognized: for instance, the user can say ‘I am cold’ instead of the clear command ‘temperature to 24 degrees’ in order to operate the climate control.”
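
Walthart’s ‘I am cold’ example boils down to intent resolution: the assistant maps a free-form utterance to a concrete vehicle command rather than requiring a fixed phrase. Below is a minimal sketch of that mapping; the intent names, phrase table, and lookup approach are illustrative assumptions, not details of the actual Hey Mercedes implementation, which would use a statistical language-understanding model rather than a dictionary.

```python
# Minimal sketch of mapping indirect speech to a vehicle command.
# Intent names, phrases, and slot values are invented for illustration.

INDIRECT_PHRASES = {
    "i am cold": ("climate.set_temperature", {"delta_c": +2}),
    "i am hot": ("climate.set_temperature", {"delta_c": -2}),
    "i am hungry": ("navigation.search_poi", {"category": "restaurant"}),
}

def resolve_intent(utterance: str):
    """Map a free-form utterance to an (intent, slots) pair.

    A production assistant would use a trained NLU model here;
    a dictionary lookup stands in for that step.
    """
    key = utterance.lower().strip().rstrip(".!")
    return INDIRECT_PHRASES.get(key, ("fallback.clarify", {}))

if __name__ == "__main__":
    print(resolve_intent("I am cold"))
    # -> ('climate.set_temperature', {'delta_c': 2})
```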

According to Walthart, the system is also capable of learning. On the one hand, it attunes to the user and their voice, coming to understand non-native speakers better; on the other, the software models on the server learn new buzzwords and adapt to changing language use over time. The system also no longer answers stereotypically, but varies its dialogue output.
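
That variation in dialogue output can be pictured as sampling from a pool of equivalent response templates rather than repeating one fixed phrase. The sketch below is a deliberately simple illustration of the idea; the templates and function are invented, and the real system’s response generation is certainly more sophisticated.

```python
import random

# Hypothetical illustration of non-stereotyped dialogue output:
# the assistant samples from equivalent phrasings instead of
# repeating one canned confirmation.
CONFIRMATIONS = [
    "Done. The temperature is now {temp} degrees.",
    "Okay, warming the cabin to {temp} degrees.",
    "Sure, I have set the climate control to {temp} degrees.",
]

def confirm_temperature(temp: int) -> str:
    return random.choice(CONFIRMATIONS).format(temp=temp)

print(confirm_temperature(24))
```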

In terms of technology, Hey Mercedes is a hybrid system, using both on-board software, mostly to control vehicle operations such as lighting or activation of the head-up display, and cloud-based software equipped with speech-to-meaning and deep-meaning capabilities, as well as the ability to recognize complex sentences. For example, if a user says, ‘I’m hungry, show kid-friendly Italian restaurants in San Francisco with four or more stars that have free wifi and free parking,’ the system will show places to eat within these parameters and let navigation begin with a single click or voice command.
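
A hybrid architecture like this implies a routing decision: simple vehicle commands are handled on-board, while open-ended queries go to the cloud for deeper language understanding and fresh data. The sketch below shows one plausible way to make that split; the domain list, function names, and fallback behavior are assumptions for illustration, not the actual MBUX design.

```python
# Illustrative router for a hybrid on-board/cloud voice assistant.
# Domain names and handlers are hypothetical; a real system would
# classify intents with an NLU model rather than a fixed set.

ONBOARD_DOMAINS = {"lighting", "climate", "head_up_display", "seat"}

def handle_onboard(domain: str, utterance: str) -> str:
    return f"[on-board] {domain}: executing '{utterance}'"

def send_to_cloud(utterance: str) -> str:
    return f"[cloud] parsing and answering '{utterance}'"

def route(intent_domain: str, utterance: str) -> str:
    if intent_domain in ONBOARD_DOMAINS:
        return handle_onboard(intent_domain, utterance)
    # Complex, open-ended requests (POI search, weather, etc.)
    # need server-side understanding and up-to-date data.
    return send_to_cloud(utterance)

print(route("lighting", "turn on the reading light"))
print(route("poi_search", "show kid-friendly Italian restaurants"))
```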

Human element

Elsewhere, German mobility-as-a-service (MaaS) company Share Now has recently introduced a number of features to improve the UX of its customers, including dedicated airport parking guidance and a refuel pop-up, which notifies users when the charging or fueling level drops below a threshold and navigates them directly to a nearby partner station. It has also added a park-search-detection service, which uses an algorithm to detect when users are searching for a parking spot and then triggers a pop-up to help them navigate straight to a nearby mobility station.
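
Both features reduce to simple triggers over vehicle telemetry: a fuel or charge reading crossing a threshold, and a driving pattern that looks like circling for a spot. The following sketch shows how such triggers might work; the 20% threshold, the circling heuristic, and all names are invented for illustration and are not Share Now’s actual algorithm.

```python
from dataclasses import dataclass

REFUEL_THRESHOLD = 0.20  # assumed 20% level; the real threshold is not public

@dataclass
class VehicleState:
    fuel_level: float              # 0.0 to 1.0
    speed_kmh: float
    heading_changes_last_5min: int

def should_show_refuel_popup(state: VehicleState) -> bool:
    return state.fuel_level < REFUEL_THRESHOLD

def looks_like_parking_search(state: VehicleState) -> bool:
    # Crude heuristic: slow driving with many direction changes
    # suggests circling the block for a parking spot.
    return state.speed_kmh < 20 and state.heading_changes_last_5min >= 6

state = VehicleState(fuel_level=0.15, speed_kmh=12, heading_changes_last_5min=8)
print(should_show_refuel_popup(state))   # True -> offer nearby partner station
print(looks_like_parking_search(state))  # True -> offer nearby mobility station
```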

“We improve the UX in our cars on an ongoing basis, often based on direct customer feedback.  One example is the park search feature in some of our cars that supports our customers with the challenge of finding a parking spot,” says Niklas Merk, spokesperson for Share Now.

In addition to further improving intelligent speech recognition, Walthart identifies intuitive gestures as another potential means of improving vehicle operation, one that will be explored in greater detail in the coming years. In moving towards such systems, he reveals that the Mercedes approach to the human element is to ensure that customers do not have to learn gestures first, but rather that intuitive human movements are interpreted correctly by the car and trigger the desired function. One example is the automatic illumination of a passenger-side reading light when the driver reaches over to a bag on the passenger side.

“Another is the seating options on the infotainment system. If the passenger reaches to the screen, the passenger seat on the display is emphasized, as it can be assumed that he will want to adjust his own seat. The same goes for the driver when he reaches to the screen,” says Walthart.

“We believe that generally intelligent voice and gesture control will be very important in the future. Our approach is that the customer does not need to learn new commands or gestures but can intuitively get the functions of their car to work. Over time, these functions will keep improving, especially as we have the chance to add to the functions quickly,” he adds.
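
Both gesture examples follow the same pattern: infer which occupant is reaching, and toward what, then trigger the matching function without any learned command. Here is a minimal sketch of that dispatch logic; the sensor events, zone names, and actions are assumptions made for illustration rather than Mercedes-Benz’s implementation.

```python
# Hypothetical gesture dispatcher: interior sensing reports who is
# reaching and toward which zone; the car maps that to a function.

ACTIONS = {
    ("driver", "passenger_side"): "turn_on_passenger_reading_light",
    ("driver", "center_screen"): "highlight_driver_seat_controls",
    ("passenger", "center_screen"): "highlight_passenger_seat_controls",
}

def on_reach_detected(occupant: str, target_zone: str) -> str:
    """Return the function to trigger for a detected reach gesture."""
    return ACTIONS.get((occupant, target_zone), "no_action")

print(on_reach_detected("driver", "passenger_side"))
# -> turn_on_passenger_reading_light
print(on_reach_detected("passenger", "center_screen"))
# -> highlight_passenger_seat_controls
```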

