Haptics and HMI

Automakers today may want to listen closely to a soulful tune from back in the day in which the singer declares: “My baby must be a magician, 'cause he's sure got the magic touch!” That's because when designing embedded infotainment environments, touch has turned out to be a far more powerful sense than expected.
It has turned out to be more immediate and intimate than anyone in the business previously imagined, certainly more so than either hearing or sight. Understanding how tactile signals loop into the brain is proving key to making driving the connected car not just a more fulfilling and pleasurable experience but, more importantly, a much safer one.
Touchscreens are ubiquitous in cars these days, and wonderful as they are for providing drivers with information, simple choices, and ease of operation, they come with one very big drawback: they distract the driver. Interacting with one requires the driver to take his or her eyes off the road, sometimes for several seconds at a time. Anything more than a glance starts getting risky.
If the driver is on a highway and spends just two and a half seconds looking at a screen, the car travels roughly half the length of a football field, far enough to begin drifting out of a lane.
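The arithmetic behind that claim is easy to sketch. The 55 mph speed below is our assumption (the figure commonly used in driver-distraction statistics), not one stated in the article:

```python
# Distance covered during a glance away from the road.
# Assumes 55 mph; a football field is 360 ft including end zones.
MPH_TO_FT_PER_S = 5280 / 3600  # feet per mile divided by seconds per hour

def glance_distance_ft(speed_mph: float, glance_s: float) -> float:
    """Distance traveled, in feet, during a glance of glance_s seconds."""
    return speed_mph * MPH_TO_FT_PER_S * glance_s

print(round(glance_distance_ft(55, 2.5)))  # 202 ft -- over half the field
```

At higher highway speeds the distance only grows, which is why anything beyond a glance is treated as risky.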
It goes without saying that since telematics is ultimately about better driving, and touchscreens aren't likely to ever go away, visual interfaces in cars must be engineered to keep the time a driver spends looking at and interacting with them to an absolute minimum. That means controls need to be intuitive and compelling to operate, so that a clear sense of completion accompanies each task.
One way to do this is to provide sensory feedback for each action. Normally this is done with sound or visual signals: a tone plays or a symbol appears whenever a selection is made. These cues work, to an extent, but depending on the driver's focus and conditions, they can go unnoticed. The sound may be drowned out by noise inside the car, or during rush hour, when ambient noise outside is also high.
Likewise, if the driver's eyes are away from the dash, the visual symbol may be missed. Either way, the brain might not fully register the action as completed. The problem with audible and visual signals is that they arrive in the part of the brain known as the cerebrum, where higher-level cognitive functions take place but where stimuli deemed unimportant tend to get filtered out. Feedback sounds and symbols often fall into that category.
But haptics can change this. “Haptics is the science of applying touch feedback while interacting with devices,” explains Shreharsha Rao, Haptics Products sector manager for Texas Instruments. “Haptics are especially useful for the automotive space because they provide OEMs with a compelling user interface application.”
Touch is a much more immediate sense than either sound or light. That's because it loops into a very old part of the brain that is basically reptilian. “It's a part of your brain from a long time ago, your defense mechanism. If someone touches you or pokes you, you'd better pay attention,” says Chris Ullrich, VP for User Experience at Immersion Corporation, a company that specializes in haptics.
“Touch has that unique property over audio. Audio can also have that impact, but it requires things like lack of background noise. A touch creates some kind of tension for the user. So from our perspective, touch has that unique property which is quite interesting, especially in situations where you're driving, where you might need that level of reaction time or that level of attention.”
Using tactile stimulation, haptics register the completion of a task in the brain at a very basic, nearly pre-cognitive level, which makes each completed action feel more immediate and more satisfying.
Ullrich says numerous studies show that on touchscreens equipped with tactile feedback, tasks are performed much faster and with less secondary glancing than on flat panels with no feedback or with only audio or visual cues. Tactile feedback also makes these displays more accessible in visually demanding or low-vision situations.
But Ullrich says that the real magic of touch isn't that it provides feedback better than sound or light, it's the way haptics work in conjunction with these two other senses. He says multi-modal is the way to go. “For me, the bottom line is, tactile is good. Tactile is better than no tactile. But tactile plus other modalities is even better.” Though scientists have known about the power of tactile for years, this awareness has only recently started making its way into the connected-car space.
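The multi-modal idea Ullrich describes can be sketched as firing every available feedback channel together for one event, rather than picking a single cue. The channel names and dispatch API below are illustrative assumptions, not any vendor's actual interface:

```python
# Minimal sketch of multi-modal feedback: one user event triggers a cue
# on every registered channel at once.
def dispatch_feedback(event: str, channels: dict) -> list:
    """Fire the event's cue on each channel; return the channels that fired."""
    return [name for name, fire in channels.items() if fire(event)]

# Hypothetical channels for a touchscreen "select" confirmation.
channels = {
    "haptic": lambda e: True,   # vibration pulse under the fingertip
    "audio":  lambda e: True,   # confirmation click
    "visual": lambda e: True,   # button highlight on screen
}
print(dispatch_feedback("select", channels))  # ['haptic', 'audio', 'visual']
```

The design point is redundancy: if one channel is masked (cabin noise, eyes off the dash), the others still carry the confirmation.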
As a result, he says, Immersion is often asked to come in on designs where tactile and audio are presented as separate options. One of the challenges the company faces is educating OEMs and infotainment designers to understand the multiplier effect that multi-modal feedback brings to the driver.
“There's a whole bunch of stuff, a whole bunch of tiers involved in the equation,” he says, “a bunch of different people with their agendas, the specs around that are not that well understood in that space. So one of the biggest issues for Immersion is education. It's going around to the different players in the space and saying 'this is what research says.' We've actually created demonstrations with touch screens and they see that latency and synchronization is so critical for users to be able to make meaning from an experience by multi-modal cues.”
Of course, once all the different players do start grasping the effectiveness of multi-modal feedback, a much bigger set of challenges presents itself. When employing combined audio and tactile feedback, it is absolutely imperative that the two be properly synchronized. If the sound and the haptic effect are separated by as little as 500 milliseconds, the human brain will perceive them as two separate things rather than as one synchronized event.
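The synchronization constraint amounts to a simple timing check. The function and timestamps below are an illustrative sketch using the article's 500 ms figure, not a real middleware API:

```python
# The gap (in milliseconds) beyond which paired cues read as two events,
# per the figure cited above.
SYNC_WINDOW_MS = 500

def feels_synchronous(audio_t_ms: float, haptic_t_ms: float) -> bool:
    """True if audio and haptic cues land close enough to fuse perceptually."""
    return abs(audio_t_ms - haptic_t_ms) < SYNC_WINDOW_MS

print(feels_synchronous(1000.0, 1120.0))  # True: 120 ms apart
print(feels_synchronous(1000.0, 1600.0))  # False: 600 ms apart
```

In an embedded system both cues can be scheduled against one clock; once external devices enter the mix, each brings its own clock and latency, which is where the trickiness Ullrich describes comes in.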
The problem is not necessarily that difficult when it’s an embedded system where all the different parts are made to work together, but once outside systems are brought into the mix, then synchronization can get tricky. This is especially the case with wearables.
“One of the problems we see is that there are all these different vendors of wearables and almost all of the wearable devices have some kind of haptic feedback in them, but zero effort has been made to make this feedback consistent,” says Ullrich.
“So if you're a user and you're getting some kind of alert from your tablet or your smartwatch, the same exact thing might be presented in a completely different way; there is really no consistency. We're trying to create a more standard meaning and definition for a variety of basic haptic signals.”
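What a shared vocabulary of basic haptic signals might look like can be sketched as a table of named effects with fixed parameters. The signal names and values below are hypothetical assumptions for illustration, not Immersion's or any vendor's actual standard:

```python
# Hypothetical shared definitions for basic haptic signals, so the same
# alert feels the same on a smartwatch, a tablet, or an in-dash display.
from dataclasses import dataclass

@dataclass(frozen=True)
class HapticSignal:
    name: str
    duration_ms: int   # length of each pulse
    intensity: float   # 0.0 (off) to 1.0 (maximum actuator strength)
    pulses: int        # number of pulses in the pattern

STANDARD_SIGNALS = {
    "confirm":  HapticSignal("confirm", 40, 0.6, 1),   # selection accepted
    "alert":    HapticSignal("alert", 120, 1.0, 2),    # needs attention now
    "navigate": HapticSignal("navigate", 80, 0.8, 3),  # turn coming up
}

print(STANDARD_SIGNALS["alert"].pulses)  # 2
```

Each device vendor would then map these shared definitions onto its own actuator hardware, rather than inventing its own feedback patterns per product.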