Helping Driverless Cars ‘Feel’ the Road

The artificial intelligence that controls highly automated vehicles is starting to go where the rubber hits the road.

Autonomous cars, and automated features like adaptive cruise control, work almost entirely by “seeing” the road ahead and what’s on it. However, Tactile Mobility, a start-up in Israel, has software it claims can determine how the road “feels” to the car. It’s already been added to some commercial fleets and will probably be built into new vehicles by 2020 or 2021, the company says.

The way a car grips the road makes a big difference in how it needs to be driven, as anyone who’s driven on ice, snow or a sheen of water will tell you. Just seeing a surface hazard coming up doesn’t always tell the driver how the car will respond to it. In the same way, cameras, LiDAR and radar may be able to detect a road-surface problem ahead but not fully prepare for it, said Boaz Mizrachi, the company’s founder and chief technology officer.

“You need to feel the road and feel the rain, rather than see it in order to understand the severity of the problem,” he told TU-Automotive. For example, real-time data about how well the tires are gripping the asphalt helps to determine how fast the car can safely drive.

Autonomous vehicle tests have mostly taken place in areas with relatively mild weather, partly because of the way snow and heavy rain can confuse visual sensors. As tests and deployments start to happen in other conditions, tactile readings could emerge as an important part of the equation. Tactile and visual sensors collect different types of readings about the same conditions, so they can train each other, Mizrachi said. His company is already talking with Nvidia’s AV computing business about how to integrate this kind of data into simulations for training a car’s software.

Tactile Mobility doesn’t add any sensors to a vehicle to determine road feel. Instead, it makes software that fuses and analyzes inputs from sensors found in most cars from the past 10 years, Mizrachi said. Some models have more sensors than others, but the company claims its software works with whatever they have. The software computes information about both road conditions and the vehicle itself, each of which affects how the car performs.
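To illustrate the kind of fusion described, here is a minimal sketch that derives a basic grip indicator from wheel-speed signals, which most modern cars already expose through their ABS sensors. The function names and the slip threshold are illustrative assumptions, not Tactile Mobility's actual method.

```python
# Hypothetical sketch: estimating longitudinal slip from standard
# wheel-speed signals as a crude grip indicator. Names and the
# threshold value are assumptions for illustration only.

def slip_ratio(driven_wheel_speed, free_wheel_speed):
    """Longitudinal slip: how much faster the driven wheels spin
    than the free-rolling wheels (both speeds in m/s)."""
    if free_wheel_speed <= 0:
        return 0.0
    return (driven_wheel_speed - free_wheel_speed) / free_wheel_speed

def grip_warning(slip, threshold=0.15):
    """Flag possible low grip when slip exceeds an assumed threshold."""
    return slip > threshold
```

On dry asphalt the driven and free-rolling wheel speeds stay close, so the slip ratio is small; on ice, the driven wheels spin faster than the car is actually moving, pushing the ratio past the warning threshold.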

The biggest challenge in processing all these inputs is filtering out sensor errors and inaccuracies, Mizrachi said. He added that the software separates out the useful data and uses AI and machine learning to analyze it. For example, the software calculates the current weight of a car or truck by combining factors such as the car’s velocity and the energy invested over the past five seconds, along with things like elevation, wind speed and drag. On a truck that weighs 20 to 60 tons including payload, the system claims to determine the weight with less than 5% error in the first minute or two of the trip.
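The weight calculation described above amounts to an energy balance: the traction energy delivered over a short window, minus losses such as aerodynamic drag, equals the change in the vehicle's kinetic and potential energy, both of which scale with mass. A minimal sketch of that idea, with all parameter names and constants chosen for illustration rather than taken from the company's system:

```python
# Hypothetical energy-balance mass estimate over a short time window.
# All names and values here are illustrative assumptions.

G = 9.81          # gravitational acceleration, m/s^2
RHO_AIR = 1.225   # air density at sea level, kg/m^3

def estimate_mass(energy_in, v0, v1, dh, drag_area_cd, avg_speed, dt):
    """Estimate vehicle mass from energy invested over a window.

    energy_in     -- traction energy delivered at the wheels (J)
    v0, v1        -- speed at start and end of the window (m/s)
    dh            -- elevation change over the window (m)
    drag_area_cd  -- Cd * frontal area (m^2), an assumed constant
    avg_speed     -- mean speed over the window (m/s)
    dt            -- window length (s)
    """
    # Energy lost to aerodynamic drag: drag force * distance traveled
    drag_force = 0.5 * RHO_AIR * drag_area_cd * avg_speed**2
    drag_loss = drag_force * avg_speed * dt
    # The rest went into kinetic + potential energy, both proportional
    # to mass: energy_in - drag_loss = 0.5*m*(v1^2 - v0^2) + m*G*dh
    denom = 0.5 * (v1**2 - v0**2) + G * dh
    return (energy_in - drag_loss) / denom
```

In practice an estimator like this would have to average over many windows and reject ones where the denominator is near zero (steady speed on flat ground), which is consistent with the article's note that the system needs a minute or two of driving to converge.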

For now, the company has implemented its software in an add-on device that’s in commercial use by the three largest truck fleet companies in Israel, said CEO Amit Nisenbaum. Ford has used the technology in a proof of concept related to adaptive cruise control and the company is working with six carmakers in Europe and North America to explore integration with new vehicles, he said.

In addition to helping cars make better driving decisions, Tactile Mobility claims its system provides data that helps cities monitor road conditions. By crowdsourcing readings from many vehicles, the company builds maps showing details of road surfaces. There is also a mode for instantly reporting fast-emerging hazards such as oil spills or rain puddles. The city of Haifa, Israel, has installed the system in 10 city vehicles in a trial and is moving toward commercial deployment, Nisenbaum said.

Stephen Lawson is a freelance writer based in San Francisco. Follow him on Twitter @sdlawsonmedia.

