On the Road with Future In-Vehicle Sat Navs

Artificial intelligence (AI) and augmented reality (AR) are already riding shotgun in the cockpit of some connected vehicles.
Now industry executives are planning the next advancements they say will enhance real-time navigation systems on the road to fully autonomous automobiles. Digital map updates that provide detailed information about a vehicle’s route, such as lane counts, merging traffic, construction disruptions, and parking-spot availability, will ease confusion and aid the driver’s decision making.
Researchers at MIT and the Qatar Computing Research Institute have created a model that uses satellite imagery to tag road features in a bid to improve GPS navigation in all of these cases, as well as to aid planning and disaster relief when road conditions may be dramatically altered. They say machine-learning models applied to satellite images can also help with roads that are partially obscured from satellite view by trees, buildings, or other obstructions. Their RoadTagger model “uses a combination of neural network architectures to automatically predict the number of lanes and road types behind obstructions.” The researchers say the model was 77% accurate at counting lane numbers and 93% accurate at inferring road types.
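To make the idea concrete, the sketch below shows the general shape of such a model in PyTorch: a shared encoder over a satellite-image tile feeding two classification heads, one for lane count and one for road type. It is only an illustration of the approach; the layer sizes, class counts, and the omission of any reasoning along the road graph are simplifying assumptions, not the researchers’ published architecture.

```python
# Minimal sketch (not the authors' code) of the RoadTagger idea: a shared
# image encoder over satellite tiles feeding two classification heads,
# one for lane count and one for road type. Sizes and class counts are
# illustrative assumptions.
import torch
import torch.nn as nn

class RoadTaggerSketch(nn.Module):
    def __init__(self, max_lanes=6, num_road_types=4):
        super().__init__()
        # Shared CNN encoder over a satellite tile centred on a road segment
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Two task-specific heads share the same features
        self.lane_head = nn.Linear(64, max_lanes)       # predicts 1..max_lanes lanes
        self.type_head = nn.Linear(64, num_road_types)  # e.g. highway / residential / ...

    def forward(self, tile):
        features = self.encoder(tile)
        return self.lane_head(features), self.type_head(features)

# Usage: one 128x128 RGB tile -> logits for lane count and road type
model = RoadTaggerSketch()
lane_logits, type_logits = model(torch.randn(1, 3, 128, 128))
```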
German carmakers, meanwhile, are joining forces in their own bid to conquer digital mapping. Audi, BMW, and Daimler have together placed a $3 billion bet on the digital mapping business HERE. They’re betting this dynamic digital mapping business will play an important role as automobiles and their environments become more connected and intelligent.
This type of collaboration has become the norm as automakers and suppliers team up to tackle these technological issues and advance AI and AR innovations more quickly. Manufacturers including Audi, Mercedes-Benz, Tesla, Toyota, and Volvo are working with common platforms such as Nvidia’s DriveAR, which uses a dashboard-mounted display to overlay graphics on camera footage from around the car.
Sharing the road to enhanced driving with AI/AR
Carmakers are also starting to use the same platform across their fleets, regardless of model. Hyundai recently announced that its entire line-up of Hyundai, Kia, and Genesis models will come standard with Nvidia Drive in-vehicle infotainment systems beginning in 2022.
Nvidia’s senior director of automotive, Danny Shapiro, says this platform is a key component of the company’s vision for creating a software-defined car. Shapiro notes that Mercedes-Benz will also standardize on Nvidia Drive across all of its models in 2024, allowing customers to unlock different capabilities at the time of purchase or even at a later date. That includes AI, surround sensors, autonomous navigation and parking capabilities, as well as safety and other apps that can be created by OEMs, Tier 1 suppliers, or third parties. All of the technology is designed to be upgradeable.
Shapiro said the platform also aims to bring one of Tesla’s technological triumphs to the masses: “We have expected updates on our phones for a long time, but most automakers don’t have that capability; we give the ability for over-the-air updates.” He notes that the updates create a new revenue stream for automakers while avoiding the high costs of developing different systems for various models. “The differentiation (for models) through software or levels of features can be unlocked, but it’s all software based,” explains Shapiro, adding, “It streamlines development and reduces costs while extending the life of a system.”
This includes future features that Shapiro notes are under development, such as in-cockpit AR that projects a heads-up display onto the windshield and other screens, as well as onto mirrors. He adds, “It requires a lot of computing horsepower. If an OEM has not built in headroom for sensor data, they won’t have the ability to roll that out in the future.”
Mapping a new route to market
Navigation software firms are also looking to harness machine learning to improve their products’ functionality. TomTom executives are using AI to understand the driver’s context: relationship to other cars, pedestrians, traffic lights, road conditions, and more. This data will then help create AR features, such as helping drivers navigate a busy highway to safely reach the correct exit in current traffic conditions.
“We see an increased demand for a more personalized navigation experience,” asserts Sergio Ballesteros Solanas, senior data scientist at TomTom. “AI will be able to understand the end-user’s driving preferences; some end-users might prefer longer routes if they are safer and the navigation system would suggest those specific routes. If the system predicts that you need fuel, it could suggest a route that takes you to your preferred gas station and other stops along the way that make the entire journey more pleasant.”
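As a rough illustration of what such preference-aware routing could look like, here is a minimal Python sketch that scores candidate routes with user-specific weights for safety and a preferred fuel stop. The route attributes, weights, and scoring rule are illustrative assumptions, not TomTom’s algorithm.

```python
# Hypothetical sketch of preference-weighted route selection as described
# above; field names, weights, and the scoring rule are illustrative.
from dataclasses import dataclass

@dataclass
class Route:
    name: str
    duration_min: float        # estimated travel time
    safety_score: float        # 0 (risky) .. 1 (safe), e.g. from incident history
    passes_preferred_fuel: bool

def score(route: Route, prefs: dict, fuel_needed: bool) -> float:
    # Lower is better: time is penalised, safety is rewarded, and a detour to
    # the driver's preferred fuel stop is rewarded only when the tank is low.
    s = route.duration_min
    s -= prefs["safety_weight"] * route.safety_score * route.duration_min
    if fuel_needed and route.passes_preferred_fuel:
        s -= prefs["fuel_stop_bonus_min"]
    return s

routes = [
    Route("fastest", 42, 0.6, False),
    Route("scenic", 51, 0.9, True),
]
prefs = {"safety_weight": 0.25, "fuel_stop_bonus_min": 8.0}
best = min(routes, key=lambda r: score(r, prefs, fuel_needed=True))
print(best.name)  # with these weights, the safer route with a fuel stop wins
```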
AI is already on the job at TomTom, updating real-time traffic information by comparing current data with historic patterns on a live feed that’s distributed globally. It is currently used in the company’s hazard warning services, which alert cars in less than five seconds to hazards such as traffic jams or broken-down vehicles. While this is helpful to connected cars now, it will ultimately be the type of indispensable data that self-driving vehicles will require.
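That comparison of live data with historic patterns can be pictured with a minimal sketch like the one below, which flags a road segment as a probable jam when current probe speeds fall well below the segment’s usual speed for that time of day. The threshold and data shapes are assumptions for illustration, not TomTom’s pipeline.

```python
# Illustrative sketch (not TomTom's pipeline): flag a segment as a probable
# jam when live probe speeds fall well below the historic speed profile.
from statistics import mean

def detect_jam(live_speeds_kph, historic_mean_kph, ratio_threshold=0.4):
    """Return True if current traffic is dramatically slower than usual."""
    if not live_speeds_kph or historic_mean_kph <= 0:
        return False
    return mean(live_speeds_kph) < ratio_threshold * historic_mean_kph

# Example: probes report 18, 22, 15 km/h on a segment that usually flows at 95 km/h
print(detect_jam([18, 22, 15], historic_mean_kph=95))  # True -> push a hazard warning
```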
Solanas explains that TomTom embeds speed-limit information into vehicle ADAS maps; early on, the traffic signs were identified by hand from photos taken by mobile mapping vans. “The pictures would be tagged in a database and it allowed us, in a very early phase, to automate the process. The database now serves automated driving vehicles with a more than 95% accuracy rate, compared to the 60% accuracy based on camera sensors alone.”
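One way to picture why the map-backed data outperforms camera sensing alone is a simple fusion rule: trust the curated ADAS map by default and let the camera override it only when its detection is highly confident, for example a temporary roadworks sign. The sketch below is a hypothetical illustration of that idea, not TomTom’s implementation.

```python
# Hypothetical fusion rule: the curated ADAS map wins unless the camera is
# confident enough to override it. Thresholds are assumptions.
def effective_speed_limit(map_limit_kph, camera_limit_kph, camera_confidence,
                          override_threshold=0.9):
    if camera_limit_kph is not None and camera_confidence >= override_threshold:
        return camera_limit_kph   # trust a high-confidence camera reading
    return map_limit_kph          # otherwise fall back to the curated map

print(effective_speed_limit(100, 80, camera_confidence=0.95))  # 80: confident override
print(effective_speed_limit(100, 60, camera_confidence=0.55))  # 100: noisy detection ignored
```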
TomTom’s George de Boer said AI is helping navigation evolve further, from reporting current conditions to forecasting the driving experience: “(AI) will become more obvious in predictive destination technology when recommendations are first based on behavioral driving but evolve more by combining data from calendar integration and other profiles. Drivers will also notice it as they get even better route advice that goes beyond the current situation, which now looks ahead 15 to 60 minutes – this will only increase.”
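A minimal sketch of the predictive-destination idea de Boer describes might rank likely destinations from past trips at a similar time of day and then boost any destination that matches an upcoming calendar entry. The data shapes and weights below are illustrative assumptions, not a vendor’s implementation.

```python
# Hypothetical predictive-destination sketch: habitual trips at this time of
# day plus a boost for upcoming calendar locations.
from collections import Counter

def predict_destination(trip_history, now_hour, weekday, calendar_locations,
                        calendar_boost=3):
    # trip_history: list of (weekday, hour, destination) tuples
    scores = Counter()
    for day, hour, dest in trip_history:
        if day == weekday and abs(hour - now_hour) <= 1:
            scores[dest] += 1              # habitual trips at this time
    for dest in calendar_locations:
        scores[dest] += calendar_boost     # explicit upcoming appointments
    return scores.most_common(1)[0][0] if scores else None

history = [("Mon", 8, "office"), ("Tue", 8, "office"), ("Mon", 18, "gym")]
print(predict_destination(history, now_hour=8, weekday="Mon",
                          calendar_locations=["dentist"]))  # "dentist" outranks "office"
```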
Paul Schouten, senior UX designer for TomTom, said that as more powerful hardware and updated software are loaded into vehicles by automakers and suppliers, cars will understand driver context better, and that will open the door to what he calls “a human-technology collaboration.” Schouten added, “AR will be a great interface for that collaboration, as it is a more natural, human-like way of informing the driver. This will make the technology very intuitive and safe, which can be the biggest surprise to the end-customer.”