Maps must reflect real-time in a driverless world, says TomTom

In common with many drivers, I have pitted my smartphone’s satellite navigation against the dashboard-fitted system in many of the cars I’ve been driving. It quickly becomes evident that the on-board navigation system fails to live up to the smartphone’s, mainly because, even in a spanking new motor, the on-board system is at least two years out of date.

That means it doesn’t spot new road layouts and junctions, a failure that can see it suggest an entirely different, often longer and slower, route than the smartphone’s. This highlights the issue facing carmakers, who need something like five years to bring a new car from the drawing board onto the summer’s melting tarmac.

It’s a problem map provider TomTom Automotive is well aware of, according to its managing director, Antoine Saucier. Speaking to TU-Automotive, he stressed the importance of achieving up-to-date map revisions as we advance through the levels of ADAS to a fully driverless future. He said: “For ADAS, it’s more about map attributes or accuracy and for autonomous it’s about HD maps. This not only means high definition, very precise location of the car in that geometry so you know which lane it’s in, but with that comes the challenge of how do you update these HD maps? If you want to drive in an autonomous mode, that means you need near real-time mapping and that means introducing different technologies than what we use for standard navigation maps.”

Saucier acknowledged that the autonomous cars of the future will rely on getting near real-time information but said TomTom’s service is well on the way to being able to provide it. He said: “This is about infrastructure talking to the car – V2X – and the information would come from the organisation working on the infrastructure, not unlike what happens today. For example, the Champs-Élysées is closed one Sunday every month and that comes from the City of Paris and we take that information on board with our maps, which then show the road as closed.

“However, with people working on potholes, it’s going to be extremely difficult to plan for a small truck working on that specific part of the road. What will probably happen is that this will be detected by the first car that has sensors, and the information will come from the behaviour of that car, reported and shared with other cars approaching the area. That is where near real-time car-to-Cloud-to-car data comes into play and it is key for autonomous driving.

“For companies like us, we have to develop our own infrastructure to capture this information and fuse, merge and moderate it to make sure it is relevant before we redistribute it to all the other cars. Of course, a pothole fix may take two hours, so two hours later you need to do it the other way round, making sure the work is completed and the road is clear. You do not want to do this too early or too late because otherwise it will generate unnecessary behaviour, so the need for near real-time capability is easy to understand.”
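The capture-moderate-redistribute-retract cycle Saucier describes can be sketched in a few lines. The following is a hypothetical illustration, not TomTom’s actual system: the class name `HazardService`, the road-segment labels and the confirmation thresholds are all assumptions made for the example. It shows a cloud service that only publishes a hazard once several independent cars have reported it, and retracts it once cars report the road clear again.

```python
from collections import defaultdict

# Assumed moderation thresholds for this sketch: how many independent
# sightings publish a hazard, and how many clean passes retract it.
CONFIRMATIONS_TO_PUBLISH = 3
CLEARANCES_TO_RETRACT = 3

class HazardService:
    """Hypothetical car-to-Cloud-to-car hazard relay."""

    def __init__(self):
        self.sightings = defaultdict(int)   # road segment -> hazard reports
        self.clearances = defaultdict(int)  # road segment -> clean-road reports
        self.active = set()                 # hazards currently pushed to cars

    def report_hazard(self, segment):
        """A car's sensors detected e.g. a pothole on this segment."""
        self.sightings[segment] += 1
        self.clearances[segment] = 0
        # Moderate: only redistribute once enough independent cars agree.
        if self.sightings[segment] >= CONFIRMATIONS_TO_PUBLISH:
            self.active.add(segment)

    def report_clear(self, segment):
        """A car drove the segment and saw no hazard (e.g. pothole fixed)."""
        if segment in self.active:
            self.clearances[segment] += 1
            if self.clearances[segment] >= CLEARANCES_TO_RETRACT:
                self.active.discard(segment)
                self.sightings[segment] = 0

    def hazards_for(self, segments):
        """What an approaching car is told about its upcoming segments."""
        return [s for s in segments if s in self.active]

svc = HazardService()
for _ in range(3):
    svc.report_hazard("A10-km42")                 # three cars confirm a pothole
print(svc.hazards_for(["A10-km41", "A10-km42"]))  # ['A10-km42']
for _ in range(3):
    svc.report_clear("A10-km42")                  # fix completed; cars report clear
print(svc.hazards_for(["A10-km42"]))              # []
```

The thresholds do the “moderation” step: they stop a single spurious sensor reading from rerouting traffic, and likewise stop one lucky clean pass from retracting a real hazard too early, which matches Saucier’s point about acting neither too early nor too late.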

Naturally, getting all the required systems in place relies on other players, particularly those deciding what sensor arrays will adorn the vehicles of tomorrow, Saucier said. “A lot of this we are doing at the moment but the real challenge is how do we scale this up with the amount of data coming from more cars with sensors, to make changes that are good and accurate? Well, we have this platform now and are developing a proof of concept, and it’s being tested by quite a number of OEMs and Tier 1 suppliers. However, there is still a question over what the sensor mix of autonomous cars will be. On this there is still a lot of work to do, because the sensor mix will dictate what sort of information will be sent to us and that has yet to be sorted out. That said, the platform we have developed will be able to handle a variety of information types and it is already in place.”

[Tele.Myles.2018.03.28] 

 

