Changing Up Through the Gears to Level 4

The global automotive sector remains in the early stages of Level 2 autonomy.

Estimates by Singapore-based technology market analyst Canalys find that only 11% of new cars sold in Europe and 13% of those sold in the US in Q4 2019 had Level 2 driving systems installed. While not defined by SAE, so-called Level 2+ systems, which add driver assistance and support features, are available in some new cars.

However, these differ from Level 3 systems where, once engaged, the system, not the human, drives the vehicle under certain conditions until it requests that the human resume control. As Chris Jones, co-founder, vice-president and chief analyst at Canalys, explains, the key technological challenge for developers of Level 2/2+ systems is to produce systems capable of driving unconditionally, without the human's involvement.

“Precise vehicle location, HD maps, 360-degree sensing and driver state monitoring are all needed in Level 2+ systems. Level 3 is a challenge owing to the handover from system to human: will the human be ready to take over when required?” he says.

“Governments will only allow hands-off, eyes-off Level 3 systems to be launched and activated if they are completely satisfied with their reliability and robustness. Education is also key to ensure drivers, passengers and other road users know the capability of the systems. Some autonomous driving developers will skip Level 3 completely and focus on Level 4 systems, most likely in the form of city-based mobility-as-a-service vehicles,” he adds.
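The handover problem Jones flags can be pictured as a small state machine: the system drives until it issues a takeover request, and if the human does not respond within some grace period, the vehicle must fall back to a minimal-risk manoeuvre on its own. The sketch below is purely illustrative; the state names and the grace period are assumptions for the example, not values from SAE J3016 or any production system.

```python
from enum import Enum, auto

class Mode(Enum):
    SYSTEM_DRIVING = auto()      # Level 3 system is in control
    TAKEOVER_REQUESTED = auto()  # system has asked the human to resume
    HUMAN_DRIVING = auto()       # human has taken back control
    MINIMAL_RISK_STOP = auto()   # fallback: pull over / stop safely

TAKEOVER_GRACE_S = 10.0  # illustrative grace period, not a mandated value

def step(mode: Mode, driver_has_taken_over: bool,
         seconds_since_request: float) -> Mode:
    """Advance the hypothetical handover state machine by one tick."""
    if mode is Mode.TAKEOVER_REQUESTED:
        if driver_has_taken_over:
            return Mode.HUMAN_DRIVING
        if seconds_since_request > TAKEOVER_GRACE_S:
            # The driver never responded: the system cannot simply give up,
            # so it executes a minimal-risk manoeuvre itself.
            return Mode.MINIMAL_RISK_STOP
    return mode
```

The awkward branch is the timeout: it is exactly the "will the human be ready?" uncertainty that, in Jones' view, pushes some developers to skip Level 3 altogether.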

Elsewhere, Ilya Aristov, engagement manager at global IT consultancy DataArt, thinks that complex traffic environments and the lack of 4G coverage outside big cities are among the biggest challenges that autonomous vehicle developers face. In his view, Level 3 autonomy can only be achieved in big cities and under particular conditions.

“Right now, there are Level 3 cars from Audi, Honda, BMW, and others,” he says. “However, driving autonomy is limited to specific conditions. To refine Level 3, I think we should have more accurate maps, together with a stable GPS signal, to be 100% sure where the car is.”
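Aristov's "100% sure where the car is" amounts to gating the drive on localization quality: trust the pose only when the GPS fix is tight and it agrees with the HD map. A minimal sketch of such a gate, with thresholds that are purely illustrative assumptions:

```python
# Illustrative thresholds -- not from any real vehicle specification.
MAX_GPS_ERROR_M = 0.1        # centimetre-level target for the position fix
MIN_MAP_MATCH_CONF = 0.95    # agreement between sensed lanes and the HD map

def localization_ok(gps_error_m: float, map_match_confidence: float) -> bool:
    """Hypothetical gate: engage autonomy only when both the GPS error
    and the HD-map match confidence meet their thresholds."""
    return (gps_error_m <= MAX_GPS_ERROR_M
            and map_match_confidence >= MIN_MAP_MATCH_CONF)
```

Outside big cities, where Aristov notes coverage is patchy, either input can degrade, which is one way to see why he limits Level 3 to specific conditions.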

Aristov also argues the industry should place a strong focus on making the life of pedestrians safe with the help of all sensors onboard and calls for all automotive manufacturers to join forces to improve the decision-making capabilities of the car with the help of machine learning and artificial intelligence.

“If we’re talking about Level 4, cars absolutely have to make us confident that they can work well at Level 3.  It’d be great to enjoy a car ride if I were 100% sure that the vehicle will stop when needed, and my family, or whoever is in and around my car, is safe.  It’s more about trust and time rather than technologies.  I would also keep in mind the joy of controlling the car and deciding when to accelerate or hit a brake,” he says.

Role of AI

In Jones’ view, the emergence of Level 2+ assistance systems through 2020 will further delay the launch of Level 3 autonomous systems, or even halt their progress completely. Level 4 robo-taxi trials and deployments in cities around the world will also affect the progress of Level 3 systems. He observes that next-generation Level 2+ systems need precise, centimeter-level vehicle location; HD maps with road-ahead awareness of incidents; 360-degree vehicle surround sensing from radars, cameras, sonar, LiDAR and software; camera- and sensor-based driver state monitoring; and 4G/5G-based V2X communication. “Level 4 vehicles will include all this technology, plus there will be a focus on comfort, convenience and connectivity for the passengers, backed up by a robust MaaS platform,” he says.
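Jones' checklist of Level 2+ prerequisites can be read as an engagement condition: the assistance system should only activate when every required input is healthy. The field names and the all-or-nothing rule below are assumptions made for the sake of the sketch:

```python
from dataclasses import dataclass

@dataclass
class SensorStatus:
    """Hypothetical health flags for the inputs Jones lists for Level 2+."""
    localization_precise: bool    # centimeter-level vehicle location
    hd_map_current: bool          # HD map with road-ahead incident awareness
    surround_sensing_ok: bool     # 360° radar / camera / sonar / LiDAR
    driver_monitoring_ok: bool    # camera- and sensor-based driver state
    v2x_link_ok: bool             # 4G/5G-based V2X communication

def may_engage(status: SensorStatus) -> bool:
    """Allow the assistance system to engage only if every input is healthy."""
    return all(vars(status).values())
```

So a failed driver-monitoring camera, say, would keep the system from engaging even though the outward-facing sensing is fine, reflecting the point that Level 2+ still assumes a supervising human.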

Looking ahead, he predicts that autonomous driving systems will become more experienced drivers, that more types of vehicle will benefit from the technology, and that urban mobility solutions, V2X and 5G will be key elements of smart city transportation initiatives. In terms of wireless connectivity, he also observes that vehicles will ‘visually and audibly communicate’ with other road users and their passengers. AI will be widely used to help the system drive, but will also improve the user and passenger experience.

“The sensors will get better and become less expensive, which will be key in producing Level 2+ systems for mass market vehicles, Level 4 vehicles for individual ownership and robo-taxis for MaaS providers to run profitable platforms,” he says. “The MaaS platforms must offer enough clean, well-maintained, well-equipped, safe, zero-emission vehicles to ensure passengers wait only a few minutes for a ride. With urbanization continuing, successful MaaS solutions will be key to reducing car ownership, easing congestion, preventing road fatalities and serious injuries and improving the environment.”

Meanwhile, Aristov cites computer vision and machine learning as key technologies for Level 3 and points out that they should not only work outside the car but also check the driver’s condition inside the car. “As for Level 4, I think it’s fault tolerance and security. Machine learning algorithms will play the key role. Cars have enough sensors, and data quality grows steadily. What we are missing nowadays is the right behavior for self-driving cars, being predictable for all the traffic participants,” he says.

“I’d add security topics here, some new protocol development, or just providing enough confidence that the car won’t be easily hacked. I’m also very interested to see 5G network development pushing the boundaries of wireless connectivity. Let’s see what we can get when the coverage is sufficient,” he adds.
