Automakers Fooling Themselves and Us over Driverless Tech, Expert Says

With electric vehicles approaching a tipping point and monopolizing the news, self-driving cars have been forced into the backseat of public awareness, except when there is bad news.

There has been a lot of bad news recently, much of it involving Tesla and its controversial boss, Elon Musk. Disruptive technology needs to fulfill two essential prerequisites to be widely accepted: a strong selling point and consumer trust. On the first point, consumers immediately bought into the raison d'être of BEVs, zero tailpipe emissions, even though charging them is not yet free of fossil fuel pollution. Autonomy's selling point, eliminating traffic fatalities, while desirable, is considered less urgent.

EVs didn't require much effort to gain consumer trust, except regarding performance and the charging infrastructure. By contrast, it takes an enormous leap of faith to take your hands off the steering wheel while doing 60 mph on a busy four-lane highway. And the trials and actual driving results have been disastrous PR for the technology. Between July 2021 and May 2022, carmakers reported nearly 400 crashes of vehicles with partially automated driver-assist systems.

That's not all. Tesla has been forced to amend the language around its Full Self-Driving Beta program following the recall of hundreds of thousands of its vehicles, after discussions between the NHTSA and Tesla about FSD Beta features, leaving owners believing the program will never be fully self-driving, as promised. Even more damaging, the company and Musk are now the objects of a class action suit by shareholders who accuse Tesla of defrauding them with false statements about the technology's safety. If anything, consumers may now be even more wary of the technology than before it hit the road.

For Michael DeKort, founder of IT services and consulting company Dactle, the bad news is no surprise. He believes that the technology is being developed the wrong way and that those responsible for the technology are misleading the public.

Asked to provide a timeline for the launch of Level 4 and 5 autonomy, he replied: “Right now the answer for Level 4 and 5 is literally never, until certain issues are resolved. Level 2 and 3 are reckless and should not exist and Tesla is only the most egregious. The rest are putting people at risk and will either harm people for no reason or lie about their capabilities, especially crash scenarios.”

DeKort has been bearish on autonomous technology for some time, not because he feels the task of developing a safe, fully self-driving vehicle is impossible, but because of what he sees as incompetence and dishonesty. "We are not going beyond Level 1," he declared. The reasons for the failure include "nascent General Learning and Artificial General Intelligence, which means no meaningful inference and leads to an untenable pattern recognition workload." The alternative, he said, is rules-based development rather than classifying objects or trying to establish their movements, "which this domain cannot do due to the complexity".

Other reasons he cited were the use of human guinea pigs, "many of which have to literally be sacrificed to train or test many crash cases"; relying on the real world rather than simulation for most development and testing; and the use of inadequate gaming simulation technology rather than the technology used in aerospace.

The situation is so dire, DeKort said, that almost nothing can be retained from the current approach to autonomy and the only way the autonomous vehicle industry can be saved now is “by doing the opposite of what is being done now”.

In an article appropriately titled "The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now to create this technology", he wrote that this includes switching real-world testing to simulation and then using real-world driving to validate the results, as well as building trust through due diligence and the use of proper simulation technology. "Then you make the case for migrating to the real world." DeKort said it was important to show the public the right progression of due diligence "and take them out of the guinea pig role. This would also increase competition by setting safety bars no one can skip."

He called for the use of Department of Defense simulation technology and modeling “to create a legitimate [real-world] digital twin by resolving real-time latency, loading model timing and active sensor model fidelity issues with gaming-based systems. This will afford the ability to work on end-state scenarios, the most complex and difficult use cases, on day one”.

Asked how long it would take for carmakers to reach Level 4/5 autonomy if his recommendations were adopted, DeKort said: "The right simulation lets them fix the issues if General Learning exists and, even if General Learning is solved and massively lowers the workload, I think it would take a large AV maker like Waymo 10 years."

One comment

  1. Pradeep Kumar P, 22nd April 2023 @ 4:04 pm

    The autonomous vehicle has a long way to go. Too much hype was created. It needs more research and AI training. How the autonomous vehicle will perform in third world countries, where road discipline is not followed, is a good challenge. But before that, it needs to perform in countries where road discipline is followed.
