Driverless cars can’t keep their robot heads in the Cloud

The humble on-board electronic control unit (ECU) has been handling data in autos for quite some time now, but it could be getting a new lease of life as the industry races towards increasing levels of autonomous function. IT designers now have to consider whether the old-school black box is up to handling the deluge of data that vehicles bristling with sensors can generate.

New paradigm

According to Josh Hartung, CEO at PolySync, designers need to start thinking of autonomous vehicle computing systems as a “completely new paradigm” with the “connectivity of smartphones, the processing requirements of supercomputers and the safety-criticality of the space shuttle” that demands a rebuilding of the understanding of safety from initial principles.

“No matter how much amazing technology these vehicles are leveraging, cognitive AI, incredible LiDAR and massive GPUs, without a solid platform, it's a skyscraper built on [land]fill,” he says. With software now defining more and more of the functionality of cars, Hartung also believes the most important requirement of an ECU platform is a cast-iron assurance that it can't do anything “unsafe” before it can “make the leap into the most sensitive, safety critical systems”.

“Think about your phone or personal computer. When was the last time it crashed or locked up at an inconvenient time? In autonomous vehicles, these software failures mean real-world crashes. The main requirement of an ECU platform is that it should guarantee the underlying system is always able to respond. Once we've solved that problem we can think about how the vehicle navigates left hand turns or stops for pedestrians,” he says.
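One common way to approach the guarantee Hartung describes, that "the underlying system is always able to respond", is a watchdog: a supervisor that expects regular heartbeats from each component and drops into a safe fallback state when one goes quiet. The sketch below is purely illustrative (the class, deadline value and fallback behaviour are assumptions, not PolySync's design):

```python
import time

class Watchdog:
    """Minimal heartbeat monitor (illustrative sketch).

    If a supervised component misses its response deadline, the
    platform latches into a safe mode (e.g. commanding a controlled
    stop) rather than waiting indefinitely on a hung process.
    """

    def __init__(self, deadline_s):
        self.deadline_s = deadline_s          # worst-case response budget
        self.last_beat = time.monotonic()
        self.safe_mode = False

    def heartbeat(self):
        """Called by the supervised component on every healthy cycle."""
        self.last_beat = time.monotonic()

    def check(self):
        """Called by the supervisor; latches safe mode on a missed deadline."""
        if time.monotonic() - self.last_beat > self.deadline_s:
            self.safe_mode = True
        return self.safe_mode
```

A real automotive watchdog would sit in hardware or a certified RTOS, but the principle is the same: safety comes from the platform's ability to detect silence, not from the cleverness of the code being supervised.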

Elsewhere, Gion Baker, global head of automotive at Vodafone Automotive, points out that embedded databases are not necessarily within the development roadmap of vehicle manufacturers.

Instead, he claims that vehicle architecture is now evolving from the closed, carmaker-controlled system to a more open approach to high-speed, efficient data management that includes built-in, future-proof connectivity, over-the-air (OTA) update and diagnostics capabilities, as well as robust firewalls and security settings. “An upgradable, scalable architecture will support mission critical applications, as well as free capacity and memory resources for third party app development, offering optimised connectivity to user interfaces and enhanced customer experience,” he says.

Clouding the issue?

Moving forward, Hartung calls on designers to think of an autonomous vehicle as a “hierarchy of time”, where safety-critical operations like decision making, path planning and control that tend to consume the most data, via raw images, LiDAR points and GPS positioning, all need to happen reliably and consistently. “You really need massive on-board power to be able to process all that data and make good, safe decisions every second or so. In terms of volume, we often see 15-20 gigabytes per second of data, so it isn't really feasible to think about streaming that kind of data to the cloud,” he says.
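A back-of-envelope calculation shows why streaming at the volumes Hartung quotes is a non-starter. Taking the low end of his 15-20 GB/s range and an optimistic real-world LTE uplink figure (the 50 Mbit/s here is an assumption for illustration, not a quoted number):

```python
# Illustrative arithmetic only: compare the quoted on-board sensor
# data rate against a generous cellular uplink.
sensor_rate_gbytes_s = 15                              # low end of the quoted 15-20 GB/s
sensor_rate_mbits_s = sensor_rate_gbytes_s * 8 * 1000  # GB/s -> Mbit/s

lte_uplink_mbits_s = 50                                # assumed optimistic LTE uplink

shortfall = sensor_rate_mbits_s / lte_uplink_mbits_s
print(f"Sensor output: {sensor_rate_mbits_s:,} Mbit/s; "
      f"an LTE uplink carries roughly 1/{shortfall:,.0f} of that")
```

Even at the optimistic end, the link would carry a fraction of a percent of the sensor stream, which is why the heavy processing has to stay on the vehicle.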

That said, Hartung still believes there is a role for the cloud in handling other operations, like fleet learning, map updates, traffic updates and weather, that happen a little more slowly, perhaps over ten seconds up to a few minutes. 

“After you've processed all that data on-board the vehicle it's easier to pass up to the cloud.  You could send things like your location, your speed and the name of your driver.  None of that takes massive bandwidth but it's incredibly valuable for many different applications,” he says. “The key is that none of these applications can be safety critical since we can't trust the car to always be connected.  The cloud is really useful where it brings together lots of things but it doesn’t make sense for some things on the autonomous car.”
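The kind of post-processed record Hartung mentions, location, speed, driver identity, really is tiny compared with the raw stream. A hypothetical telemetry payload (all field names and values here are invented for illustration) makes the contrast concrete:

```python
import json

# Hypothetical telemetry record of the sort described: a handful of
# derived values sent to the cloud, not the raw sensor data.
record = {
    "vehicle_id": "demo-001",   # assumed identifier, for illustration
    "lat": 45.5231,
    "lon": -122.6765,
    "speed_kph": 38.0,
    "driver": "J. Doe",
}
payload = json.dumps(record).encode("utf-8")
print(f"{len(payload)} bytes per update")  # tiny next to gigabytes/s of raw sensor data
```

A payload on the order of a hundred bytes, sent every few seconds, fits comfortably in any cellular link, which is what makes fleet-wide analytics feasible even when raw-data streaming is not.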

Baker agrees that real time information needs to be guaranteed for “mission critical” applications and believes that digital high-speed architectures based on LTE-V and 3GPP can ensure that information is “made available at high speed and [with] high reliability”. He also believes that “context related data”, covering things like traffic, weather or road works, could be processed in the cloud and help with big data analysis.

Meanwhile, Maxime Flament, head of department – connectivity and automation at ERTICO, believes that “plenty of data” will be collected, some of which will have to be treated “at the edge” (in the vehicle itself), some “at the mobile edge” (in the nearest base station) and some in the cloud. “The best [system] will be the one which can put its processing power and gather its intelligence at the right place,” he adds.

Delivering updates

Looking ahead, Hartung stresses that autonomous vehicles will need OTA updates purely in order to “be considered safe at all”. In his view, although the task of enabling a computer to drive a vehicle well about 90% of the time “isn't that hard”, managing the remaining 10% is “incredibly difficult” because this time is mostly made up of “edge cases” – situations that happen rarely but which could “really throw the algorithms off and lead to unsafe behaviour”.

“When these vehicles are deployed on a large scale, they will be running into edge cases on a regular basis.  The combination of remote connectivity and OTA updates give engineers the ability to modify the software so it behaves better the next time it sees a similar edge case.  The OTA update is actually a mechanism to make the vehicle safer on a regular cadence,” he says. “Since some edge cases are super rare, it's unlikely we'll ever see them in testing, so it takes a fleet to find all the edge cases.  Google's been trying to find them with a private test fleet for almost a decade and they're still finding new ones.”

