Locating The Car's Brain

BMW and Mercedes-Benz have semi-autonomous offerings on the market, and seven other OEMs are on track to have highly automated systems that include traffic jam assist and automated highway driving by 2020, according to Frost & Sullivan.

No one really wants to use the A-word, that is, "autonomous." Ask the folks who are plotting their roadmaps when we'll be able to fall asleep and let the car take us where we're going, and they respond, "Never – except maybe for niche situations." They're much more comfortable with the current "levels" concept – even though they don't always agree on which level is which.

"Automakers are still trying to figure out what autonomous will mean in the near future, at the five-year and ten-year marks," says Kumar Krishnamurthy, a partner in the IT practice of Strategy&. Some automakers are moving toward driverless cars, driven by external pressures such as the hype around Google’s well-publicized work on self-driving cars. Others, he says, are taking more targeted approaches, focusing on one feature at a time, such as automatic parking. And at least one automaker he's talked to is betting that drivers will always want to drive. "Where those strategies are going to land is unclear," he says.

General Motors' Super Cruise will be introduced in one 2017 Cadillac model, making GM a case in point for this iterative, feature-by-feature introduction process.

"We're doing Super Cruise as a big step toward autonomous driving," says John Capp, director of global safety strategy for product engineering at General Motors. The foundation was the driver safety package that's now almost standard in all Cadillacs. Capp says that Super Cruise builds on some of the sensors, radars and cameras by letting them communicate with each other. "It's a realistic next step."

GM's Super Cruise provides an illustrative example of how ADAS features are merging. According to Capp, much of the input from the sensors is going to a central computer powered by Nvidia processors. This central computer compares data from different sensors to improve reliability.
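The cross-checking Capp describes can be sketched as a simple agreement rule: an obstacle is confirmed only when independent sensors report it at roughly the same range. The sensor names, fields and threshold below are illustrative assumptions, not GM's actual implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    """A single sensor's report of an object ahead."""
    distance_m: float  # estimated distance to the object, in meters

def confirm_obstacle(radar: Optional[Detection],
                     camera: Optional[Detection],
                     max_disagreement_m: float = 2.0) -> bool:
    """Confirm an obstacle only when both sensors report it and their
    distance estimates roughly agree, suppressing single-sensor ghosts."""
    if radar is None or camera is None:
        return False
    return abs(radar.distance_m - camera.distance_m) <= max_disagreement_m

# A radar "ghost" with no camera confirmation is rejected;
# two agreeing detections are confirmed.
print(confirm_obstacle(Detection(30.0), None))             # False
print(confirm_obstacle(Detection(30.0), Detection(29.2)))  # True
```

The point of the sketch is the redundancy: either sensor alone can misfire, but a false alarm has to survive comparison against an independent measurement before the system acts on it.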

On the other hand, Capp says, "There's also a trend where some of these ADAS features have become simplified and commoditized, so some of the brains will be on the individual sensors. This will enable us to deploy them more broadly and put them on more vehicles."

And then, there are Tesla and Google. Tesla has begun shipping cars with the hardware for what it calls "autopilot," with the aim of delivering the functionality later over the air. Tesla autopilot will include active safety features to avoid collisions from the front or sides, as well as lane departure avoidance and safe lane changing with activation of the turn signal. Even the hyperbolic Tesla specifically says that this will not enable driverless cars.

Google, as always, is going its own way; Frost & Sullivan analyst Prana T. Natarajan expects the search advertising company to release a fully autonomous aftermarket system in 2018.


Locating the brain

According to Natarajan, automakers that are actively developing highly automated driving systems – aside from Google – are using nearly identical arrays of sensors. The differentiation, he wrote in a recent report, is likely to be in reducing the cost of sensors and refining the automated driving experience. Eventually, more sophisticated algorithms will reduce the number of sensors needed.

OEMs are taking different approaches as to where those algorithmic "brains" will be: on individual sensors, within sensor modules or in a central computer.

Delphi, which needs to be agnostic in its approach to sensor fusion in order to serve its variety of customers, offers a module that handles sensor fusion within the component. Its Radar and Camera Sensor Fusion System (RACam) integrates radar sensing, vision sensing and data fusion into the module, enabling a suite of active safety features that carmakers can choose from, including adaptive cruise control, lane departure warning, forward collision warning, low speed collision mitigation, and autonomous braking for pedestrians and vehicles. It recently announced that the Volvo XC90 will use RACam for its advanced driver safety package including automatic braking at intersections.

John Absmeier, director of Delphi's Silicon Valley Lab, says that RACam is simply a packaging choice, not a technology strategy: RACam also can facilitate fusing sensor data in a more central computer. He adds that one strategy is not better than the other. "Each of our customers has their own vision and roadmap for their architecture. We have to be flexible in that regard," Absmeier says – although he adds that providing this in a single package makes it more cost-effective and easier to integrate with the rest of the vehicle.

Bosch is another tier 1 that is putting some of the brains on sensors. In Bosch's 2015 Chrysler crash prevention package, the long-range radar sensor is the centerpiece of the automatic distance and speed control systems. The radar contains two levels of processing, one for the sensor data and another for the actual functions, including adaptive cruise control and forward collision warning.

Still, "there's no single answer to the sensor fusion question," explains Jan Becker, director of engineering, automated driving, chassis systems control for Bosch. For example, some Audi models have Bosch systems with two radar sensors; the data from both is integrated via one of the radars. For upcoming systems, this data integration will typically remain a function of the radar. Becker says, however, that Bosch currently foresees higher automation requiring a significant architectural change: "For those systems, we are developing future electronic control units that will do the processing of all the sensor information to provide 360 degrees of visibility."

Aggregating data from multiple sensors and then processing it within a central computer to provide an accurate picture of conditions is the third approach. Danny Shapiro, Senior Director of Automotive at Nvidia, says this approach increases accuracy and reliability by eliminating false positives, especially with the addition of vision recognition software. But this approach takes immense computing power. Nvidia's idea is one "supercomputer" constantly crunching data from the car's myriad sensors. "Having the ability to centralize that creates a more reliable system and a more efficient system from a total cost perspective," he says.
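One textbook way a central computer combines overlapping measurements is inverse-variance weighting, where noisier sensors get proportionally less influence on the fused estimate. This is a generic illustration of the accuracy gain Shapiro describes, not Nvidia's actual pipeline; the sensor values and variances are made up.

```python
def fuse_estimates(measurements):
    """Fuse (value, variance) pairs from several sensors into a single
    estimate via inverse-variance weighting. The fused variance is never
    worse than the best individual sensor's, which is the accuracy gain
    from centralizing the data."""
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    value = sum(v * w for (v, _), w in zip(measurements, weights)) / total
    return value, 1.0 / total

# Hypothetical example: radar gives a precise range (low variance),
# the camera a noisier one; both see an object around 25 m ahead.
fused, var = fuse_estimates([(25.0, 0.25), (26.0, 4.0)])
# fused ≈ 25.06 m, pulled toward the more precise radar;
# var ≈ 0.24, slightly better than the radar alone.
```

Each added sensor tightens the fused estimate a little, which is why aggregating everything in one place can beat any single sensor, at the cost of the computing power the article describes.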

He does not mean that there will be a single computer handling all of the car’s functions, however, and infotainment will probably not merge with advanced-safety or autonomous features. "There will be a blurring of the lines," Shapiro says. For example, the ADAS system's front-facing camera with computer vision might read speed limit signs and display that information on the instrument cluster or head-up display. In such a case, the infotainment system is part of the driver-assist system.

According to the Nvidia executive, it will be a while before a more open infotainment system is directly linked to vehicle controls. That's partly the legacy of the way automakers are structured, with different departments handling these functions. But there are also safety and security concerns, he says. The infotainment system may have weaker security and could be used as an entry point for hackers gaining access to vehicle controls remotely.



