ADAS is the driverless car’s eyes on the future

With the world fixated, as it is, on driverless cars, it is a little odd that so little attention gets paid to their constituent technologies, known collectively as ADAS, or advanced driver assistance systems.

While the prophets of the automotive space predict driverless cars will be the most disruptive technology ever to come down the pike, the same somehow cannot be said of adaptive cruise control, lane departure warning, automatic braking, radar, lidar, ultrasonics and all the other ADAS technologies without which the autonomous proposition can never be fulfilled. Yet without the vital spark of autonomy, they are dismissed as too incremental and evolutionary to be considered disruptive.

The fact that many, if not most, new cars in Europe and North America feature at least some driver assistance technology almost doesn't matter, since most drivers are, at best, only marginally aware of its presence in their vehicles. That is odd and ironic, but perhaps not a bad thing, since it gives the technologies something they definitely need: the space and time to continue their incremental, evolutionary path without attracting any great amount of legislative or regulatory pressure.

According to Jeremy Carlson, ADAS analyst at IHS, the ADAS market segment is being driven mainly by two things. “The first is that the rate of innovation in ADAS technology is so far ahead of the automotive product cycle. This is causing the technology to spread downward from what were once just high-end vehicles to less expensive models. It's also driving costs down and making the technology affordable.”

ADAS is increasingly about sensors, including those already mentioned plus video cameras, together with the software and algorithms that fuse all of their inputs, along with data from the vehicle's acceleration, braking and handling systems, into a coherent and usable 'picture' of the surrounding environment that can be used to avoid accidents. To propel this “sensor-fusion revolution” and bring ADAS into the mainstream, a number of global efforts have been set up by the automotive industry to push for harmonisation of technology standards. Chief among these is the “79 GHz Project”, an international industry group that has been pushing governments worldwide to allow automotive radar and vehicle-to-vehicle communications to operate in the 79 GHz spectrum band, which had previously been allocated to radio astronomy and certain radiolocation services. Until now, automotive radar has had to operate at lower, less-than-optimal frequencies.
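To make the fusion idea concrete, here is a minimal sketch, in Python, of one common approach: combining a radar range reading with a noisier camera-derived distance estimate using inverse-variance weighting. The sensor names, noise figures and weighting scheme are illustrative assumptions, not a description of any particular production ADAS stack.

```python
# A minimal sketch of sensor fusion: combine a radar range measurement with a
# camera-derived distance estimate for the same object ahead. The noise figures
# below are illustrative assumptions only.

from dataclasses import dataclass


@dataclass
class Measurement:
    distance_m: float   # estimated distance to the object ahead, in metres
    variance: float     # sensor noise variance for this estimate (m^2)


def fuse(measurements: list[Measurement]) -> Measurement:
    """Inverse-variance weighted fusion: trust the less noisy sensors more."""
    weights = [1.0 / m.variance for m in measurements]
    total = sum(weights)
    fused_distance = sum(w * m.distance_m for w, m in zip(weights, measurements)) / total
    fused_variance = 1.0 / total  # the fused estimate is less noisy than any single sensor
    return Measurement(fused_distance, fused_variance)


if __name__ == "__main__":
    radar = Measurement(distance_m=42.3, variance=0.25)   # radar: good range accuracy
    camera = Measurement(distance_m=40.8, variance=4.0)   # camera: noisier distance estimate
    fused = fuse([radar, camera])
    print(f"fused distance ~ {fused.distance_m:.1f} m, variance {fused.variance:.2f}")
```

In a real vehicle this step would be repeated continuously and tied into tracking and prediction, but the principle is the same: weigh each sensor by how much it can be trusted and produce one estimate the braking and steering systems can act on.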

Autonomous vehicles will depend on an array of sensors all working together. While the costs of video cameras and radars have been driven down to affordable levels, the same cannot be said for laser-based lidar, infrared or ultrasonics. Anyone who has seen Google's driverless vehicles has probably noticed the overly large “hatbox” assembly mounted on the roof. That is the lidar, built by Velodyne, which uses several dozen lasers, each set at a different angle and whirling at high speed, to create a highly detailed 3D picture of the vehicle's surroundings. It has been key to proving the viability of the autonomous vehicle proposition. What almost no one realises is that the Velodyne lidar unit costs several times more than the vehicle itself. For lidar to be practicable, the cost has to be brought way down, and this is probably one of the reasons the horizon for driverless car deployment is permanently five to ten years away.
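As a rough illustration of how a spinning, multi-laser lidar builds that 3D picture, the sketch below converts (azimuth, elevation, range) returns into Cartesian points. The channel count, angles and ranges are made-up values for illustration, not Velodyne specifications.

```python
# How a spinning lidar builds a point cloud: each laser has a fixed elevation
# angle, the head rotates in azimuth, and every (azimuth, elevation, range)
# return becomes an x, y, z point in the sensor's frame.

import math


def polar_to_cartesian(azimuth_deg: float, elevation_deg: float, range_m: float):
    """Convert one lidar return into a 3D point."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)  # forward
    y = range_m * math.cos(el) * math.sin(az)  # left/right
    z = range_m * math.sin(el)                 # up/down
    return (x, y, z)


if __name__ == "__main__":
    # One full rotation sampled every 10 degrees, with four laser channels at
    # different elevation angles (a real unit has far more channels and far
    # finer azimuth steps).
    elevations = [-15.0, -5.0, 5.0, 15.0]
    point_cloud = []
    for azimuth in range(0, 360, 10):
        for elevation in elevations:
            simulated_range = 20.0 + 0.01 * azimuth  # stand-in for a real return
            point_cloud.append(polar_to_cartesian(azimuth, elevation, simulated_range))
    print(f"built a point cloud of {len(point_cloud)} points")
```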

“When the market reaches millions of high-resolution lidars, the unit cost will drop a great deal, but they will still not be as cheap as cameras,” says software architect and ADAS pioneer Brad Templeton. “The first commercial units will cost several thousand dollars. After that, they'll probably stay at about a thousand, but it will probably be years before they go below that.” Templeton added that he knows of several projects under way among both Tier One suppliers and start-ups to develop lower-cost lidars. Along with the advances in hardware come advances in software and processing. Sensor fusion depends heavily on machine learning, which requires algorithms and data processing at a scale that has not yet been reached.
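A back-of-envelope calculation hints at that scale. The figures below are illustrative assumptions rather than vendor specifications, but they show how quickly the raw sensor feeds that must be fused and interpreted add up.

```python
# Rough estimate of the raw data rate a sensor-fusion stack has to keep up with.
# All figures are illustrative assumptions, not vendor specifications.

lasers = 64                 # laser channels on a high-end spinning lidar
rotation_hz = 10            # revolutions per second
azimuth_steps = 2000        # firings per laser per revolution (~0.18 degree steps)
bytes_per_point = 16        # x, y, z, intensity stored as 32-bit floats

points_per_second = lasers * rotation_hz * azimuth_steps
lidar_bytes_per_second = points_per_second * bytes_per_point

# Add a handful of uncompressed cameras and radars feeding the same stack.
camera_bytes_per_second = 4 * (1280 * 720 * 3 * 30)   # four 720p colour cameras at 30 fps
radar_bytes_per_second = 5 * 100_000                   # five radars, ~100 kB/s each

total = lidar_bytes_per_second + camera_bytes_per_second + radar_bytes_per_second
print(f"lidar:   {lidar_bytes_per_second / 1e6:.1f} MB/s")
print(f"cameras: {camera_bytes_per_second / 1e6:.1f} MB/s")
print(f"radar:   {radar_bytes_per_second / 1e6:.1f} MB/s")
print(f"total:   {total / 1e6:.1f} MB/s to be fused and interpreted in real time")
```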

The second factor Carlson identified as shaping ADAS development is the American and European New Car Assessment Programmes, or NCAP. Both programmes give higher ratings to new cars with ADAS technology included in their factory configurations. This is an important boost for ADAS, since it helps increase consumer acceptance and understanding of driver assistance technology.

A 2015 McKinsey connected-car survey polled 5,500 new car buyers around the world and found that while 70% of respondents said they were aware of the ADAS features on their cars, the percentage who had actually tried them out was often less than 30%. On the other hand, those who had used them reported a high level of satisfaction and said that ADAS was something they would definitely buy again.

