Weekly Brief: Automakers Must Stop Marketing ‘Dangerous’ Automated Tech

A Tesla driver traveling westbound on US Highway 64 in Nash County, North Carolina, slammed his Model S into a sheriff’s car last week.

The force of the impact drove the sheriff’s car into a nearby state trooper’s car. The sheriff and state trooper were responding to an earlier collision on the highway. Both were thrown to the ground as the three cars smashed together. Neither was hurt and the driver, Devainder Goli, was in stable enough condition to explain that his vehicle had been in Autopilot when it failed to process the police cars parked on the shoulder. As for Goli, he had been watching a movie on his smartphone.

Stories like these have become so commonplace recently as to seem normal. Another month, another collision, another driver placing more trust in Autopilot than it deserves and endangering other people in the process. Autopilot is a Level 2 semi-autonomous driver assistance system, and Tesla tells drivers they must remain attentive any time it is engaged. Yet the carmaker simultaneously lulls its drivers into overconfidence with its flashy marketing campaign, which calls the $7,000 upgrade that enables Autopilot the “Full Self-Driving” package. The result is what happened in North Carolina last week and what has happened over and over again since 2016, when a Tesla driver using Autopilot T-boned a big rig on a highway in Florida and was decapitated.

NHTSA has now opened 13 investigations into Tesla crashes involving Autopilot. That’s an alarming number, yet no recall has resulted and Tesla CEO Elon Musk has not been reprimanded. If anything, he’s been emboldened to make bigger and bigger promises about the wonders of autonomous technology. His latest promise is that Tesla will deliver full self-driving capabilities by the end of 2020, a misleading claim that continues to indirectly cause accidents like the one in North Carolina last week.

Earlier this month the American Automobile Association (AAA) released a study of five of the top advanced driver assistance systems that claim to safely provide automatic steering and braking. It concluded that the systems were “far from 100% reliable” and found that they disengaged or encountered disruptions on average once every eight miles. That’s once every eight minutes if you’re traveling 60mph, or about once per scene of the movie you might be watching on your smartphone.

The AAA study analyzed BMW’s Active Driving Assistant Professional, General Motors’ Super Cruise, Ford’s Co-Pilot360, Kia’s Highway Driving Assist and Subaru’s EyeSight and called on the industry to get its act together before these technologies go mainstream. Just last week Chevy announced that its refreshed Bolt EV, which is due out in the summer of 2021, will come with Super Cruise. Paul Myles has the details.

AAA can offer warnings from the sidelines, but little is going to happen until governments step in with stricter regulations for ADAS technologies and the way they are marketed. Don’t get your hopes up. The US DOT has suggested that carmakers and tech companies are responsible enough to regulate themselves when it comes to semi-autonomous and autonomous technology. Likewise, the UK government is poised to legalize hands-free use of lane-keeping systems on its highways at speeds up to 70mph, a change that would be powered by Level 2 and Level 3 systems like Autopilot. The good news is that autonomous technology is rapidly improving with time. The bad news is that, in the meantime, distracted drivers will be our primary line of defense.

