Tesla Incident Shows the Challenges of Stopping a Driverless Car

The recent traffic stop of a Tesla that allegedly was speeding on Autopilot with the driver asleep signals a new age of traffic safety that is likely to demand new solutions.

Early on the morning of November 30, California Highway Patrol officers reported that a Tesla Model S was going over the speed limit on Highway 101 in Redwood City, Calif., and the driver was asleep at the wheel. An officer concluded that the Tesla was on Autopilot, and the CHP eventually stopped the car by slowing to a halt in front of it. The driver, hotel executive and Los Altos planning commissioner Alexander Samek, was arrested on suspicion of drunken driving.

Autopilot is not designed for hands-free driving, and it is intended to repeatedly warn drivers who stop paying attention. If they don't retake the wheel, it should bring the car to a safe stop and turn on the hazard lights while Tesla contacts the owner. There's been no confirmation yet whether Autopilot was on or whether Samek had his hands on the wheel. But according to news reports, the car didn't come to a stop until after officers spent about seven minutes clearing traffic ahead of the Tesla and slowing in front of it, forcing the system to brake.

On Sunday, Tesla CEO Elon Musk confirmed on Twitter how Autopilot is designed to respond to inattentive drivers. “Looking into what happened here,” he wrote.

Whether Autopilot is a great innovation that prevented a tragedy or a faulty system that should have prevented a freeway chase, it’s clear there are new challenges to face as cars increasingly drive themselves.

Only quick thinking by the CHP officers on the scene stopped a car that might have caused a serious accident, said Mary “Missy” Cummings, a Duke University professor who studies autonomous systems. That should be a warning to both law enforcement and regulators, she told TU Automotive.

“This case really demonstrates we are not ready,” Cummings said in a phone interview.

Other agencies should study the CHP's response to be prepared for cases involving impaired drivers, including medical emergencies, she said. But Cummings argues the National Highway Traffic Safety Administration should be working harder to prevent such incidents in the first place.

If Autopilot was engaged and failed to respond to an inattentive driver, there was a two-part failure, she said.

“This would never have happened if Tesla was doing its job and NHTSA was doing its job,” Cummings said. The agency should set standards for automation systems, including whether they can tell if a driver is distracted or incapacitated, and test them as it does airbags and seatbelts, she said.

That kind of testing would bring new challenges, but the reason NHTSA hasn’t stepped up is that it doesn’t want to, Cummings said. And while the Trump administration is especially anti-regulation, NHTSA under Obama also backed away from the challenge, she added.

On the other hand, Autopilot did enable the CHP to stop the car, said Timothy Carone, a University of Notre Dame professor and autonomous systems expert.

“Great for Tesla that their car was operating in a safe manner and actually slowed down,” Carone said in a phone interview. “If the guy was asleep and drunk, so much the better.”

Regulators should study incidents like this but shouldn’t prescribe specific solutions until the technology is better understood, because it’s developing too quickly, Carone said. Otherwise, they may impose systems that solve some problems while causing others.

One crucial step will be to run through scenarios like this one on test tracks in order to understand how each automation system works, Carone said. It’s one thing to know about one automaker’s system in an easily recognized car, but if another model rolled by with a sleeping driver and CHP tried the same maneuver, its software might not stop for the lead patrol car.

Technology already exists for law enforcement to make driverless or semi-automated cars pull over, and it’s long overdue, both Cummings and Carone said. Digital wireless transmitters in police cars, fire trucks and ambulances could take control of the car and get it off the road with just a few commands, Carone said.

However, it’s taken nearly 20 years for the auto industry to start adopting one such system, DSRC (Dedicated Short-Range Communications), and now automakers are split between that and C-V2X (cellular vehicle-to-everything). The “kill switch” for robot cars may be a long way off.
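Neither expert described a specific protocol, but the idea of a transmitter issuing "a few commands" implies some form of authenticated message that only legitimate emergency vehicles can send. The sketch below is purely illustrative: the command names, fields, and shared-key signing are assumptions, not anything Tesla, DSRC, or C-V2X actually specifies (real V2X deployments use certificate-based signatures under IEEE 1609.2 rather than a shared secret).

```python
import hashlib
import hmac
import json
import time

# Hypothetical shared secret for the demo; real systems would use
# per-agency certificates, not a key baked into every vehicle.
SHARED_KEY = b"demo-key-not-for-production"

def build_pullover_command(vehicle_id: str, issuer: str) -> dict:
    """Patrol-car side: create a signed 'pull over safely' command."""
    payload = {
        "cmd": "PULL_OVER_SAFE",   # hypothetical command name
        "vehicle_id": vehicle_id,  # target vehicle
        "issuer": issuer,          # e.g., a patrol-car transmitter ID
        "issued_at": int(time.time()),
    }
    body = json.dumps(payload, sort_keys=True).encode()
    payload["sig"] = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return payload

def verify_pullover_command(msg: dict, max_age_s: int = 30) -> bool:
    """Vehicle side: accept only a validly signed, recent command."""
    sig = msg.get("sig", "")
    body = json.dumps({k: v for k, v in msg.items() if k != "sig"},
                      sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    fresh = (time.time() - msg.get("issued_at", 0)) <= max_age_s
    return hmac.compare_digest(sig, expected) and fresh
```

The timestamp check guards against a recorded command being replayed later, and the signature check is what would keep such a kill switch from being usable by anyone other than law enforcement — the hard part the industry's DSRC-versus-C-V2X split has yet to standardize.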

Stephen Lawson is a freelance writer based in San Francisco. Follow him on Twitter @sdlawsonmedia.

