NTSB Report on Uber Crash May Shake Up AV Testing

Last week’s preliminary National Transportation Safety Board report on the fatal crash involving an Uber autonomous vehicle may have a deeper effect on AV testing than on the technology itself.

State and local leaders have already started to take a harder look at how companies test self-driving cars, and they’re likely to start reining in those practices now, some industry experts say.

The report didn’t draw any conclusions about what caused Uber’s modified Volvo SUV to hit Elaine Herzberg, a pedestrian walking her bike across a road in Tempe, Ariz., on the night of March 18. But it noted that Uber had disabled the SUV’s built-in emergency braking system to prevent erratic behavior by the car, relying instead on a backup driver to watch the road and take over in emergencies.

The NTSB’s full investigation continues.

The findings say less about how well AVs work and more about how companies can keep people safe as they work out the kinks, said Timothy Carone, an autonomous systems expert who teaches at the University of Notre Dame. The crash, he said, makes the division of driving tasks between automated systems and human safety drivers look like a more glaring problem than ever.

“It forces everyone to look at the human-car interaction,” Carone said. “How does the handoff occur? That’s always been a weak spot.”

Humans are bad at supervising automation, because it tends to be a tedious job, said Sam Abuelsamid of Navigant Research.

“If you have to ride around in this vehicle for hour after hour, it’s highly likely to fail,” he said.

Some cities and states, including Arizona, once allowed AV testing without asking many questions or laying down rules about things like handoffs. That’s changing: Arizona had already banned Uber from testing in the state before the company announced last week that it was shutting down its whole testing operation there. And Pittsburgh, where Uber hopes to restart its program soon, has gone from welcoming the company with open arms to laying down conditions such as a 25 MPH speed limit on AVs.

That trend may accelerate.

“I would expect some states are going to revisit the hands-off approach they’ve taken until now,” Abuelsamid said.

While the federal government regulates vehicle safety, states license drivers and make sure they’re ready to hit the road. Evaluating self-driving systems is likely to remain under their control, given the anti-regulation slant of the current administration in Washington, Abuelsamid said. Now states are likely to want more conditions, more reporting and more third-party verification that AV systems work before they’re allowed on public roads, he said. Until then, expect to see more testing on closed tracks.

Companies may invest more in simulation testing, too, an outcome that Nvidia CEO Jensen Huang predicted shortly after the crash in March.

Simulation might be the best way for self-driving systems to learn more complex responses to things like objects in the road, Carone said. If a collision can’t be avoided outright, the software should learn how to minimize harm — in this case, possibly by turning so the car hit just the front of the bike. That kind of fine tuning isn’t possible yet, and it will take place partly through trial and error, he said.

“You have to allow the car to fail,” Carone said. “Hopefully, that’s done in simulation.”
