Even Driving a Billion Miles Can’t Prepare Driverless Cars for Everything

The same refrain is repeated at every trade show: “Autonomous vehicles must drive a billion miles before deployment.”

This assumes that there truly is safety in numbers and that driverless cars can mimic, even surpass, human driving abilities just by spending more time on the road. Philip Koopman, an associate professor at Carnegie Mellon University and co-founder/chief technologist of Edge Case Research, isn’t convinced by that assumption.

He thinks it is time for start-ups and carmakers to consider a strategy other than driving as many miles as possible. “A billion miles isn’t enough because brute force testing doesn’t make you safe,” said Koopman. “No matter how many billion miles you test, there’s always going to be something weird you haven’t seen and, if you’re relying on only dealing with the things you’ve seen before, every time there’s something you haven’t seen, all bets are off.”

Koopman speaks from experience. His personal automotive anomalies include an engine that fell out from under his vehicle while he was driving, brakes that failed when both hydraulic loops gave out, a tire blowout, and a windshield shattered by a falling icicle. Worst of all, he endured a life-threatening scare when a blue tarpaulin fell from a gravel truck crossing an overpass, floated down to the highway below and covered his entire vehicle.

“If you have autonomy that’s never been taught what to do with a blown-out tire, it’s going to have a problem,” said Koopman. “Much of current car safety is based on the belief that the human will do the right thing. If you don’t teach autonomy that, now your underlying car isn’t safe anymore because you don’t have a human there to do the right thing.”

Koopman doesn’t know how many other drivers have experienced falling tarpaulins, failing brakes or lost engines, but he can’t imagine he’s the only one. Even so, he thinks these and other rare problems need to be addressed, even if they are one-in-a-million occurrences. “You can’t be perfectly safe and there will always be something but if three major things have happened to me personally and your autonomous vehicle can’t do any of them, I could have been killed three times. The really weird stuff happens a lot more frequently than you think. If you have 100 million cars on the road, the ‘once every million miles’ stuff happens several times a day!”
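That fleet-scale arithmetic is easy to check. Below is a minimal back-of-envelope sketch: the 100-million-car fleet comes from Koopman’s quote, while the 30 miles per car per day is an assumed figure for illustration.

```python
# Back-of-envelope arithmetic for rare events at fleet scale.
fleet_size = 100_000_000        # cars on the road (from Koopman's quote)
miles_per_car_per_day = 30      # assumed average daily mileage

fleet_miles_per_day = fleet_size * miles_per_car_per_day  # 3 billion miles/day

for miles_per_event in (1e6, 1e8, 1e9):  # one event per N miles
    events_per_day = fleet_miles_per_day / miles_per_event
    print(f"one-in-{miles_per_event:,.0f}-mile events: "
          f"about {events_per_day:,.0f} per day fleet-wide")
```

Under those assumptions, even an event seen once in a billion miles occurs roughly three times a day across the fleet, and once-in-a-million-miles events occur thousands of times a day.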

Driving vs. structured argument

All of this comes down to Koopman’s point that endless AV miles are not enough to build a better vehicle. He does not believe a car will ever have knowledge of every possible scenario. Instead, he would like to see automakers make a structured safety argument detailing what those miles actually demonstrate and what the car will do when it encounters obstacles it has never seen.

“There’s a thing I call ‘heavy tail distribution,’” said Koopman. “If you have things happening every 100 million miles but there’s millions of them, even if you fix something, you’ll never see it again and it will be something different next time. The airplane people figured this out. At some point every crash was something different, so fixing it didn’t stop the next one. That’s when the FAA switched from fixing stuff that they found out about to proactively being much more aggressive, saying, ‘This could go wrong, we’ve never seen it happen but it could, so we’re going to fix it before it happens’. That’s where you need to be to reach the huge numbers between mishaps.”
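The dynamic Koopman describes, where fixing what you have already seen barely helps, can be illustrated with a toy simulation. This is a sketch under assumed numbers, not Koopman’s model: it supposes a million distinct, equally rare failure modes and asks what fraction of observed failures are ones never seen (and therefore never fixed) before.

```python
import random

random.seed(0)
num_modes = 1_000_000      # assumed count of distinct rare failure modes
observed_failures = 10_000 # failures seen during a long test campaign

fixed = set()              # modes already observed and patched
novel = 0

for _ in range(observed_failures):
    mode = random.randrange(num_modes)  # each failure draws one rare mode
    if mode not in fixed:
        novel += 1
        fixed.add(mode)    # fix it; this exact mode never recurs

print(f"{novel / observed_failures:.1%} of observed failures were brand new")
```

With that many distinct modes, nearly every failure is novel, so patching past failures barely reduces the overall rate. That is the dynamic behind Koopman’s point about the FAA’s shift toward fixing hazards proactively rather than reactively.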

Teaching a new machine old tricks

Machine learning is thought to be the key to building a better automobile, but machines currently lack the logical reasoning they will need to replace human drivers. Perception is a big part of this, and it’s also one of the biggest challenges.

“Machine learning is not traditional software that you can just take a human and look at it and know if it’s right or not,” said Koopman. “It doesn’t work that way. Think about traffic laws: when is it okay to put two wheels over the center line? Right now the answer is, ‘Don’t be a jerk.’ It’s hard to teach that to a machine.”
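To see why that rule is hard to teach, consider what a hand-written version would look like. The function below is hypothetical and deliberately naive; every name and flag in it is invented for illustration, and the point is how much context it leaves out.

```python
def may_cross_center_line(obstruction_ahead: bool,
                          oncoming_traffic: bool) -> bool:
    """Hypothetical, naive encoding of 'don't be a jerk'."""
    # Real judgment also depends on visibility, road width, cyclists,
    # local custom, how long the lane stays blocked, and more; none of
    # that fits into two boolean flags.
    return obstruction_ahead and not oncoming_traffic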

Eventually, carmakers will have to develop a master list of potential scenarios, problems and hazards, even natural disasters. “What about frozen bridges and rain?” Koopman suggested. “What about sandstorms, smoke, wildfires and tornadoes? At some point there has to be a master list and all the car companies should be on the same page.”
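One hypothetical shape such a shared list could take is a structured catalogue of hazard scenarios. The sketch below is illustrative only; the field names and entries are assumptions, not an industry standard.

```python
from dataclasses import dataclass

@dataclass
class HazardScenario:
    name: str                # short label for the hazard
    trigger: str             # conditions that produce it
    expected_response: str   # what the vehicle should do

# Two illustrative entries drawn from hazards mentioned in this article.
catalogue = [
    HazardScenario("tire blowout",
                   "sudden loss of pressure at speed",
                   "hold steering steady, decelerate gently, pull over"),
    HazardScenario("tarpaulin covers windshield",
                   "debris falls from a vehicle or overpass",
                   "treat sensors as blinded, slow and stop safely"),
]
```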

