Driverless tech slogs through legal quagmire

Legal ramifications of robot killers explored by Siegfried Mortkowitz.

The crash of an Uber self-driving car in Tempe, Arizona, which killed a 49-year-old pedestrian during a test of the company’s technology, did more than inspire a number of companies in the ecosystem, Uber among them, to suspend their autonomous-tech testing programmes; it also pushed to the forefront issues of legal and financial liability that had not been formally addressed.

At first glance, the issues in this specific accident were relatively straightforward. As described by Stanford Law School Professor Robert Rabin: “Under conventional tort law principles, if the safety driver failed to exercise reasonable care in avoiding the accident, Uber would be responsible for the driver’s negligence. If the automated features of the AV failed to note the presence of the victim, the manufacturer of the vehicle, as well as Uber, could be held responsible under product liability principles. If the victim was walking in the roadway and her presence was obscured by darkness at the late hour, she might be found partially at fault. At the extreme, if her conduct made the accident ‘unavoidable’, she might be fully responsible, but this seems unlikely.”

In this case, video footage of the crash showed the on-board technician not looking at the road when the victim crossed it. In addition, a number of experts in autonomous technology who have viewed the video have been quoted by news outlets as saying that the car’s sensor system should have detected the victim and reacted.

“The victim did not come out of nowhere,” Bryant Walker Smith, a University of South Carolina law professor who studies autonomous vehicles, told the Associated Press. “She’s moving on a dark road but it’s an open road, so LiDAR and radar should have detected and classified her [as a human]… This is strongly suggestive of multiple failures of Uber and its system, its automated system and its safety driver.”

The company that provides sensors for Uber’s self-driving vehicles, Velodyne LiDAR, immediately placed blame for the crash on the ride-hailing company. Velodyne’s president, Marta Thoma Hall, told Bloomberg Businessweek that she could not understand why the car didn’t detect the pedestrian. “We are as baffled as anyone else. Certainly, our LiDAR is capable of clearly imaging Elaine [the victim] and her bicycle in this situation. However, our LiDAR doesn’t make the decision to put on the brakes or get out of her way.”

Apparently, Uber executives agreed, and quickly settled with the victim’s family. The company might also have been eager for rapid closure after a Reuters report stating that it had reduced the number of LiDAR sensors on its test vehicles when it switched from the Ford Fusion to Volvo XC90 SUVs of the type involved in the fatal accident.

However, if the ongoing investigation by the National Transportation Safety Board (NTSB) finds that the crash was caused by a failure of one of Velodyne’s sensors, it is likely that Uber will sue its sensor supplier to recover part or all of its settlement with the victim’s family.

There is no legal precedent because the technology is in its infancy. However, because autonomous cars are expected to be allowed on British roads by 2021, the UK is currently drawing up driving laws and insurance regulations to cover autonomous vehicles.

In launching a three-year project, the government said that the Law Commission of England and Wales and the Scottish Law Commission will “examine any legal obstacles to the widespread introduction of self-driving vehicles and highlight the need for regulatory reforms”. One goal of the review will be to establish who counts as the driver, or responsible person, when a vehicle is operating in autonomous mode.

The review will also examine where to place civil and criminal liability when there is “some shared control in a human-machine interface”, whether to introduce new criminal offences for “novel types of conduct and interference”, and what impact driverless cars will have on other road users, such as pedestrians and other vehicles, and how they can be “protected from risk”.

In addition, in October 2017 the British government introduced the Automated and Electric Vehicles Bill to ensure that all parties are covered in an accident involving a self-driving vehicle. The bill seeks to extend compulsory motor insurance to include vehicles able to drive in autonomous mode.

This will become an especially thorny issue when Level 3 autonomous cars are launched, because Level 3 covers driving situations in which control of the vehicle is shared between the car’s autonomous system and the driver: the car drives itself under limited conditions, but the human must be ready to retake control when prompted. A March 2017 report by the House of Lords science and technology committee cautioned that partly self-driving cars could make drivers “complacent and overly reliant on technology” and warned that for Level 3 vehicles the risk may be too great to tolerate.

