Weekly Brief: Latest Tesla Fatal Crash Demands Tighter Regulation

A fatal accident involving a Tesla Model S has once again raised concerns about the safety risks posed by self-driving technology.
The accident happened in the suburbs of Houston, Texas, where two men, ages 59 and 69, went out for a drive in a 2019 Model S around 11:25pm on April 17. Their wives told police the men were talking about Tesla’s Autopilot feature as they left. Minutes later, the Model S sped through their tiny cul-de-sac, veered off the road, traveled about 100 feet and slammed into a tree, igniting a blaze that took the fire department four hours and 23,000 gallons of water to extinguish, owing to the extreme flammability of the vehicle’s lithium-ion battery. Police found no one behind the wheel: one man was in the back seat, the other in the front passenger seat. Both died in the vehicle. Accident reconstruction specialists concluded that neither man had been driving at the time of the collision.
The plot thickened last Monday when Elon Musk tweeted: “Data logs recovered so far show Autopilot was not enabled & this car did not purchase FSD [full self-driving]. Moreover, standard Autopilot would require lane lines to turn on, which this street did not have.” Musk has a history of projecting an air of impunity and impudence whenever his technology goes wrong or is called into question. The local police force has stuck by its initial findings and requested further information from Tesla. Investigators from the National Highway Traffic Safety Administration and the National Transportation Safety Board arrived in Texas the same day and are conducting their own investigations.
Critics have long decried Tesla’s misleading promotional efforts around Autopilot, which make it seem as if the technology is capable of full autonomy when in fact it’s little more than a fancy driver assistance system. The carmaker has also faced criticism for under-delivering on safety features that would ensure Tesla drivers can’t abuse Autopilot. Cadillac’s Super Cruise, a General Motors technology comparable to Autopilot, uses a camera-based driver monitoring system with eye-tracking to ensure that the driver’s eyes stay on the road. Tesla has no such system. Tesla does have a weight sensor that detects whether the driver’s seat is occupied, but that sensor is used only for starting the vehicle and isn’t linked to Autopilot.
Hack replicated
Musk maintains that Tesla’s primary Autopilot safety feature – a steering wheel that requires continuous hand pressure – is more than adequate to keep drivers and the general public safe. Last week, Jake Fisher, the senior director of auto testing at Consumer Reports, conducted an experiment at the organization’s test track in Connecticut. Fisher started a Tesla Model Y in Autopilot mode while sitting in the driver’s seat. He then stopped the car, slid over to the passenger seat, hung a weighted rope from the steering wheel and used dashboard buttons to increase the vehicle’s speed. “In our evaluation, the system not only failed to make sure the driver was paying attention but it also couldn’t tell if there was a driver there at all,” said Fisher.
Hacks like the weighted rope are easy to find online. If you search YouTube, you’ll discover dozens of references to aftermarket devices like “Autopilot Buddy.” You’ll also find dozens of videos of Tesla drivers recklessly flouting safeguards and pushing their cars beyond what the vehicles can safely navigate on their own. Whether the two men who died in Texas were engaged in such recklessness remains to be seen. Whether Tesla has failed to protect the safety of its drivers and the public is, arguably, more clear-cut.
US Secretary of Transportation Pete Buttigieg called for patience as his department gathers information and works with Tesla and law enforcement. Patience is the wrong thing to call for here. NHTSA is currently investigating 24 crashes across the country that appear to involve Tesla vehicles operating in Autopilot mode. A broad pattern of failures and abuses has already emerged, no matter what happened in that cul-de-sac. That pattern demands stricter safety standards now.