What the Autonomous Industry Should Learn from Boeing

Boeing’s 737 Max disasters should be a wake-up call for the automotive industry.

As we move from advanced driver-assistance systems toward fully autonomous ones, there are troubling analogies with commercial aviation: complicated software, a lack of communication and training for end users, and people reacting like… people.

Everything worked as designed in Boeing’s new MCAS (Maneuvering Characteristics Augmentation System), but that didn’t prevent the crashes. “It was a complex combination of aerodynamics, software, hardware, human factor, Boeing’s marketing strategy and pilot training process that led to these unfortunate events,” says Victor Haydin, automotive practice lead for Intellias, a custom software engineering company.

One problem was that, while Boeing added an alert for pilots if one of the sensors wasn’t working properly, that alert was mistakenly linked to a different, optional feature that not all airlines had implemented. Some airlines said they hadn’t been informed of the optional feature; Boeing disputes this. This shows two things: how difficult it is to get complicated software completely right and, more importantly, how difficult it can be for automakers to communicate with customers about features.

Another problem was that the pilots, in a confusing, potentially deadly situation, did not take the prescribed step of turning off MCAS. The Boeing crashes show how difficult it is for even highly trained, attentive pilots to make split-second assessments of software or sensor errors and decide to override automated systems. Evidence is mounting that it’s a bad idea to expect a driver to switch from autopilot to full control in an emergency.

Even though automakers do extensive training and testing of semi-autonomous and autonomous systems, it may not be possible to test for and program every possibility in the real world. Haydin says, “There may be a lot of edge cases. The developers who implemented some automation feature might have an assumption the driver might act in a certain way but the driver may react differently. That’s what happened to Boeing. There was clear instruction that in this situation, you have to turn this functionality off but the pilots did not recognize the situation.”

OTA and notification

Some airlines and pilots didn’t get all the information they needed about the MCAS software change. Over-the-air (OTA) updates are becoming the preferred way for carmakers to fix bugs and improve the performance of automotive systems, yet there are no standards for how they should communicate such changes to drivers.

Tesla regularly updates software in its customers’ cars this way, providing a notification on the car’s screen. In April, it rolled out a feature to let the car automatically change lanes without driver confirmation. To make this work, drivers must customize the navigation settings, choosing among three options.

General Motors is following suit. Cadillac executives told Automobile that the company now has the ability to update every major module in Cadillacs over the air. GM declined to comment for this story.

When a safety feature is updated, it will be subject to the same rules and warnings as the original semi-autonomous driving system, because it would be part of that system, says Todd Benoff, a partner in the law firm Alston & Bird who focuses on liability and cyber-security issues unique to autonomous vehicles. “Manufacturers will want to explain what the new driving feature is, and how it works, in order to avoid surprising the driver and causing him to take over when he should not, or, even worse, to overreact by counter-steering into another lane or oncoming traffic.”

Advanced driver training for ADAS?

Boeing has now revamped its pilot training to give pilots “an enhanced understanding of the 737 MAX Speed Trim System, including the MCAS function, associated existing crew procedures and related software changes.”

What about drivers? They get little to no training on the ADAS features of their cars when they drive them off the lot, let alone after they’ve gotten used to the vehicle. In fact, in the United States, most drivers get no training beyond what they need to obtain a driver’s license. We’ve seen Tesla drivers overestimate Autopilot’s capabilities time after time.

CAT Driver Training in the UK launched a program to train human backup drivers for level 4 AVs but this is for professional drivers testing autonomous systems. While some driving aficionados might welcome this kind of course, there’s no way this could become a mandate for drivers of personal vehicles.

Requiring a new-car buyer to take a class would turn these advanced safety features from a luxury into an obligation that few would take on. But when a new feature takes over, it can be alarming. Haydin says, “When car starts to beep suddenly and the wheel shakes, I’m not sure people will clearly understand the situation.”

Soft rollouts

No software or automotive system will ever be perfect. Most will be better than a human driver. Still, accidents are inevitable, even with self-driving cars.

Boeing took a huge hit and doesn’t seem to dispute it’s liable. In the case of a semi-autonomous or autonomous private vehicle, liability will be more complicated, Benoff says. “Liability will still turn on the facts of each case; there are just new causes that must be considered. For example, did the new ADAS feature perform the way it was supposed to? Did the driver of the updated car do something wrong? What about the driver of the other car or cars?”

Haydin urges automakers to do gradual roll-outs of new features over the air—after they’ve been extensively tested. He suggests some car owners would agree to be early adopters of the new software.
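The gradual roll-out Haydin describes could be implemented by assigning each car a stable rollout bucket and shipping first to owners who opted in as early adopters. The sketch below is hypothetical — the function name, the bucket scheme, and the opt-in flag are illustrative assumptions, not any automaker’s actual OTA mechanism.

```python
import hashlib

def in_rollout_cohort(vin: str, feature: str, rollout_pct: int,
                      early_adopter: bool) -> bool:
    """Hypothetical sketch: decide whether a vehicle receives an OTA feature.

    Hashing the VIN and feature name yields a stable bucket from 0 to 99,
    so widening rollout_pct only ever adds vehicles to the cohort —
    earlier recipients are never dropped.
    """
    if early_adopter:
        return True  # owners who agreed to be early adopters get it first
    digest = hashlib.sha256(f"{feature}:{vin}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # stable per vehicle/feature pair
    return bucket < rollout_pct
```

A staged release would then raise `rollout_pct` in steps (say, 1%, 10%, 50%, 100%) only after telemetry from the earlier cohorts shows the feature behaving as tested.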

Finally, Benoff poses a conundrum for automakers: “What if the OEM does not push out the new feature and an accident occurs? The manufacturer could face liability if the new feature would have prevented the accident, or even just reduced the severity of it. So, it’s easy to see the manufacturer getting sued if it does use OTA updates to add a new semi-autonomous feature to a car, and just as easy to see it getting sued if it does not.”


3 comments

  1. Jim Hayes, President, The Fiber Optic Association Inc. 14th June 2019 @ 8:08 pm

    If we are going to have AVs, they will have to be Level 5s and, this is important, be able to avoid the other human drivers. Remember that it will take 25-50 years from the introduction of real AVs to replace even half the cars with human drivers. (The average age of a car in CA is 13 years!)

  2. Alan Mitchell 17th June 2019 @ 8:08 pm

    Another lesson is about redundancy and redundancy management for safety-critical sensors in the feedback loop. Boeing’s original AoA sensor feedback to the MCAS was effectively neither fail-operational nor fail-safe. Their proposed fix is fail-safe but not fail-operational, if MCAS operability is considered to be required (and if it isn’t, why have it?). I believe fail-operational should be required, but that would take three AoA sensors or a sophisticated virtual AoA sensor.

  3. Massimo Plavsic 18th June 2019 @ 7:48 am

    Just let me underline that I raised exactly this kind of question immediately after the Boeing 737 crashes. What I called for was an FMEA approach, deep analysis and, moreover, a robust system that would not allow anything similar to happen.
