Level 3 Handover Still an Unsolved Challenge

The road to fully autonomous cars is littered with obstacles. The unleashing of Level 4 robo-taxis in San Francisco caused enough chaos that California regulators asked General Motors to cut its Cruise fleet in the city by half.

Although Mercedes has launched its Level 3 cars in parts of California, Nevada and Germany, there are at least as many doubters of the technology as there are such vehicles on the road. This has made Level 3 autonomy a more dubious stage on the road to full autonomy than its presumed successor, Level 4.

The essential difference between the two systems is that Level 3 still depends on the driver, while Level 4 leaves no role for a human being, unless it’s the technician called out when the vehicle malfunctions. The problem with Level 3 is, in fact, the human, and specifically the so-called handover, or handoff: when the car encounters a situation the technology cannot handle, it alerts the driver to take back control within 10 seconds.
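
In regulatory terms this is a small supervisory state machine: drive, request a takeover, then fall back if the driver never responds. The sketch below is a minimal illustration of that loop under simplified assumptions; the state names, inputs, the 10-second budget and the fallback behavior are illustrative stand-ins, not any manufacturer’s actual implementation.

    from enum import Enum, auto

    class Mode(Enum):
        AUTOMATED = auto()           # the system drives
        TAKEOVER_REQUESTED = auto()  # driver alerted, countdown running
        HUMAN_DRIVING = auto()       # driver has taken back control
        MINIMAL_RISK = auto()        # no response in time; fallback maneuver

    TAKEOVER_BUDGET_S = 10.0  # the takeover window cited above

    def step(mode: Mode, situation_ok: bool, driver_hands_on: bool,
             elapsed_since_request_s: float) -> Mode:
        """One tick of a toy supervisory loop."""
        if mode is Mode.AUTOMATED and not situation_ok:
            return Mode.TAKEOVER_REQUESTED           # alert the driver
        if mode is Mode.TAKEOVER_REQUESTED:
            if driver_hands_on:
                return Mode.HUMAN_DRIVING            # successful handover
            if elapsed_since_request_s >= TAKEOVER_BUDGET_S:
                return Mode.MINIMAL_RISK             # e.g., brake to a stop
        return mode

The trouble, as the researchers quoted below explain, lies in the transition from TAKEOVER_REQUESTED to HUMAN_DRIVING.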

In a paper she provided, Missy Cummings, professor at George Mason University and director of Mason’s Autonomy and Robotics Center, describes several situations in which drivers are asked to take control within seconds:

  • Camera vision systems lose the ability to localize due to problems such as missing or faded white lane lines, moisture and/or precipitation in the air, low sun angles and resulting shadows.
  • Missed obstacle detection due to limited sensor capability, such as the ‘sudden reveal’ that occurs when one car is following another and the lead car suddenly changes lanes, exposing an obstacle in the road ahead. This failure mode has been implicated in several Tesla crashes, both fatal and non-fatal.
  • Erroneous obstacle detection because of the inherent imperfection of sensors, for example, when a LiDAR sensor mistakes a harmless object, such as a plastic bag floating in the air, for an obstacle and the car suddenly engages the brakes. Following cars driven by humans must then respond, potentially causing an accordion effect in high-density traffic (a toy sketch of this trade-off follows the list).
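
That last failure mode is, at bottom, a false-positive problem, and it exposes an awkward tuning trade-off. The toy sketch below is invented purely for illustration, with made-up frame counts and thresholds rather than anything from a production stack: a naive controller brakes on any single-frame detection, while a persistence filter ignores a flickering phantom at the cost of reacting a few frames later to a genuine obstacle.

    from collections import deque

    class PersistenceFilter:
        """Treat an obstacle as real only if seen in k of the last n frames."""
        def __init__(self, n_frames: int = 5, k_required: int = 4):
            self.history = deque(maxlen=n_frames)
            self.k_required = k_required

        def update(self, detected_this_frame: bool) -> bool:
            self.history.append(detected_this_frame)
            return sum(self.history) >= self.k_required

    # A plastic bag flickers through the sensor's view for two frames, then vanishes.
    frames = [False, True, True, False, False, False]

    naive_brakes = list(frames)                         # brakes on any detection
    filt = PersistenceFilter()
    filtered_brakes = [filt.update(d) for d in frames]

    print("naive:   ", naive_brakes)      # brakes twice on the phantom
    print("filtered:", filtered_brakes)   # ignores it -- but would also be
                                          # several frames slower on a real obstacle

The same threshold that suppresses phantom braking delays the response to the ‘sudden reveal’ above, which is why neither extreme is safe on its own.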

According to Bryn Balcombe, autonomy systems and regulation expert at Oxa, the basic flaw at the heart of Level 3 is the driver. “If you ask a human to become a monitoring system, notoriously we’re not very good at doing that,” he explained. “That is because we lose engagement quite quickly; it’s quite boring. You can’t ask a human to monitor a robot, because they’ll look at the robot and go, after about 45 seconds, according to research, ‘Hey, the road looks pretty good. I don’t need to observe anymore’.”

If that’s true of short stretches, it applies all the more to monitoring a machine for longer periods. Waymo famously dropped its work on Level 3 because its drivers fell asleep. “What we found was pretty scary,” company CEO John Krafcik said in 2017. “It’s hard to take over because they lost contextual awareness.” That included testers sleeping, staring for long periods at their cell phones and even putting on makeup.

According to the MIT Science Policy Review, this weakness is borne out by a number of studies showing that humans, when put in this situation, are unpredictable at best. Their responses to handover requests are influenced by a number of factors, including age and mood. In addition, traffic density and other environmental complexity affect both how long drivers take to retake the wheel and the intervention strategy they choose once they have. Engaging in non-driving activities while the machine is ‘at the wheel’ further reduces vigilance and lengthens takeover time.

Balcombe noted that the handover itself can create unsafe situations on a freeway. He posited a scenario in which a traffic jam has just cleared and the traffic is accelerating. “But the autonomous car doesn’t accelerate and it stays at 60kph for up to 10 seconds, and you’re behind, looking at that car and saying, ‘Why are you doing 60 for 10 seconds when the road ahead is clear?’ If the driver hasn’t taken control, the vehicle will execute a minimal risk maneuver, which is to slam on the brakes and hand responsibility back to the human. If you’re in the car behind, that is not a behavior you have experienced on a motorway. I’ve never seen any research or any physical testing to show that it is safe to do that.”
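
Rough numbers make the concern concrete. Every value in the sketch below is assumed for illustration rather than drawn from any testing: a following driver accelerating back to 90kph closes on a Level 3 car that holds 60kph for its full 10-second budget and then brakes hard.

    KPH_TO_MS = 1000 / 3600

    follower_speed = 90 * KPH_TO_MS   # ~25.0 m/s, traffic resuming freeway pace
    l3_speed       = 60 * KPH_TO_MS   # ~16.7 m/s, car still waiting on its driver
    hold_time      = 10.0             # seconds before the minimal risk maneuver

    closing_speed = follower_speed - l3_speed
    gap_lost = closing_speed * hold_time
    print(f"gap shrinks by ~{gap_lost:.0f} m during the hold")   # ~83 m

    decel = 6.0  # m/s^2, an assumed hard-braking rate
    stop_time = l3_speed / decel
    stop_dist = l3_speed ** 2 / (2 * decel)
    print(f"then stops in ~{stop_time:.1f} s over ~{stop_dist:.0f} m")  # ~2.8 s, ~23 m

Under those assumptions the follower loses roughly 83 meters of gap before the lead car abruptly decelerates to a standstill in under three seconds, exactly the unfamiliar motorway behavior Balcombe describes.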

Cummings is definitely bearish on Level 3 autonomy. “It is my opinion that Level 3 is only suitable for speeds under 25mph (40kph), so it is OK for a traffic jam pilot but not at highway speeds,” she said via email. She is skeptical about autonomy in general. “I also think Level 4 is only suited for narrow applications and Level 5 is a pipe dream.” Asked about a reasonable timeline for Level 5 autonomy, she replied: “Real Level 5 that actually works everywhere all the time? Not in my lifetime.”

