Risks for Insurers in Rush Towards Driverless Tech

Definitions are often important, and never more so than when safety is involved.
This is why there was a call for evidence in August 2020, and why the UK Government confirmed in April that automated lane keeping systems (ALKS) technology “could be legally defined as self-driving, despite insurance industry concerns around safety”, says the Insurance Times. Matthew Avery, director of research at Thatcham Research, explained in the article that “there is still a lot of work needed by both legislators and the automotive industry before any vehicle can be classed as automated and allowed safely on to the UK roads. ALKS as currently proposed by the government are not automated. They are assisted driving systems as they rely on the driver to take back control”.
Clearly defined technological path
He says the Society of Automotive Engineers (SAE) has created a clearly defined technological path to automation, which involves a five-step development of technology and functionality. At Level 1 there is adaptive cruise control, and at Level 5 there is full automation, which permits a vehicle to go almost anywhere without human intervention and without driving controls for a human driver. He explains that at present most vehicles offer Level 2 assisted driving to enable “an engaged driver to take their feet off the pedals and take their hands off the steering wheel for brief periods of time”.
This may involve enabling a vehicle to stay in a lane, maintain a safe distance from the car in front, and momentarily control the vehicle under the driver’s direction. “A popular example of this is Tesla’s Autopilot function, which is not actually automated whatever drivers and marketing spiel may suggest,” he emphasizes.
At Level 4, automation allows the driver to safely disengage. In certain circumstances, such as on motorways, Avery says the driver could hand over control to watch films, read a book, or even go to sleep. “The car is in full control, and able to safely navigate any obstacle or hold-up – including roadworks and inclement weather,” he explains, before adding: “This Level 4 automation meets people’s expectations of automation: e.g. the car can do what a competent and engaged driver can do. The problem with Automated Lane Keeping Systems is that they are Level 3 technology, which is defined as conditional automation.” So, at Level 3, the vehicle can navigate and drive itself without the driver’s oversight in certain situations, but only at restricted speeds and under certain conditions. It therefore has its limitations.
Safety limitations of ALKS
ALKS can’t move a vehicle between lanes, even in the case of an emergency. The driver always needs to be ready to take back control.
He therefore argues: “This fundamentally contradicts our view of what automation should be, in that it doesn’t replace the driver. The safety limitations of ALKS suggest that the car would not be able to respond to a driver emerging from a broken-down vehicle, or automatically change lanes to avoid debris, nor could it find safe harbor in the event of an incapacitated driver. So, the driver of the vehicle will potentially be exposed to danger and could cause a hazard to other drivers, as their vehicle could be left stationary in lane if the driver fails to take back control within 10 seconds.”
Avery believes the safety questions are compounded by the vehicle’s ability to operate only in specific environments. Automated driving systems such as ALKS may not, for example, be able to read specific UK road signage – leading to traffic rule infringements, which could result in a driver being fined.
Insurance liability
He says there are also challenges around insurance liability, particularly with fully autonomous Level 5 vehicles, raising questions about whether the ‘driver’, the manufacturer, or the technology provider is liable when something goes wrong. This will affect decisions about who pays out for what. There will therefore need to be access to vehicle and driving data to establish who was in control at the time of a collision: the system or the driver. Without this data it would be impossible to identify fault and liability.
Speaking from an insurance perspective, the Association of British Insurers (ABI) says it supports the development of autonomous driving technology in its widest sense, claiming that the technology has the potential to greatly improve road safety. However, given recent misunderstandings about Tesla’s Autopilot system, which is not meant to be fully autonomous and has been involved in fatal accidents, the ABI believes it’s vital for drivers to have a “clear understanding and expectations of what the technology can and cannot do, and what actions by them will still be needed”.
A spokesperson for the ABI comments: “While the insurance industry fully supports the development towards more automated vehicles, drivers must not be given unrealistic expectations about a system’s capability. It is vital that ALKS, which rely on the driver to take back control, are not classed as automated, but as assisted systems. By keeping this distinction clear we can help ensure that the rules around ALKS are appropriate and put driver and passenger safety first.”
Driver education
Siddartha Khastgir, head of verification and validation, intelligent vehicles at WMG, University of Warwick in the UK, agrees with the viewpoints expressed by the ABI and Thatcham Research in this article that the public’s perception of automated driving technology is a key barrier to the successful roll-out of ALKS technology. “While I wouldn’t go as far as classifying ALKS as ADAS, I will definitely say that driver education is key to ensuring safe use of ALKS and other automated driving technologies,” he says.
This is why WMG is calling for ‘informed safety’ for the users of automated driving systems, which is about communicating the “true capabilities and limitations of the technology to the driver”. Unfortunately, this has rarely been done to date. He adds that there is a need to prevent any misuse of the technology. Problems arise when users place too much trust in systems that have limited capabilities. So, by making them aware of a system’s limitations, it should be possible to prevent accidents.
He adds: “One of the fundamental things to be communicated to the user for an ALKS is the constrained Operational Design Domain (ODD) of the ALKS. ODDs are fundamental to the safety of ALKS and other automated driving systems.” By creating an understanding of ODDs, drivers or users can fully understand the capabilities of automated systems and use them safely. “For example, an ALKS has its ODD as motorways and should not be used or get activated while on urban roads”, he warns.
Thatcham Research has worked with its EuroNCAP partners to define three clear states to enable the average driver to understand automation. Avery concludes that Level 3 systems are open to too many questions, leading to drivers misunderstanding the capabilities of the systems – citing the incidents involving Tesla cars as an example. He therefore argues that automation needs to be abundantly clear and “robust enough to accommodate the vagaries of real-world traffic such as roadworks, fog, or a broken-down vehicle”. Furthermore, vehicles can only be considered fully automated when they operate using Level 4 systems.
So, for now, he says Thatcham and UK insurers believe there should be no moves beyond today’s assisted driving technology because it offers all the benefits with fewer risks. He argues that Level 3 automation could become the next ‘genetically modified crops’ scandal, causing consumers to reject the benefits because poorly designed or immature systems could lead to vehicles crashing.
He nevertheless concludes that automation will bring significant societal benefits. For them to be realized, he believes there is a need to wait until the systems can cope with all driving scenarios, just as an engaged driver would. With careful attention to defining connected and autonomous vehicle technologies, this can be achieved safely. By clearly defining these technologies, it becomes possible to understand their limitations, and then to inform drivers about them. With a clearer understanding of their capabilities and limitations, there will be fewer accidents and insurance claims.