The accepted wisdom is that once self-driving technology is perfected for cars, there won’t be any more crashes.
That’s not necessarily the case, according to a new study from the Insurance Institute for Highway Safety (IIHS). The IIHS is an independent organisation in the US focused on research to reduce the damage caused by motor vehicle accidents; it’s a leading provider of crash-test ratings for the American market.
While a national survey of police-reported crashes shows that driver error is ultimately responsible for nine out of 10 crashes, IIHS research suggests that autonomous technology would have prevented only about a third of them.
Self-driving vehicles benefit from more accurate perception than humans and won’t freeze up in extreme situations – but to eliminate the risk of crashes entirely they would have to be programmed to always prioritise safety over speed and convenience.
“It’s likely that fully self-driving cars will eventually identify hazards better than people, but we found that this alone would not prevent the bulk of crashes,” says Jessica Cicchino, IIHS vice president for research and a co-author of the study.
“Building self-driving cars that drive as well as people is a big challenge in itself,” says IIHS research scientist Alexandra Mueller, lead author of the study. “But they’d actually need to be better than that to deliver on the promises we’ve all heard.”
To estimate how many crashes might continue to occur if self-driving cars are designed to make the same decisions about risk that humans do, IIHS researchers examined more than 5000 police-reported crashes from the National Motor Vehicle Crash Causation Survey.
This sample is representative of crashes across the US in which at least one vehicle was towed away, and emergency medical services were called to the scene.
The IIHS reviewed the case files and sorted the driver-related factors that contributed to the crashes into five categories: “sensing and perceiving” errors such as driver distraction; “predicting” errors, where drivers misjudge what others are doing; “planning and deciding” errors such as inappropriate speed or driving too aggressively; “execution and performance” errors, or mistakes in controlling the vehicle; and “incapacitation”, covering impairment due to alcohol or drug use or medical problems.
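As a rough illustration of how a taxonomy like this might be represented in an analysis, the sketch below encodes the five categories and tags a few invented case records. The category names follow the study, but the data structure, case IDs and example tags are hypothetical, not the IIHS’s own analysis code.

```python
from enum import Enum, auto

class DriverError(Enum):
    """The five driver-related factor categories used in the study."""
    SENSING_PERCEIVING = auto()     # e.g. driver distraction, impeded visibility
    PREDICTING = auto()             # misjudging what other road users will do
    PLANNING_DECIDING = auto()      # e.g. inappropriate speed, aggressive driving
    EXECUTION_PERFORMANCE = auto()  # mistakes in controlling the vehicle
    INCAPACITATION = auto()         # impairment from alcohol, drugs or a medical problem

# Each case file is tagged with whichever factors contributed; a crash caused
# purely by a vehicle failure (a blowout, a broken axle) carries no tags at all.
crash_factors = {
    "case_001": {DriverError.SENSING_PERCEIVING},
    "case_002": {DriverError.PLANNING_DECIDING, DriverError.EXECUTION_PERFORMANCE},
    "case_003": set(),  # vehicle failure only
}
```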
The researchers also determined that some crashes were unavoidable, such as those caused by a vehicle failure like a blowout or broken axle.
For the study, the researchers imagined a future in which all the vehicles on the road are self-driving. They assumed these future vehicles would prevent those crashes that were caused exclusively by perception errors or involved an incapacitated driver.
That’s because the cameras and sensors of fully autonomous vehicles could be expected to monitor the roadway and identify potential hazards better than a human driver can, and because such vehicles would be incapable of distraction or incapacitation.
Crashes due solely to sensing and perceiving errors accounted for 24 per cent of the total, and incapacitation accounted for a further 10 per cent. Together, that is roughly a third of the sample: crashes that might be avoided if all vehicles on the road were self-driving, though only with sensors that work perfectly and systems that never malfunction.
The remaining two-thirds might still occur unless autonomous vehicles are also specifically programmed to avoid the other types of errors: predicting, planning and deciding, and execution and performance.
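The counting logic behind those figures can be sketched in a few lines: under the study’s assumption, a crash is treated as prevented only when its sole driver-related factors are sensing and perceiving errors or incapacitation. The records below are invented for illustration and are not drawn from the IIHS case files.

```python
# Hypothetical illustration of the study's assumption, not the IIHS's analysis code.
PREVENTED_BY_AV = {"sensing_perceiving", "incapacitation"}

# Each crash is represented by the set of driver-related factors in its case file.
crashes = [
    {"sensing_perceiving"},                          # assumed prevented
    {"incapacitation"},                              # assumed prevented
    {"planning_deciding", "execution_performance"},  # still occurs
    {"sensing_perceiving", "predicting"},            # still occurs: mixed factors
]

# A crash counts as prevented only if it has factors and all of them fall
# inside the "perception or incapacitation" set.
prevented = sum(1 for factors in crashes if factors and factors <= PREVENTED_BY_AV)
print(f"{prevented} of {len(crashes)} crashes in this toy sample assumed prevented")
```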
In the crash of an Uber test vehicle that killed a pedestrian in Tempe, Arizona, in March 2018, the automated driving system initially struggled to correctly identify 49-year-old Elaine Herzberg on the side of the road. Once it did, it was still unable to predict that she would cross in front of the vehicle, and it failed to execute the correct evasive manoeuvre to avoid striking her.
Planning and deciding errors, such as speeding and illegal manoeuvres, were contributing factors in about 40 per cent of crashes in the study sample. The fact that deliberate decisions made by drivers can lead to crashes indicates that rider preferences might sometimes conflict with the safety priorities of autonomous vehicles.
For self-driving vehicles to live up to their promise of eliminating most crashes, they will have to be designed to focus on safety rather than rider preference when those two are at odds.
Self-driving vehicles will need not only to obey traffic laws but also to adapt to road conditions and implement driving strategies that account for uncertainty about what other road users will do, such as driving more slowly than a human driver would in areas with high pedestrian traffic or in low-visibility conditions.
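One way to picture such a strategy is as a speed policy that never exceeds the posted limit and backs off further in heavy pedestrian traffic or poor visibility. The function, thresholds and scaling factors below are invented for illustration and are not drawn from any real autonomous-driving system.

```python
def target_speed_kmh(speed_limit: float, pedestrians_nearby: int, visibility_m: float) -> float:
    """Toy safety-first speed policy (all thresholds and factors are illustrative).

    Never exceed the posted limit, and slow down further when pedestrians are
    present or visibility is poor, even if a human driver would not.
    """
    speed = float(speed_limit)
    if pedestrians_nearby > 5:   # busy footpaths or a crossing ahead
        speed *= 0.6
    if visibility_m < 100:       # fog, heavy rain or night-time glare
        speed *= 0.7
    return speed

# A human driver might hold 50 km/h here; the safety-first policy drops to 21 km/h.
print(target_speed_kmh(speed_limit=50, pedestrians_nearby=8, visibility_m=60))
```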