
IIHS study finds autonomous vehicles may prevent only about 1/3 of all crashes if AVs drive like people

Autonomous vehicles might prevent only around a third of all crashes if automated systems drive too much like people, according to a new study from the Insurance Institute for Highway Safety (IIHS).

It’s likely that fully self-driving cars will eventually identify hazards better than people, but we found that this alone would not prevent the bulk of crashes.

—Jessica Cicchino, IIHS vice president for research and a coauthor of the study

Conventional thinking has it that self-driving vehicles could one day make crashes a thing of the past. The reality is not that simple. According to a national survey of police-reported crashes, driver error is the final failure in the chain of events leading to more than 9 out of 10 crashes.

The Institute’s analysis suggests that only about a third of those crashes were the result of mistakes that automated vehicles would be expected to avoid simply because they have more accurate perception than human drivers and aren’t vulnerable to incapacitation. To avoid the other two-thirds, they would need to be specifically programmed to prioritize safety over speed and convenience.

Building self-driving cars that drive as well as people do is a big challenge in itself. But they’d actually need to be better than that to deliver on the promises we’ve all heard.

—IIHS Research Scientist Alexandra Mueller, lead author

To estimate how many crashes might continue to occur if self-driving cars are designed to make the same decisions about risk that humans do, IIHS researchers examined more than 5,000 police-reported crashes from the National Motor Vehicle Crash Causation Survey. Collected by the National Highway Traffic Safety Administration, this sample is representative of crashes across the US in which at least one vehicle was towed away, and emergency medical services were called to the scene.
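Those inclusion criteria are simple enough to express directly. As a minimal sketch, the Python snippet below applies the two criteria to a list of crash records; the field names (towed, ems_dispatched) are hypothetical illustrations, not the actual NMVCCS schema.

```python
# Minimal sketch of the NMVCCS inclusion criteria described above.
# The record fields are hypothetical, not the real survey schema.
from dataclasses import dataclass

@dataclass
class CrashRecord:
    case_id: str
    towed: bool            # at least one vehicle was towed away
    ems_dispatched: bool   # emergency medical services were called

def in_study_sample(record: CrashRecord) -> bool:
    """A crash enters the sample only if both criteria hold."""
    return record.towed and record.ems_dispatched

records = [
    CrashRecord("A-001", towed=True, ems_dispatched=True),
    CrashRecord("A-002", towed=True, ems_dispatched=False),
]
sample = [r for r in records if in_study_sample(r)]
print(len(sample))  # -> 1
```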

The IIHS team reviewed the case files and separated the driver-related factors that contributed to the crashes into five categories (a code sketch of the taxonomy follows this list):

  • “Sensing and perceiving” errors included things like driver distraction, impeded visibility and failing to recognize hazards before it was too late.

  • “Predicting” errors occurred when drivers misjudged a gap in traffic, incorrectly estimated how fast another vehicle was going or made an incorrect assumption about what another road user was going to do.

  • “Planning and deciding” errors included driving too fast or too slow for the road conditions, driving aggressively or leaving too little following distance from the vehicle ahead.

  • “Execution and performance” errors included inadequate or incorrect evasive maneuvers, overcompensation and other mistakes in controlling the vehicle.

  • “Incapacitation” involved impairment due to alcohol or drug use, medical problems or falling asleep at the wheel.

The researchers also determined that some crashes were unavoidable, such as those caused by a vehicle failure like a blowout or broken axle.
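To make the taxonomy concrete, the sketch below encodes the five driver-error categories, plus the unavoidable-crash category, as a Python enum with a few illustrative factor mappings. The category names come from the study; the data structure and the example factors are an illustration, not IIHS code.

```python
# The IIHS error taxonomy expressed as an enum. Category names follow the
# study; the mapping of example factors to categories is illustrative only.
from enum import Enum, auto

class DriverError(Enum):
    SENSING_PERCEIVING = auto()     # distraction, impeded visibility, late hazard recognition
    PREDICTING = auto()             # misjudged gaps, speeds, or others' intentions
    PLANNING_DECIDING = auto()      # unsafe speed for conditions, aggression, tailgating
    EXECUTION_PERFORMANCE = auto()  # poor evasive maneuvers, overcompensation
    INCAPACITATION = auto()         # alcohol or drugs, medical events, falling asleep
    UNAVOIDABLE = auto()            # e.g. vehicle failure such as a blowout or broken axle

# Example: tagging individual contributing factors with a category.
factor_to_category = {
    "driver distraction": DriverError.SENSING_PERCEIVING,
    "misjudged gap in traffic": DriverError.PREDICTING,
    "speeding for conditions": DriverError.PLANNING_DECIDING,
    "overcompensation": DriverError.EXECUTION_PERFORMANCE,
    "fell asleep at the wheel": DriverError.INCAPACITATION,
    "tire blowout": DriverError.UNAVOIDABLE,
}
```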

For the study, the researchers imagined a future in which all the vehicles on the road are self-driving. They assumed these future vehicles would prevent the crashes that were caused exclusively by perception errors or that involved an incapacitated driver, because the cameras and sensors of fully autonomous vehicles could be expected to monitor the roadway and identify potential hazards better than a human driver, and because such vehicles cannot be distracted or incapacitated.

Crashes caused exclusively by sensing and perceiving errors accounted for 24% of the total, and crashes involving incapacitation accounted for another 10%. Those crashes might be avoided if all vehicles on the road were self-driving, although that would require sensors that worked perfectly and systems that never malfunctioned. The remaining two-thirds might still occur unless autonomous vehicles are also specifically programmed to avoid other types of predicting, decision-making and performance errors.
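The arithmetic behind the headline figure is short, as the calculation below shows. It assumes, per the study's scenario, that a fully self-driving fleet eliminates exactly those two error classes and no others.

```python
# Back-of-envelope arithmetic for the "about a third" figure.
# Shares are the study percentages quoted above.
sensing_perceiving_only = 0.24  # crashes caused exclusively by perception errors
incapacitation = 0.10           # crashes involving an incapacitated driver

avoided = sensing_perceiving_only + incapacitation
remaining = 1.0 - avoided
print(f"avoided: {avoided:.0%}, remaining: {remaining:.0%}")
# avoided: 34%, remaining: 66%
```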

In the crash of an Uber test vehicle that killed a pedestrian in Tempe, Arizona, in March 2018, the automated driving system initially struggled to correctly identify 49-year-old Elaine Herzberg on the side of the road. Once it did, it still was not able to predict that she would cross in front of the vehicle, and it failed to execute the correct evasive maneuver to avoid striking her when she did so.

Planning and deciding errors, such as speeding and illegal maneuvers, were contributing factors in about 40% of crashes in the study sample. The fact that deliberate decisions made by drivers can lead to crashes indicates that rider preferences might sometimes conflict with the safety priorities of autonomous vehicles. For self-driving vehicles to live up to their promise of eliminating most crashes, they will have to be designed to focus on safety rather than rider preference when those two are at odds.

Self-driving vehicles will need not only to obey traffic laws but also to adapt to road conditions and implement driving strategies that account for uncertainty about what other road users will do, such as driving more slowly than a human driver would in areas with high pedestrian traffic or in low-visibility conditions.
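As a toy illustration of what that kind of defensive programming could look like, the sketch below caps a planned speed when pedestrian density is high or visibility is low. The thresholds and speed caps are invented for illustration and are not drawn from the study.

```python
# Toy speed-selection policy that trades speed for safety margin.
# All thresholds and caps are invented for illustration.
def planned_speed_kph(speed_limit_kph: float,
                      pedestrian_density: float,  # pedestrians per 100 m (hypothetical metric)
                      visibility_m: float) -> float:
    speed = speed_limit_kph
    if pedestrian_density > 2.0:   # busy sidewalks or crossings nearby
        speed = min(speed, 30.0)   # slow well below the limit near foot traffic
    if visibility_m < 100.0:       # fog, heavy rain, or darkness
        speed = min(speed, 40.0)   # leave more time to react to unseen hazards
    return speed

print(planned_speed_kph(50.0, pedestrian_density=3.5, visibility_m=250.0))  # 30.0
print(planned_speed_kph(50.0, pedestrian_density=0.5, visibility_m=60.0))   # 40.0
```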

Our analysis shows that it will be crucial for designers to prioritize safety over rider preferences if autonomous vehicles are to live up to their promise to be safer than human drivers.

—Alexandra Mueller

Comments

AndrewsRobert

This may be a self-serving piece for the insurance industry. Autonomous vehicles are already better at virtually all of the categories above. The issue today is control.

As long as we are in control, we limit the success of autonomous vehicles. Once drivers can give control over to autonomous vehicles, virtually all of the categories above are handled better by autonomous vehicles.

Because we are still in control, we force autonomous vehicles to push the safety boundary to the level of risk we accept. Once we are truly passengers, we will be happy to let the vehicles set the pace.

If it is legal for me to check my social media, watch a movie, or play a game on the way, I will not even care what speed we are going.

Tesla is already seeing crash rates drop today. This will improve dramatically once autonomous vehicles take over.

SJC_1

I don't want AI to drive like a person; the "I" stands for intelligence.

dursun

"if AVs drive like people", who's fault would that be.
