It is widely accepted that around 95% of road traffic accidents are caused by human drivers. We are fallible, distracted, and potentially dangerous. Our unreliability behind the wheel is not just a problem for other human drivers; it is also something autonomous vehicles must contend with.

IDTechEx analysis of California DMV autonomous collision reports.
Of the 187 reports, only 2 incidents could be attributed to poor performance of the autonomous system. That means a staggering 99% of crashes involving autonomous vehicles were caused by human error. It is also worth noting that of the 83 incidents recorded while the vehicles were operating in autonomous mode, 81 were caused by a human, either a driver in another vehicle or a misbehaving pedestrian.
The results, shown in the graphic, only tell half the story, though. When delving into individual cases, the ineptitude of some drivers becomes glaringly obvious. Here are four example cases where it would simply be impossible for two autonomous vehicles to have the same collision. These cases are summaries of real incidents documented on the California DMV website.
At a traffic light-controlled junction, the autonomous test driver notices that the car in front has its reverse lights on. As a precaution, the test driver backs up 20-30 feet. When the lights turn green, the car in front accelerates in reverse, stopping just 4 feet short of the Pony.ai vehicle. Without changing out of reverse, the driver in front accelerates again and hits the Pony.ai vehicle.
Here a Cruise vehicle operating in autonomous mode was making a left turn at a 4-way junction when a vehicle behind attempted to overtake it and continue straight on. The overtaking vehicle collided with the front left of the autonomous Cruise.
The Cruise vehicle, operating in autonomous mode, was making a right turn at a junction and had right of way. A vehicle approaching from the left failed to stop at a stop sign and collided with the autonomous car.
The autonomous vehicle was being driven by the human test driver in manual mode. The test driver entered a junction under a green light and was hit by another vehicle illegally entering the junction while fleeing from the local police force.
The cases mentioned here are some of the more unusual ones and have been shared to demonstrate how poor human driving can sometimes be. In the vast majority of the 81 cases where a human driver collided with an autonomously driven vehicle, the circumstances were far less interesting. The most common form of collision was simply being rear-ended while in traffic or stopped. These kinds of crashes are most likely caused by human inattention or distraction.
The point remains, though, that in the cases given, and in most others, it is highly unlikely that a collision would have occurred if both vehicles had been autonomous. Autonomous drivers don't mistake forward and reverse gears, they don't take unnecessary risks such as overtaking or running stop signs, they always pay 100% attention, and they won't flee from the police.
Part of the problem could be assumed behavior, where the human driver expects the autonomous vehicle to go and starts moving in anticipation, only to hit the still-stationary autonomous vehicle. It is like being caught out by a learner driver at a roundabout or other junction: there looks to be a sufficient gap, and the driver behind begins to move, expecting that the learner has gone for it, but the overly cautious learner is still there.
Autonomous and connected vehicles have many advantages over human drivers. They have permanent 360° perception, they can communicate their intentions to each other in advance, and they do not make silly operational errors (such as accidentally engaging reverse). As their maturity progresses, it will be hard to deny that they are superior to human drivers, and we will need to take a back seat in the task of driving.