UPDATE: Google has announced, via Backchannel, that its self-driving cars have been in 11 accidents in the last 6 years, none of which were the fault of the self-driving cars.
Self-driving cars are currently being tested in the US, where four of the 48 on the road in California have been in accidents since September. The media is reporting that companies like Google could be sharing more details about these accidents.
According to sources of The Associated Press, two accidents happened while the cars were in control and two while a human was at the wheel. The two self-driving accidents occurred at less than 10mph, says the anonymous source. Three of the four cars belonged to Google and one to Delphi, but both companies have stated that their cars weren't at fault when the incidents happened.
The issue isn't so much that the automated cars are having teething problems; that's to be expected. It's the lack of detail shared publicly that might be the cause for concern.
Legally, the California Department of Motor Vehicles doesn't need to share crash details, as they're confidential. Google and Delphi have chosen not to share either, although they are legally required to report accidents to the government. But should they share more with the public? If we're being asked to put our faith in autonomous vehicles, should we be told more about what's happening to them?
Like any new product, there will be issues. As long as they're fixed and the final car works perfectly, do we need to know about all the bumps along the way? Drawing attention to every problem may just scare people away from something that's already going to require a leap of faith. Also, what constitutes an accident that needs to be reported? Perhaps there have been many smaller accidents that the Department of Motor Vehicles wasn't made aware of.
So far Google's self-driving cars have covered the equivalent of 15 years of driving, amounting to about 140,000 miles. Two autonomous accidents at sub-10mph isn't so bad, but it only takes one mistake to kill a person. Removing the human from the equation raises big questions about liability in the event of an accident.
Self-driving cars are coming. Many claim they will make roads safer, as machines are more aware, process information faster and can see accidents coming. They also remove human error: the temptation to look at a phone while driving, or a lapse in concentration. But this needs to be balanced with infallible reliability: if the computer driving your car crashes, then you do too.