Are self-driving cars really safer? In the eight months that Google’s 50 self-driving cars have been on the road, there have been four accidents, the AP reports. But according to sources quoted in the story, half of those accidents were caused by human error, not by the cars’ robotics at all.
Chris Urmson, the director of Google’s self-driving car program, fired back against the report in a Medium post on Monday, saying that in the six years since the start of the project in California and Nevada, there have been only “11 minor accidents (light damage, no injuries).” Urmson maintains that “not once was the self-driving car the cause of the accident.” Still, most details remain unclear because accident reports are protected under California state law.
According to Urmson’s post, the self-driving cars have been rear-ended by other drivers seven times and hit at least once by another vehicle rolling through a stop sign.
The human errors do back up some controversial comments Tesla CEO Elon Musk made in March. Speaking at a conference, Musk told the audience that human-driven cars will eventually become illegal.
“It’s too dangerous,” Musk said. “You can’t have a person driving a two-ton death machine.”
Jalopnik, Gawker’s auto-focused blog, simultaneously applauded and chided Google’s attempt to answer the questions the AP report raised. The site’s Damon Lavrinc called the post “a solid step” that cleared up some confusion but argued that more transparency is necessary if the company truly wants to push self-driving cars into the mainstream.
Photo via Bradley P Johnson/Flickr (CC BY 2.0)