Google's self-driving cars, previously thought to be better than human drivers, seem to be just as capable of causing an accident as the rest of us.
In February one of Google's self-driving cars was involved in a collision with a bus in Mountain View, California, according to Re/code. The autonomous vehicle, a Lexus SUV, was sitting at an intersection waiting to make a right-hand turn. Some sandbags were blocking the lane, so the vehicle moved left to avoid the blockage and, in doing so, collided with a city-owned bus.
Google's cars have been involved in 17 accidents since the project launched in 2009, and up until now, the company has maintained that in each of these instances its cars have not been the source of the crash.
So how did it happen? In a statement reported by Re/code, the company explained how its intelligent car failed:
Our test driver, who had been watching the bus in the mirror, also expected the bus to slow or stop. And we can imagine the bus driver assumed we were going to stay put. Unfortunately, all these assumptions led us to the same spot in the lane at the same time. This type of misunderstanding happens between human drivers on the road every day.
Some perspective: Nearly 30,000 fatal car crashes occurred in the U.S. in 2014, according to the National Highway Traffic Safety Administration, and more than 2.3 million people were injured in car crashes.
Google tends to defend its technology at all costs, noting in its statement that "this type of misunderstanding happens between human drivers on the road every day." But the point of this new technology is that it should be better than human drivers, not just as good. That becomes an especially important consideration as self-driving cars gain more of a presence on roads.
And that will likely be soon. In February, the National Highway Traffic Safety Administration outlined how a self-driving car could be considered a "driver" under the agency's rules. Though there's no official legislation on self-driving cars yet, the NHTSA is reviewing current laws to ensure there is a path to putting autonomous vehicles on the road.