Should the driver have taken control?
Google prides itself on the safety of its cars. As the company tests them on real roads, safety drivers ride along to take over control of the car if something goes wrong. However, just last month, one of the cars got into an accident, and Google has accepted part of the blame.
According to Google's report, the car was driving along a street when it moved into the right-hand lane to prepare for a turn. However, it detected sandbags near a storm drain and had to stop. As it prepared to re-enter the center lane, it noticed a bus, mistakenly assumed the bus would slow or stop, and collided with it at 2 miles per hour (about 3 kilometers per hour).
“Our test driver, who had been watching the bus in the mirror, also expected the bus to slow or stop. And we can imagine the bus driver assumed we were going to stay put. Unfortunately, all these assumptions led us to the same spot in the lane at the same time. This type of misunderstanding happens between human drivers on the road every day,” Google said in its monthly report for February.
As with most technology, there is always room for improvement. This minor accident will help Google make its cars safer and refine its software:
“We’ve now reviewed this incident (and thousands of variations on it) in our simulator in detail and made refinements to our software. Our cars will more deeply understand that buses and other large vehicles are less likely to yield to us than other types of vehicles, and we hope to handle situations like this more gracefully in the future.”
Luckily, nobody was injured in the accident. Neither vehicle was travelling very quickly (the bus was doing about 15 miles per hour, or 24 kilometers per hour), and the only damage was vehicular: the front left fender, the front left wheel, and one of the driver's-side sensors of Google's SUV.
However, this one accident raises bigger questions. How safe is safe enough for a self-driving car? How many accidents are acceptable? Who is responsible for the car's actions if there isn't a safety driver present? Could a driverless car get a ticket?
USA Today explains just how complicated it can be to determine accurately how safe a self-driving car really is:
“In the U.S., approximately one fatality occurs for every 100 million miles driven. To prove with 95% confidence that a driverless car achieves at least this rate of reliability by driving them around to see, it would require they be driven 275 million miles without a fatality.”
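To see where that 275 million figure comes from (a back-of-the-envelope sketch; the article itself doesn't show the working), treat fatalities as rare, independent events. The chance of driving n miles without a single one is then e^(−λn), where λ is the true fatality rate per mile. Proving with 95% confidence that λ is no worse than the human rate means a completely fatality-free test run must have probability of 5% or less under that rate:

\[
e^{-\lambda n} \le 0.05 \quad\Longrightarrow\quad n \ge \frac{\ln 20}{\lambda} \approx \frac{3}{\lambda}.
\]

At roughly one fatality per 100 million miles, that works out to about 300 million fatality-free miles; the slightly lower 275 million figure corresponds to a rate of about 1.09 fatalities per 100 million miles, the figure used in the RAND analysis that this statistic appears to trace back to.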
Given that there are only about 100 test vehicles available, it would take roughly 12 years to demonstrate that level of safety, even if the cars were driven 24/7.
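The 12-year estimate is straightforward arithmetic, under the assumption (ours, not the article's) that the cars average roughly 25 miles per hour, a plausible figure for the city streets Google tests on:

\[
\frac{275{,}000{,}000\ \text{miles}}{100\ \text{cars} \times 24\ \tfrac{\text{h}}{\text{day}} \times 365\ \tfrac{\text{days}}{\text{yr}} \times 25\ \text{mph}} \approx 12.5\ \text{years}.
\]

A higher average speed would shorten the timetable, but only proportionally.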
Even if self-driving cars aren't perfect, is it worth putting them on the road if they can be shown to be safer than the average human driver?