‘Google self-driving car hits bus’ screams the headline. You can just see shudders of delight wracking the defensive minds of Luddites everywhere. They just knew those robotic cars shouldn’t be let onto roads where real men, and yes it’s always men, get to reinforce their sense of self by revving their car and ignoring rules at will.
We’ve got to stop looking for the negative in the idea of computer-controlled cars. Sure, an accident is possible, but just think how many accidents are caused by us human drivers on a minute-by-minute basis. We’re nothing but a collective mass of inattention, mistakes, and stupidity – none of which computers are prone to. Computers don’t get distracted by phone calls, their attention is not captivated by a cute girl walking down the street, they don’t get road rage, they don’t take their bad day at work out on the accelerator. And all that is why a computer is a much safer bet than you or me behind the wheel of a ton of fast-moving metal.
The biggest problem that Google and other autonomous car pioneers are having is not programming a computer to control a moving car, it’s programming a computer to control a car surrounded by human drivers who don’t act rationally or follow the rules. Self-driving cars will really take off when they are the only thing on the road – that’s when the computers can control not only the individual car but the whole flow of traffic. No jamming on the brakes when there’s an accident up ahead, no screaming off from the traffic light, no confusion over who has right of way, beautifully synchronised merging: that’s the future with self-driving cars.
Self-driving cars make a great deal more sense than letting fallible humans drive anything.
But let’s return to the specific instance of the Google car hitting the bus. The important part of the article is not the news that the car hit the bus, it’s why the human backup didn’t take control to avoid the accident: “…our test driver believed the bus was going to slow or stop to allow us to merge into the traffic, and that there would be sufficient space to do that.” In other words, the accident would have happened in the same way if a human had been driving.
And, really, should we be holding a self-driving car to a higher standard than we hold human drivers? The answer is no, we shouldn’t; but the question is beyond moot, because robot cars will intrinsically meet higher standards than any human. And that’s why I trust a robot more than you.