Monday 4th January 2016
The self-driving car, that cutting-edge creation that’s supposed to lead to a world without accidents, is achieving the exact opposite right now: The vehicles have racked up a crash rate double that of cars with human drivers.
The glitch? They obey the law all the time, without exception. This may sound like the right way to program a robot to drive a car, but good luck trying to merge onto a chaotic, jam-packed highway with traffic flying along well above the speed limit.
As the accidents have piled up – all minor scrape-ups for now – the arguments among programmers at places like Google and Carnegie Mellon University are heating up: Should they teach the cars how to commit infractions from time to time to stay out of trouble?