In 1967, Philippa Foot posed the "Trolley Problem," an ethical conundrum about whether it is permissible to divert a runaway trolley so that it kills one person instead of the five in its path; as self-driving cars have inched toward reality, the problem has been repurposed as a misleading, chin-stroking question about autonomous vehicles: when faced with the choice of killing their owners or someone else, who should die?
Mercedes has just answered the question for their own cars: the driver will be saved in every instance, and the bystanders will always be sacrificed.
Predictably, this has gone viral, because, as Jason Kottke writes, "In other words, their driverless cars will act very much like the stereotypical entitled European luxury car driver." Mercedes, meanwhile, offers a pretty plausible explanation for the choice: "If you know you can save at least one person, at least save that one. Save the one in the car."
But as I said, this is an incredibly misleading framing of the problem. Imagine a car that you own, that is your property, and that is designed to periodically murder you: that is its normal, functional outcome, the way it is supposed to behave.
I think it's a fair assumption that a large number of people who owned that product would reconfigure it so that it never deliberately murdered them.
The only way to design a car that murders its owner regardless of the owner's objections is to design it so that remote parties can override the person operating it: the manufacturer, the police, and any carjacker or terrorist who can successfully impersonate the manufacturer or the police to the car. That is a terrible idea.
If we're going to convince people to sacrifice themselves for the people around them, we need to do so through moral suasion, not by designing murdercars that come with factory-installed, I-can't-let-you-do-that-Dave DRM.
I wrote a column about this for the Guardian last year, and I've just finished a short story commission on the same theme that'll be coming out shortly.
For his part, Mercedes's Christoph von Hugo thinks the ethical problems will be outweighed by the fact that the cars will be better drivers overall. "There are situations that today's driver can't handle, that . . . we can't prevent today and automated vehicles can't prevent, either. The self-driving car will just be far better than the average human driver," he told Car and Driver.
He also points out that sacrificing the occupants might not help anyway: the car could end up hitting the crowd of school kids regardless. "You could sacrifice the car. You could, but then the people you've saved initially, you don't know what happens to them after that in situations that are often very complex, so you save the ones you know you can save."
Self-Driving Mercedes Will Be Programmed To Sacrifice Pedestrians To Save The Driver [Charlie Sorrel/Fast Coexist]
(via Kottke)