Before there are fully automated self-driving cars, some decisions need to be made. Not technical ones, but ethical ones.
When there is no chance to avoid a collision and there are only two options for which oncoming car to hit, who will code the decision my car makes?
- Should my car hit the big, expensive SUV because its occupants have the best chance of surviving? That rewards people with money.
- Should my car hit the one driven by the old lady instead of the one with a baby in the back seat, because the lady is already old? That is selection by gender, race, age, and so on.
- Should my car hit the flimsy old junk car because that increases my own chances of surviving? Why penalise poor people?
- Should my car hit the car with only one person in it instead of the car with five people in it? This penalises car pools.
- Should my car hit a concrete pole instead of the oncoming car with a family inside? I would never buy a car that is programmed to give up my life to save four other lives.
Now who is gonna make those decisions? Programmers? The Ford CEO? A political commission? Lawyers? And based on what?
I agree wholeheartedly with what you're questioning about the values someone else would place on our lives, and it reminded me of a discussion I had with my uncle...
"I don't like flying in airplanes, because I'm not in control like I am when I'm driving," I said.
"You really think you're in control when you're in a car?" he responded.
"Yes. I can pull over, I can dodge things, I can avoid accidents. In an airplane, I'm totally at the mercy of the pilot," I said, confidently rebutting him.
"That is true," he responded, "but what about all of the other drivers? Do you have any control over their fight with their spouse, or how much they ate, drank, or slept in the last 24 hours? There are at least 10,000 people you share the road with on the way to work. Do you have control over them too?"
His wisdom made me rethink my position, and now flying doesn't bother me so much.
And I stay out of the "wolf packs" on the road. I'm that guy who drives in the gaps between them.
As for your points, they are all fair (though the carpool one may need to be reversed to "penalising single-driver cars"; heck, that lone driver could be the last one heading home after dropping off the whole car pool...), and there is one caveat to all of them:
It's a risk management situation. If all of those points are true and it is an emergency case where the car's software must make a choice, then it has to. However, if the chance of getting into an accident with a random outcome (as we have now) is 1.12 per 100,000,000 miles driven (as cited by others in this discussion), and with self-driving cars it drops to 0.112 per 100,000,000 miles (a 10x reduction in risk), then these cars are "safer", and the social issues can be mitigated by the reduced overall risk.
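To make the scale of that tradeoff concrete, here is a minimal back-of-the-envelope sketch in Python. The two rates are the ones cited in this discussion; the fleet mileage is an assumption of mine, chosen only for illustration (roughly the annual vehicle miles traveled in the US).

```python
# Back-of-the-envelope comparison of the two rates cited above.
# The rates (1.12 vs. 0.112 incidents per 100 million miles) come from this
# discussion; the fleet mileage below is an assumed, illustrative figure.

HUMAN_RATE = 1.12 / 100_000_000          # incidents per mile, human drivers
SELF_DRIVING_RATE = 0.112 / 100_000_000  # incidents per mile, hypothetical 10x improvement


def expected_incidents(rate_per_mile: float, miles: float) -> float:
    """Expected number of incidents over a given mileage at a constant rate."""
    return rate_per_mile * miles


if __name__ == "__main__":
    fleet_miles = 3_000_000_000_000  # ~annual US vehicle miles traveled (assumption)

    human = expected_incidents(HUMAN_RATE, fleet_miles)
    automated = expected_incidents(SELF_DRIVING_RATE, fleet_miles)

    print(f"Human drivers:      {human:,.0f} expected incidents")
    print(f"Self-driving (10x): {automated:,.0f} expected incidents")
    print(f"Incidents avoided:  {human - automated:,.0f}")
```

At those assumed numbers the difference is on the order of tens of thousands of incidents per year, which is the sense in which the rare no-win ethical choice sits inside a much larger reduction in overall risk.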
As I write that, I see how cold it sounds, and something gnaws at me that it isn't right, but those are the facts of the matter. At some point, with self-driving cars, someone has to make that choice, and as long as we choose to go out into the world at all, that risk has to be taken into account.