I saw an article saying that it's inevitable that self-driving cars will be programmed to sacrifice the "driver". If the computer has to choose between a single-car fatality and a multi-car pileup, it will always decide to sacrifice the single driver.
Welcome to the future.
You probably read that on Slashdot, and that site is frankly overrun by idiots. Self-driving cars will be programmed to avoid hitting things. They will give priority to avoiding small human-shaped objects and big heavy objects approaching at speed. The scenarios these idiots come up with (a) don't happen in real life, and (b) most certainly don't happen with a self-driving car, which doesn't approach that kind of situation at excessive speed.
The solution isn't to decide whom to sacrifice; the solution is to drive carefully at a reasonable speed and avoid the situation in the first place.
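If you want to picture what "give priority to avoiding human-shaped objects and heavy fast objects" looks like in software, here is a minimal sketch of a cost-based planner. Everything in it (the obstacle classes, penalty weights, and function names) is made up for illustration; it is not taken from any real vehicle's code.

```python
# Hypothetical sketch of priority-ordered obstacle avoidance.
# The classes, weights, and thresholds are invented for illustration;
# they only show the idea that hitting a pedestrian or a heavy fast
# object is penalized far more than braking hard or leaving the lane.

from dataclasses import dataclass

@dataclass
class Obstacle:
    kind: str                 # "pedestrian", "vehicle", "debris", ...
    mass_kg: float
    closing_speed_ms: float   # how fast it is approaching us

# Assumed penalty weights: highest for small human-shaped objects.
PENALTY = {"pedestrian": 1_000_000, "vehicle": 10_000, "debris": 100}

def maneuver_cost(obstacles_hit: list[Obstacle], harsh_braking: bool) -> float:
    """Score a candidate maneuver: hitting anything dominates discomfort."""
    cost = 0.0
    for ob in obstacles_hit:
        # Heavy objects approaching at speed are weighted up as well.
        severity = 1.0 + ob.mass_kg * ob.closing_speed_ms / 1000.0
        cost += PENALTY.get(ob.kind, 1_000) * severity
    if harsh_braking:
        cost += 1.0  # discomfort is almost free compared to any collision
    return cost

# The planner picks the candidate trajectory with the lowest cost, which
# in practice means "slow down and avoid", not "choose whom to kill".
```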
By the way, whatever idiotic things get published on Slashdot, you misquoted it badly. The choice is supposed to be between driving your car into a ditch and driving it into a crowd of pedestrians. There was a multi-car pileup in the UK recently (not just multi-car, but 110 or so cars) with not a single person killed, and the badly injured ones were mostly people who left their cars after the crash. The safest way to hit another car is straight on from behind, because the impact happens at the difference between your speeds rather than the sum. If you ever get into a situation where a crash is unavoidable, you try to hit the other car as straight as possible so that the cars stay in a controlled straight line and don't crash into other traffic.
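As a back-of-the-envelope illustration of why rear-ending is the gentlest way to crash (plain kinematics, with made-up example speeds, not figures from the article):

```python
# Rough closing-speed comparison; the speeds below are arbitrary examples.
def impact_energy_ratio(v_self_kmh: float, v_other_kmh: float) -> float:
    """Kinetic energy scales with the square of the closing speed, so
    compare rear-ending a slower car with hitting the same car head-on."""
    rear_end = abs(v_self_kmh - v_other_kmh)   # both cars moving the same way
    head_on = v_self_kmh + v_other_kmh         # cars moving toward each other
    return (head_on / rear_end) ** 2

# At 100 km/h closing on a car doing 80 km/h, the closing speed is
# 20 km/h from behind versus 180 km/h head-on: roughly 81x the energy.
print(impact_energy_ratio(100, 80))  # -> 81.0
```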
In the case of the Darwin Award winner this discussion is about, he confused "autopilot" (a feature that helps you drive the car) with "self-driving". The car was not self-driving, it was never intended to be driven without an attentive driver, and the guy was frankly an idiot who could have killed someone else and had already almost gotten himself killed about a week earlier.