That's not true at all. Self-driving cars don't need a sense of morality. They "just" need to be significantly safer than human-driven cars.
You want self-driving cars to avoid situations where, e.g., either A or B dies, not to make those choices.
Note: this is the same for human drivers. We teach people how to understand and obey traffic signals, how to adjust to weather and visibility conditions, how to signal their intentions to other drivers, when it's safe to merge and turn, etc. We don't teach people how to choose which group of pedestrians to crash into or when to sacrifice themselves. Did you get that kind of question on your driving test?
The idea that self-driving cars need to understand morality is pure nonsense.