One of the problems with moral philosophy is that too often there’s an implicit assumption that there is “an answer.” The Good Place gently deflates this when Chidi, moments before having his memory wiped, writes down “there is no answer.” This is the conclusion he reached after living through countless lives in the afterlife (because Michael rebooted him endlessly). Yet we expect self-driving cars and similar systems to solve problems that are (i) beyond our own capacity, and (ii) not solvable by any realistic criterion.
We accept that humans simply freeze, or plough straight ahead, when confronted with such choices; we want our vehicles to do better. It’s this unreasonable expectation that’s the real problem.