> to minimize other damage
You mean deaths to multiple other people, do you not? Let's just call a spade a spade here and point out the genuine ethical dilemma.
What's the ratio between "bodies of your own kids" and "other human bodies you have no connection with" that a "proper" AI controlling a car YOU purchased should be willing to trade, in terms of injury or death?
I think most people would argue that it's greater than 1* (unless you are a pure rationalist, in which case, I tip my hat to you), but what "SHOULD" it be?
*meaning that, for a ratio of 2, you would require 2 stranger deaths to justify losing one of your own kids
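To make the arithmetic concrete, here's a minimal sketch of the threshold rule being described. Everything in it is hypothetical (the function name, the decision rule, the values); it's just the footnote's ratio written out, not any real AV policy:

```python
# Hypothetical sketch of the ratio R discussed above: R is the number of
# stranger deaths required to justify one death among the car's own occupants.
# R = 1 is the pure-rationalist setting; R > 1 privileges the owner's family.

def should_sacrifice_occupants(occupant_deaths: int,
                               stranger_deaths_avoided: int,
                               ratio: float) -> bool:
    """True if sacrificing the occupants clears the ratio-R bar:
    the stranger deaths avoided meet or exceed occupant_deaths * R."""
    return stranger_deaths_avoided >= occupant_deaths * ratio

# With R = 2, as in the footnote: 2 strangers justify losing one kid, 1 does not.
print(should_sacrifice_occupants(1, 2, 2.0))  # True
print(should_sacrifice_occupants(1, 1, 2.0))  # False
```

The whole dispute, of course, is what R should be and who gets to set it.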
We can take the AI out of the question entirely and ask how many other humans you personally as a driver would be willing to mow down to avoid your own death—driving off a bridge, say.
I would suggest that all but the most narcissistic would have some limit to how many pedestrians they would be willing to run over to save their own lives. The demand that the AI have no such limit—“that the AI will prioritize my life and safety over literally any other concern”—is grotesque.
> You mean deaths to multiple other people, do you not
I mean deaths the AI predicts for other people, yes
And I'm not saying I would never choose to kill myself over killing a school bus full of children, but I'll be damned if a computer will make that choice for me.
Yeah, you also have to consider that your kids could be on either side of the equation.