> You mean deaths to multiple other people, do you not
I mean deaths the AI predicts for other people, yes
And I'm not saying I would never choose to kill myself over killing a schoolbus full of children, but I'll be damned if a computer will make that choice for me.
> deaths the AI predicts for other people
Isn’t this entirely hypothetical? In reality, do any systems actually perform this calculus? Or are they just mimicking human drivers, avoiding obstacles and shedding kinetic energy in a series of rapid-fire control decisions?
The AI can also only ever predict that you might die. So how should these predictions be weighed? Say there's a group of five children: the car predicts a 90% chance of death for them if it continues, vs. a 50% chance for you if it swerves to avoid them. According to your comments, it seems like you'd want the car to choose to hit the children, right?
What is the lowest likelihood of your own death you'd find acceptable in this situation?
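To make the numbers above concrete, here is a sketch of the naive "minimize expected fatalities" calculus that question implies. Everything here is hypothetical for illustration; no real AV stack is known to implement anything like this.

```python
# Hypothetical expected-fatality comparison for the scenario above.
# The probabilities (90% / 50%) come from the example in the thread;
# the "minimize expected deaths" rule is an assumption, not a claim
# about any real autonomous-vehicle software.

def expected_deaths(n_people: int, p_death: float) -> float:
    """Expected fatalities for one option: count times probability."""
    return n_people * p_death

continue_ahead = expected_deaths(5, 0.9)  # five children at 90% -> 4.5
swerve_away = expected_deaths(1, 0.5)     # one occupant at 50% -> 0.5

# A purely utilitarian controller would pick the option with the
# lower expected fatality count; here, that sacrifices the occupant.
options = [("hit the children", continue_ahead),
           ("swerve into barrier", swerve_away)]
choice = min(options, key=lambda option: option[1])
print(choice)  # ('swerve into barrier', 0.5)
```

Note that this framing already smuggles in a utility function (deaths weighted equally, no uncertainty about the predictions themselves), which is exactly the kind of choice the commenter objects to a computer making.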
I don't believe any AV software out there attempts to solve the trolley problem. It's just not relevant, and moreover it's actually illegal to ship that kind of code in some jurisdictions.
You can't get into a trolley situation without first driving unsafely for the conditions, so companies focus on preventing that earlier failure instead.