> Which is of course what the road rules are: you slam on the brakes.
Yeah, there are a shocking number of accidents which basically amount to "they tried to swerve and it went badly".
You can concoct a few scenarios where other drivers are violating the road rules so egregiously that they're basically trying to murder you -- the simplest example is "you are stopped at a light and a giant truck is barreling towards you too fast to stop".
If you are a normal driver, you probably learn about this when you wake up in the hospital, but an autonomous vehicle could be watching how fast vehicles are approaching from behind you. There will be a wide range of scenarios where it's clear the truck is not going to stop but there's still time to do something. For instance, a truck braking from 65mph takes around 5 seconds to stop, so if it's halfway through its stopping distance, you've still got around 3.5 seconds to maneuver out of the way (under constant deceleration, speed falls off linearly but distance covered grows quadratically, so the first half of the distance takes well under half the time).
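As a rough sketch of that arithmetic (the 5-second figure is illustrative, not a measured truck braking time): assuming constant deceleration, the speed left after a fraction f of the stopping distance is used up is v0·√(1−f), so the remaining time is the total stopping time times √(1−f) -- about 71% of it at the halfway point, not 50%.

```python
import math

def time_remaining(total_stop_time_s: float, fraction_of_distance_covered: float) -> float:
    """Seconds until a braking vehicle comes to rest, given the fraction
    of its total stopping distance already covered.

    Assumes constant deceleration: speed at that point is v0 * sqrt(1 - f),
    and remaining time is (v / a) = total_stop_time * sqrt(1 - f).
    """
    return total_stop_time_s * math.sqrt(1.0 - fraction_of_distance_covered)

# Illustrative numbers: a truck that takes ~5 s to stop from 65 mph,
# observed halfway through its stopping distance.
print(round(time_remaining(5.0, 0.5), 2))  # → 3.54
```

The naive "half the distance means half the time" estimate of 2.5 seconds undersells the window; since the truck is still moving fast through the first half of its stopping distance, it burns that half quickly and most of the time remains.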
That does leave you all sorts of room to come up with realistic trolley problems.
> That does leave you all sorts of room to come up with realistic trolley problems
But they all require a human (or otherwise malicious) driver on one side. The more rule-following AVs there are on the road, the fewer opportunities for such trolley problems.
And I'd still argue that debating these ex ante, while philosophically fascinating, isn't a practical discussion. I'm not seeing a case where one would code anything beyond collision avoidance and, e.g., pre-activating restraints.