> An AI system is a tool and like any other tool, responsibility for its use rests with the people who decide to rely on it
Doesn't that argument backfire though? If I use a chainsaw, then to a certain extent I need to rely on it not blowing up in my face or cutting my throat. If I drive a car, I need to rely on its brakes working and the engine not suddenly exploding. If a pilot flies an airplane that develops a technical issue and they crash-land, heroically saving half the souls on board, the pilot isn't criminally responsible for the manslaughter of the other half.
In any of the above cases, just like with AI, how can you make somebody responsible for a tool failure unless there is gross negligence?
I'm gonna push the responsibility up a rung on the ladder:
A competent adult using a tool ought to understand the inherent pitfalls of using that tool.
Chainsaws are dangerous in obvious and non-obvious ways. The tool can operate exactly as designed and still amputate your foot.