Because the operator is liable? Tesla as a company isn't driving the car; it's an ML model running on something like HW4 on bare metal in the car itself. Would that make the silicon die legally liable?
Who’s the “operator” of an “autonomous” car? If I sit in it and it drives me around, how am I an “operator”?
The point is that if the liability is always exclusively with the human driver, then any system in that car is at best a "driver assist". Claims that "it drives itself" or "it's autonomous" are just varying degrees of lying. I call it a partial lie rather than a partial truth because the result, more often than not, is that the customer is tricked into thinking the system is more capable than it is, and because that outcome is more dangerous than the opposite.
Any car has some degree of autonomy, even the ones with no assists (it will safely self-drive you all the way to the accident site, as they say). But a car is either driven by the human with the system's help, or driven by the system with or without the human's help.
A car can't have 2 drivers. The only real one is the one the law holds responsible.
Sounds like it's neither self-driving nor autonomous if I'm the one on the hook when it goes wrong.