> Waymo is held to a significantly higher standard than human drivers.
They have to be, as a machine cannot be held accountable for a decision.
Waymo is not a machine; it is a corporation, and corporations can, in fact, be held accountable for decisions (and, perhaps more to the point, for defects in goods they manufacture, sell, distribute, and/or use to provide services).
The promise that self-driving cars will be safer than human drivers is also kind of the whole selling point of the technology.
> They have to be, as a machine cannot be held accountable for a decision
This logic applies equally to all cars, which are machines. Waymo's decision makers are one step further removed than a human driver's, but that's not a good axiom on which to base any theory of liability.
Slowing the adoption of much-safer-than-human robotaxis, for whatever reason, has a price measured in lives. If you think the principle you've just stated is worth all those additional dead people, okay; but you should at least be aware of the price.
Failure to acknowledge the existence of tradeoffs tends to lead people to make really lousy trades, in the same way that running around with your eyes closed tends to result in running into walls and tripping over unseen furniture.