Giving people more ways to shoot themselves in the foot does not improve safety. I find the whole thing a kind of dark pattern: the system, along with misleading marketing, makes you lax over time, only to catch you off guard.
You get used to the system working correctly, and then, when you least expect it, it does the unthinkable, and the whole world blames you for not supervising a beta software product on the road on day 300 with the same rigour you did on day one.
I can see a very direct parallel with LLM systems. Claude had been working great for me until one day it git reset the entire repo and I lost two days of work, because it couldn't revert a file it had corrupted. This happened because I was supervising it the way you would supervise an FSD car in "bypass" mode. Fortunately it didn't kill anyone; it just cost two days of work. If there were any risk of someone being killed, I would never allow a bypass/FSD/supervised mode, regardless of how unlikely that outcome is.
They have very good guardrails to prevent that, unlike autolane etc.
Teslas have sensors, eye trackers, etc. Is it possible to shoot yourself in the foot? Sure. But not in any way different from a human doing irrational things in the car: putting on makeup, arguing, love, etc.
Human beings are irrational creatures that should not drive except for fun in isolated environments. Tesla, Waymo, or anyone else... it is good to remove humans from the road, the faster the better.