Is it responsible to let users use auto speed and auto lane-keeping on a high-speed highway without the other autopilot features?
Roll out both technologies at scale and try to guess which one will cause more harm, given that in both kinds of cars there will be users trying to put their legs up on the steering wheel:
A dumb system that won't even try to do anything safe on its own,
Or software that is, let's say, 4x less safe than the average human but still quite capable of maneuvering without hitting obvious walls?
Giving people more ways to shoot themselves in the foot does not improve safety. I find the whole thing a kind of dark pattern: the system, together with misleading marketing, makes you lax over time just to catch you off guard.
You get used to the system working correctly, and then, when you least expect it, it does the unthinkable, and the whole world blames you for not supervising a beta software product on the road on day 300 with the same rigour you did on day one.
I can see a very direct parallel with LLM systems. Claude had been working great for me until one day it git reset the entire repo and I lost two days of work, because it couldn't revert a file it had corrupted. This happened because I supervised it the way you would supervise an FSD car in "bypass" mode. Fortunately it didn't kill anyone; it only cost me two days of work. But if there were a risk of someone being killed, I would never allow a bypass/FSD/supervised mode, regardless of how unlikely that outcome is.
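To make the analogy concrete: the only supervision that holds up in practice is a hard gate on irreversible operations, not sustained human attention. Here is a minimal Python sketch of what I mean; the DESTRUCTIVE list and the run_guarded helper are hypothetical, just to illustrate never letting the agent bypass a human for destructive actions:

```python
import shlex
import subprocess

# Hypothetical denylist of command prefixes an agent should never
# run unattended. Illustrative, not exhaustive.
DESTRUCTIVE = [
    ("git", "reset"),
    ("git", "checkout"),  # can silently overwrite working-tree files
    ("git", "clean"),
    ("rm",),
]

def is_destructive(command: str) -> bool:
    """Return True if the command matches a known destructive prefix."""
    tokens = shlex.split(command)
    return any(tokens[:len(p)] == list(p) for p in DESTRUCTIVE)

def run_guarded(command: str) -> None:
    """Run an agent-proposed shell command, but stop and ask a human
    before anything that could irreversibly destroy work."""
    if is_destructive(command):
        answer = input(f"Agent wants to run: {command!r}. Allow? [y/N] ")
        if answer.strip().lower() != "y":
            print("Blocked.")
            return
    subprocess.run(shlex.split(command), check=False)

if __name__ == "__main__":
    run_guarded("git status")        # runs without prompting
    run_guarded("git reset --hard")  # requires explicit human approval
```

The denylist will always be incomplete; the point is that irreversible actions get a human gate by default, instead of a "supervised" mode where you are trusted to be watching at the exact moment it matters.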