And like the Three Laws in I, Robot, it has numerous loopholes built in, ignores the larger population (Asimov later added a Zeroth Law about humanity as a whole), says nothing about the endless variations of the Trolley Problem, assumes that LLMs/bots have a god-like ability to foresee and weigh consequences, and of course ignores alignment completely.
Cool!
I work with a guy like this. He hasn't shipped anything in 15+ years, but I think he'd be proud of that.
I'll make sure we argue about the "endless variations of the Trolley Problem" in our next meeting. Let's get nothing done!
Hopefully Alan Tudyk will be up for the task of saving humanity with the help of Will Smith.