Hacker News

jbritton — today at 4:54 AM

It’s kind of interesting relating this to LLMs. With a chef in a kitchen, you can just say you want a PB&J. With a robot, does it know where things are? Once it knows that, does it know how to retrieve them, open and close them? It’s always a mystery what you get back from an LLM.


Replies

jbritton — today at 5:00 AM

Also true of specifications: anything not explicitly stated will be decided by the implementer, maybe to your liking, maybe not.
