
andy99 · yesterday at 7:57 PM

The issues with the Three Laws aside, being able to state rules has no bearing on getting LLMs to follow them. There's no shortage of instructions on how to behave; the problem is that nothing in the way LLMs operate gives hard rules a place to be coded in.
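To make the distinction concrete, here's a toy sketch (entirely hypothetical, not a real LLM API): an instruction in the prompt only nudges the sampling distribution, whereas an actual hard rule has to live outside the model as ordinary code.

```python
import random

def toy_llm(prompt: str) -> str:
    # Toy stand-in for an LLM: a stated rule in the prompt merely
    # shifts token probabilities; it never becomes a constraint.
    rule_stated = "never say FORBIDDEN" in prompt
    weights = {
        "ok": 0.9 if rule_stated else 0.5,
        "FORBIDDEN": 0.1 if rule_stated else 0.5,  # still possible!
    }
    tokens = list(weights)
    return random.choices(tokens, weights=list(weights.values()))[0]

def guarded_llm(prompt: str) -> str:
    # A hard rule is just deterministic code wrapped around the model,
    # not something encoded in the model itself.
    out = toy_llm(prompt)
    return "ok" if out == "FORBIDDEN" else out
```

Even with the rule in the prompt, `toy_llm` can still emit the forbidden token some fraction of the time; only the external guard makes the rule absolute.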

From what I remember, Asimov's positronic brains are a lot more deterministic, and the problems arise because they do what you say rather than what you mean. LLMs fail differently.