Hacker News

fl4regun · yesterday at 9:02 PM

If your car doesn't turn left when you turn the steering wheel left, the problem is that the car is broken. If an LLM does something unexpected after you gave it instructions, that can happen even when the LLM is functioning entirely correctly.


Replies

TeMPOraL · yesterday at 10:23 PM

Nothing in this world is guaranteed, but that doesn't mean it's uniformly random either. LLMs can still do something unexpected when you give them clear instructions, but that doesn't mean the result will be arbitrary or unpredictable in scope. In the same way, C/C++ undefined behavior technically means a program can give you nasal demons, but in reality it won't do anything unusual (like format your C:\ drive) unless someone deliberately coded it to do that.
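A minimal sketch of that undefined-behavior point, assuming a typical two's-complement target: signed integer overflow is UB per the C standard, so anything is technically permitted, yet in practice it usually just wraps.

    #include <limits.h>
    #include <stdio.h>

    int main(void) {
        int x = INT_MAX;
        /* Signed overflow is undefined behavior: the C standard
           places no requirements on what happens next. */
        int y = x + 1;
        /* On real compilers and two's-complement hardware this
           almost always just wraps to INT_MIN; unusual outcomes
           have to be deliberately coded, as argued above. */
        printf("%d\n", y);
        return 0;
    }

(Compiling with -fsanitize=undefined in GCC or Clang makes the runtime flag the overflow instead of silently wrapping.)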
