Hacker News

the_duke | yesterday at 6:02 PM | 1 reply

You shouldn't be downvoted: LLMs could in theory be deterministic, but they currently are not, due to how inference is implemented in practice.
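
One common source of that nondeterminism is floating-point arithmetic: addition is not associative, so a different reduction order (which parallel GPU kernels and changing batch shapes routinely produce) can give different results for the same inputs. A minimal sketch, not tied to any particular engine; the array size and seed are arbitrary:

```python
import numpy as np

# Same numbers, two summation orders. float32 addition is not associative,
# so the two results can differ in the low-order bits; parallel reductions
# and varying batch shapes reorder sums in exactly this way.
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000).astype(np.float32)

forward = x.sum()
reverse = x[::-1].sum()
print(forward, reverse, forward == reverse)  # often two slightly different values
```

Once logits differ even in the low bits, near-ties among the top tokens can resolve differently, which is enough to make whole generations diverge.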


Replies

otabdeveloper4 | yesterday at 7:29 PM

All my self-hosted inference has temperature zero and no randomness.

It is absolutely workable; current inference engines are just lazy and dumb.
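
Temperature zero amounts to greedy decoding: take the argmax token at every step, with no sampling. A minimal sketch with Hugging Face transformers, assuming nothing about the commenter's actual stack; the model name is only a placeholder:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; any causal LM works
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name).eval()

inputs = tokenizer("The capital of France is", return_tensors="pt")
with torch.no_grad():
    out = model.generate(
        **inputs,
        do_sample=False,    # greedy: argmax at each step (temperature 0)
        max_new_tokens=20,
    )
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

Greedy decoding removes the sampling randomness, but not the kernel-level nondeterminism discussed above.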

(I use a Zobrist hash to track and prune loops.)
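
The comment doesn't spell out the scheme, so this is only one plausible reading: a rolling Zobrist-style hash over the most recent tokens, used to flag repeated decoding states (greedy decoding is prone to loops). The window size, key scheme, and pruning policy below are all assumptions:

```python
import random
from collections import deque

class ZobristLoopDetector:
    """Rolling Zobrist-style hash over the last `window` tokens.
    A repeated hash suggests the generation has entered a loop."""

    def __init__(self, vocab_size: int, window: int = 64, seed: int = 0):
        rng = random.Random(seed)
        # One random 64-bit key per token id; the window hash is the XOR of
        # the keys of the tokens currently inside the window. This is
        # position-independent, so permutations collide -- acceptable for a
        # cheap loop heuristic.
        self.keys = [rng.getrandbits(64) for _ in range(vocab_size)]
        self.window = window
        self.tokens: deque[int] = deque()
        self.hash = 0
        self.seen: set[int] = set()

    def push(self, token: int) -> bool:
        """Add a newly generated token; return True if this window state
        has been seen before (i.e. the output is probably looping)."""
        self.tokens.append(token)
        self.hash ^= self.keys[token]
        if len(self.tokens) > self.window:
            old = self.tokens.popleft()
            self.hash ^= self.keys[old]   # XOR is its own inverse
        if len(self.tokens) < self.window:
            return False                  # not enough context yet
        repeated = self.hash in self.seen
        self.seen.add(self.hash)
        return repeated
```

On a hit, a decoder could backtrack, raise a repetition penalty, or stop early; the XOR update keeps the per-token cost O(1), which is the usual reason to reach for Zobrist hashing here.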