
DrJokepu today at 4:21 PM

> Models aren't deterministic

Is that really true? I haven’t tried to do my own inference since the first Llama models came out years ago, but I am pretty sure it was deterministic: if you fixed the seed and the input was the same, the output of the inference was always exactly the same.


Replies

bigwheels today at 4:24 PM

LLMs are not deterministic:

1.) There is typically a temperature setting that injects randomness into token sampling. It applies even when it isn't user-visible; most major providers have stopped exposing it (especially in their TUIs).

2.) Then, even with the temperature set to 0, the output is only almost deterministic: you'll still observe small variations, because floating-point arithmetic has limited precision and isn't associative, so the order of parallel reductions (which can change with batching) can shift logits slightly and flip a near-tie at the argmax step.
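A minimal Python sketch of both points (a toy illustration, not any provider's actual sampler): temperature 0 reduces sampling to a greedy argmax, higher temperatures flatten the distribution and add randomness, and floating-point addition is order-sensitive, which is why even "deterministic" argmax can waver across runs.

```python
import math
import random

def sample(logits, temperature, rng=random.Random(0)):
    """Toy next-token sampler over raw logits.

    temperature == 0  -> greedy argmax (deterministic in exact arithmetic)
    temperature > 0   -> softmax sampling; higher values flatten the
                         distribution, adding randomness to the output.
    """
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    scaled = [l / temperature for l in logits]
    m = max(scaled)                              # subtract max for stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1

# Floating-point addition is not associative, so a different reduction
# order (e.g. from GPU batching) can perturb logits and flip a near-tie:
print((0.1 + 0.2) + 0.3 == 0.1 + (0.2 + 0.3))
```

Running it prints `False`: the two summation orders give 0.6000000000000001 and 0.6, a gap tiny enough to ignore in most arithmetic but large enough to change which of two nearly-tied tokens wins an argmax.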

Edit: thanks for the corrections
