Hacker News

bcjdjsndon · today at 2:20 PM

But why would the same LLM give you wildly different answers each time you ask?


Replies

pkaye · today at 3:32 PM

There is a sampling parameter in LLMs called temperature that controls creativity/randomness. Setting it to 0 makes the model greedily pick the most likely next token, which makes its output (mostly) deterministic. Many LLM APIs expose this as a tunable parameter.
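A minimal sketch of what temperature does during sampling, using toy next-token scores rather than any real model's API (the logits and token count here are made up for illustration):

```python
import math
import random

def sample_token(logits, temperature=1.0, rng=random):
    """Pick a token index from raw logits, with temperature-scaled randomness."""
    if temperature == 0:
        # Greedy decoding: always take the highest-scoring token (deterministic).
        return max(range(len(logits)), key=lambda i: logits[i])
    # Softmax over logits divided by temperature: lower temperature sharpens
    # the distribution toward the top token, higher temperature flattens it.
    scaled = [x / temperature for x in logits]
    peak = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - peak) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw from the resulting categorical distribution.
    r = rng.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i
    return len(logits) - 1

logits = [2.0, 1.0, 0.5]              # hypothetical scores for 3 tokens
print(sample_token(logits, 0))        # temperature 0: always token 0
print(sample_token(logits, 1.5))      # temperature 1.5: any token possible
```

With temperature 0 every call returns the same index; at higher temperatures the same logits can yield different tokens on each call, which is one reason repeated prompts produce different answers.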

zdragnar · today at 2:32 PM

Because that's how they work? They aren't knowledge machines; they are random generators.
