
0xbadcafebee yesterday at 10:06 PM

My dude, when people say LLMs are non-deterministic, this is what they mean. You cannot expect an LLM to always follow your prompts.

When this happens, end your session and try again. If it keeps happening, lower the temperature, top_k, and top_p in your model settings. (https://www.geeksforgeeks.org/artificial-intelligence/graph-...)
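For what those three knobs actually do: a minimal sketch of how temperature, top_k, and top_p filter a model's next-token distribution (plain Python over a toy logits list, not any real API; lowering any of them concentrates probability on fewer tokens, which is why outputs get more repeatable):

```python
import math

def filter_distribution(logits, temperature=1.0, top_k=0, top_p=1.0):
    # Temperature: divide logits before softmax. Lower temperature
    # sharpens the distribution; as it approaches 0, sampling
    # approaches greedy decoding (always the argmax token).
    scaled = [l / max(temperature, 1e-8) for l in logits]

    # Softmax (shifted by the max for numerical stability).
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Rank token indices from most to least likely.
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)

    # top_k: keep only the k most likely tokens (0 = disabled).
    if top_k > 0:
        ranked = ranked[:top_k]

    # top_p (nucleus): keep the smallest set of top tokens whose
    # cumulative probability reaches top_p.
    if top_p < 1.0:
        kept, mass = [], 0.0
        for i in ranked:
            kept.append(i)
            mass += probs[i]
            if mass >= top_p:
                break
        ranked = kept

    # Renormalize over the surviving tokens.
    z = sum(probs[i] for i in ranked)
    return {i: probs[i] / z for i in ranked}

toy_logits = [2.0, 1.0, 0.5, -1.0]
print(filter_distribution(toy_logits, temperature=0.5, top_k=2))
```

With top_k=1 this collapses to greedy decoding (one token, probability 1.0), which is the deterministic limit the parent comment is reaching for.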


Replies

SparkyMcUnicorn yesterday at 10:54 PM

temperature, top_k, and top_p aren't supported on Opus 4.7 (or 4.6?).

Related: https://xcancel.com/bcherny/status/2044831910388695325#m

https://platform.claude.com/docs/en/api/messages/create#crea...