Hacker News

rossjudson · 12/10/2024

It is absolutely true that LLMs do not know when to stop.


Replies

natmaka · 12/10/2024

An adequate prompter (the human at the prompt) knows when to stop.