Hacker News

InsideOutSanta · last Sunday at 2:54 PM

I would assume that if the model made no assumptions, it would be unable to complete most requests given in natural language.


Replies

stingraycharles · last Sunday at 3:38 PM

Well yes, but getting the model to ask questions to resolve ambiguities is critical if you want any success in, e.g., a coding assistant.

There are shitloads of ambiguities. Most of the problems people have with LLMs come from the implicit assumptions being made.

Phrased differently, telling the model to ask questions to resolve ambiguities before responding is an extremely easy way to get much better results.
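
As a minimal sketch of what that looks like in practice: a system prompt that instructs the model to surface ambiguities before producing code. This assumes the OpenAI Python SDK and a chat model; the prompt wording, model name, and user request are all illustrative, not a definitive implementation.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Illustrative system prompt: push the model to ask questions instead of guessing
CLARIFY_SYSTEM_PROMPT = (
    "You are a coding assistant. Before writing any code, list the ambiguities "
    "in the request and ask clarifying questions. Only produce code once the "
    "user has answered, or explicitly state the assumptions you are making."
)

response = client.chat.completions.create(
    model="gpt-4o",  # hypothetical choice; any chat model works here
    messages=[
        {"role": "system", "content": CLARIFY_SYSTEM_PROMPT},
        {"role": "user", "content": "Write a function that parses dates from log files."},
    ],
)

# With a vague request like the one above, the reply should be questions
# (date formats? time zones? language?) rather than code based on guesses.
print(response.choices[0].message.content)
```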