Hacker News

BosunoB · yesterday at 10:26 PM · 2 replies

The whole idea of sending a bare "no" to an LLM without additional context is kind of silly. It's smart enough to know that if you simply didn't want it to proceed, you would just not respond to it.

The fact that you responded at all tells it that it should do something, so it looks for additional context (in this case, the build mode change) to decide what to do.


Replies

furyofantares · yesterday at 11:18 PM

I agree that sending a bare "no" to an LLM, with no task for it to do, is silly. It doesn't need to know that I don't want it to implement the change, because it isn't waiting for an answer.

But it's not smart enough to know you would just not respond to it, not even close. It's been trained to do tasks in response to prompts, not to reply "k, cool" and stop, which is probably the cause of this (egregious) error.

ForHackernews · yesterday at 10:31 PM

> It's smart enough to know that if you just didn't want it to proceed, you would just not respond to it.

No, it absolutely is not. It doesn't "know" anything when it's not responding to a prompt. It's not consciously sitting there waiting for you to reply.
