Hacker News

esafak · yesterday at 6:19 PM

I don't know whether LLMs are trained to imitate sources like that. I also don't know what would happen if you asked one to do a task in the manner of someone who does not know how to do it. Would it refuse, make mistakes, or assume the person can learn? Humans can do all three, so barring more specific instructions any such response is reasonable.


Replies

Rohansi · yesterday at 8:44 PM

> Humans can do all three, so barring more specific instructions any such response is reasonable.

Of course, but reasonable behavior across all humans is not the same as what one specific human would do. Depending on the scenario, an individual might stick to a particular choice because of their personality, which is not always explained, and is heavily summarized when it is.