
gwd · last Monday at 8:55 PM · 0 replies

> The last one IMO is more complex than the rest, because LLMs are fundamentally autocomplete machines. But what happens if you don't give them any prompt? Can they spontaneously come up with something, anything, without any external input?

Human children typically spend 18 years of their lives being RLHF'd before we let them loose. How many people ever do something truly outside the bounds of the "prompting" they received during that time?