Hacker News

jiggawatts · yesterday at 11:13 AM

You wrote your comment one word at a time, with the next word depending on the previous words written.

You did not plan the entire thing, every word, ahead of time.

LLMs do the same thing, so... how is your intelligence any different?
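For the mechanism being described, here is a minimal sketch of autoregressive generation. The toy next-token table is hypothetical, standing in for a real model's forward pass, but the loop is the actual structure: each token is sampled conditioned on what has been generated so far, with no global plan.

```python
import random

# Hypothetical toy next-token distribution. Here probabilities depend
# only on the last token; a real transformer conditions on the whole prefix.
TABLE = {
    "<s>":   {"the": 0.6, "a": 0.4},
    "the":   {"cat": 0.5, "dog": 0.5},
    "a":     {"cat": 0.5, "dog": 0.5},
    "cat":   {"sat": 0.7, "slept": 0.3},
    "dog":   {"sat": 0.7, "slept": 0.3},
    "sat":   {"</s>": 1.0},
    "slept": {"</s>": 1.0},
}

def generate(max_len: int = 10) -> str:
    out = []
    context = "<s>"
    for _ in range(max_len):
        dist = TABLE[context]
        # Sample the next word given the words so far -- one word at a time.
        token = random.choices(list(dist), weights=list(dist.values()))[0]
        if token == "</s>":
            break
        out.append(token)
        context = token
    return " ".join(out)

print(generate())  # e.g. "the cat sat"
```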


Replies

ben_w · yesterday at 11:36 AM

A long time ago I noticed that I sometimes already had a complete thought before my inner monologue turned it into words. A few times I tried skipping the inner monologue, since I'd clearly already thought the thought. Turns out the bit of my brain that creates the inner monologue from the thought can generate a sense of annoyance that the rest of my brain can feel.

Not that it matters: there's evidence that while LLMs output one word at a time, they do have forward planning going on, forming an idea of the end of a sentence before they get there.

lossyalgo · yesterday at 2:17 PM

Tell that to German speakers: the verb comes last, and the order of things in a sentence is nothing like English, so you have to think of the entire sentence before you spit it out. Even the numbers are backwards (twenty-two is two-and-twenty), which requires thinking ahead.

Furthermore, when you ask an LLM to count how many r's are in the word strawberry, it will give you a random answer, "think" about it, and give you another random answer. I guarantee that over 3 attempts, reasoning included, it will flip-flop between right and wrong at random. A human, asked how many r's are in the word strawberry, will give you the correct answer every. fucking. time.

edit: formatting
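A plausible reason for the strawberry failure, assuming the usual tokenization explanation: the model sees subword tokens, not letters. A minimal sketch using OpenAI's open-source tiktoken tokenizer; the exact token split is just whatever the cl100k_base encoding happens to produce.

```python
# Requires `pip install tiktoken` (OpenAI's open-source BPE tokenizer).
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")
tokens = enc.encode("strawberry")
# The model operates on these subword token IDs, never on individual letters.
print(tokens)
print([enc.decode_single_token_bytes(t) for t in tokens])

# Plain string code, which does see letters, is right every time:
print("strawberry".count("r"))  # 3
```

Counting characters across those subword chunks is exactly the kind of operation the token-level view makes awkward, which is consistent with the flip-flopping described above.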
