Hacker News

chpatrick · yesterday at 5:11 PM

If a human uses their general knowledge of electronics to answer a specific question they haven't seen before, that's obviously thinking. I don't see why LLMs are held to a different standard. It's obviously not repeating an existing answer verbatim, because no such answer exists in my case.

You're saying it's nothing "special", but we're not discussing whether it's special; we're discussing whether it can be considered thinking.


Replies

stino · today at 9:43 AM

> it's obviously not repeating an existing answer verbatim

The fact that it's not verbatim, in the sense that the words are different, doesn't make it thinking. Also, when we say 'humans think', that means a lot more than just 'new question generates correct answer' or 'smart autocompletion'. See many of the other comments here for details.

But again: I laid out two possibilities explaining why the question, and the data, might in fact not be new, so I'm curious which of the two (or some other explanation) fits the situation you're describing.

> You're saying it's nothing "special" but we're not discussing whether it's special, but whether it can be considered thinking.

Apologies; by 'special' I did in fact mean 'thinking'.
