
heresie-dabord · last Monday at 6:50 PM

> a midpoint between "AIs are useless and do not actually think" and "AIs think like humans"

LLMs (AIs) are not useless. But they do not actually think. What is trivially true is that they do not actually need to think. (As far as the Turing Test, Eliza patients, and VC investors are concerned, the point has been proven.)

If the technology is helping us write text and code, it is by definition useful.

> In 2003, the machine-learning researcher Eric B. Baum published a book called “What Is Thought?” [...] The gist of Baum’s argument is that understanding is compression, and compression is understanding.

This is incomplete. Compression is optimisation, and optimisation may resemble understanding, but understanding is being able to verify whether a proposition (a compressed rule or assertion) is true or false, or even whether it is computable at all.

> —but, in my view, this is the very reason these models have become increasingly intelligent.

They have not become more intelligent. The training process may improve, the vetting of the data may improve, the performance may improve, but the resemblance to understanding occurs only when the answers are provably correct. In this sense, these tools work in support of (and are therefore part of) human thinking.

The Stochastic Parrot is not dead, it's just making you think it is pining for the fjords.


Replies

crazygringo · last Monday at 7:24 PM

> But they do not actually think.

I'm so baffled when I see this being blindly asserted.

With the reasoning models, you can literally watch their thought process. You can see them pattern-match to determine a strategy to attack a problem, go through it piece-by-piece, revisit assumptions, reformulate strategy, and then consolidate findings to produce a final result.

If that's not thinking, I literally don't know what is. It's the same process I watch my own brain use to figure something out.

So I have to ask you: when you claim they don't think -- what are you basing this on? What, for you, is involved in thinking that the kind of process I've just described is missing? Because I genuinely don't know what needs to be added here for it to become "thinking".
