Hacker News

deadbabe · last Thursday at 6:54 PM · 4 replies

You’re still anthropomorphizing what these models are doing.


Replies

mossTechnician · last Thursday at 6:58 PM

I've come to the same conclusion. "AI" was just the marketing term for a large language model in the form of a chatbot, which harked back to sci-fi characters like Data or GLaDOS. It can look impressive, and it often gives correct answers, but it's just a bunch of next-word predictions stacked on top of each other. The term "AI" has drifted so far from its older meaning that a second acronym, "AGI", had to be coined to mean what "AI" once did.
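That "stacked next-word predictions" idea can be sketched in a few lines of toy Python. To be clear, this is purely illustrative: the hardcoded bigram table stands in for a neural network that would output a probability distribution over tokens, and `generate` is a hypothetical name, not any real API.

```python
# Toy sketch of autoregressive generation (NOT a real LLM).
# BIGRAMS stands in for a model's "most likely next word" lookup.
BIGRAMS = {
    "the": "cat",
    "cat": "sat",
    "sat": "on",
    "on": "the",
}

def generate(prompt: str, max_tokens: int = 5) -> str:
    """Each predicted word is appended and fed back in as context
    for the next prediction -- predictions stacked on each other."""
    words = prompt.split()
    for _ in range(max_tokens):
        nxt = BIGRAMS.get(words[-1])
        if nxt is None:  # no continuation known: stop early
            break
        words.append(nxt)
    return " ".join(words)

print(generate("the"))  # -> the cat sat on the cat
```

The point of the sketch is the loop shape: there is no plan or understanding anywhere, just "given what's written so far, emit the next most likely word" repeated until a stop condition.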

The new "reasoning" or "chain of thought" AIs are similarly just conventional LLM inputs and outputs stacked on top of each other. I agree with the GP that it feels a bit magical at first, but running a DeepSeek distillation on my PC, where each step of the process is visible, dispelled a lot of the magic behind the curtain.
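The "stacked inputs and outputs" framing of chain of thought can be sketched as a loop of plain completion calls, where each answer is appended to the context for the next call. Here `call_llm` is a hypothetical stand-in for one ordinary model request, not a real library function:

```python
# Illustrative sketch: "chain of thought" as repeated plain LLM calls.
def call_llm(prompt: str) -> str:
    # Stand-in stub; a real implementation would send `prompt`
    # to a model and return its completion.
    return f"step after [{prompt.splitlines()[-1]}]"

def chain_of_thought(question: str, steps: int = 3) -> list[str]:
    """Each intermediate 'thought' is just a normal completion,
    fed back in as context for the next completion."""
    transcript = [question]
    for _ in range(steps):
        transcript.append(call_llm("\n".join(transcript)))
    return transcript[1:]  # the generated reasoning steps

for step in chain_of_thought("Why is the sky blue?"):
    print(step)
```

Nothing in the loop is a new kind of computation; the "reasoning" is the same next-word machinery run several times with its own output as input.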

kvakerok · last Thursday at 7:01 PM

> You’re still anthropomorphizing what these models are doing.

Didn't we build them to imitate humans? They're anthropomorphic by definition.

jmugan · last Thursday at 6:54 PM

It's just shorthand.

alanbernstein · last Thursday at 6:59 PM

Would you prefer if we started using words like aiThinking and aiReasoning to differentiate? Or is it reasonable to figure it out from context?
