Hacker News

deadbabe 02/20/2025 (4 replies)

You’re still anthropomorphizing what these models are doing.


Replies

mossTechnician 02/20/2025

I've come to the same conclusion. "AI" was just the marketing term for a large language model in the form of a chatbot, which harked back to sci-fi characters like Data or GLaDOS. It can look impressive, and it can often give correct answers, but it's just a bunch of next-word predictions stacked on top of each other. The word "AI" has drifted so far from its older meaning that a second acronym, "AGI", had to be coined to represent what "AI" once did.
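To make "next-word predictions stacked on top of each other" concrete, here's a toy sketch: a hypothetical lookup table stands in for a real model's learned weights, and generation is just repeatedly predicting the next word and appending it. (This is an illustration of the autoregressive loop, not how any particular LLM is implemented.)

```python
# Hypothetical toy "model": a bigram table mapping a word to its
# most likely successor. A real LLM learns a probability distribution
# over subword tokens from billions of parameters instead.
BIGRAMS = {
    "the": "cat",
    "cat": "sat",
    "sat": "on",
    "on": "the",
}

def generate(prompt: str, n_tokens: int) -> list:
    """Autoregressive generation: predict the next word from the last
    word, append it, and repeat - prediction stacked on prediction."""
    words = prompt.split()
    for _ in range(n_tokens):
        nxt = BIGRAMS.get(words[-1])
        if nxt is None:  # no prediction available: stop early
            break
        words.append(nxt)
    return words

print(generate("the", 3))  # ['the', 'cat', 'sat', 'on']
```

Each output word is produced only from what came before it; there is no plan for the sentence as a whole, which is the point the comment is making.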

The new "reasoning" or "chain of thought" AIs are similarly just conventional LLM inputs and outputs stacked on top of each other. I agree with the GP that it feels a bit magical at first, but running a DeepSeek distillation on my PC, where each step of the process is visible, pulled back the curtain on quite a bit of the magic.
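The "LLM inputs and outputs stacked on top of each other" idea can be sketched the same way: each "reasoning step" is just another model call whose output is fed back in as part of the next prompt. The `call_llm` function below is a hypothetical stand-in that returns canned text; the structure, not the content, is what's being illustrated.

```python
def call_llm(prompt: str) -> str:
    # Stand-in for a real model call (hypothetical; a real system
    # would send the prompt to an LLM and return its completion).
    return f"<model output for: {prompt!r}>"

def chain_of_thought(question: str, n_steps: int = 3) -> str:
    """'Reasoning' as stacked LLM calls: each step's output becomes
    part of the next step's input, then a final call answers."""
    context = question
    for i in range(n_steps):
        step = call_llm(f"Step {i + 1}. Context so far: {context}")
        context = f"{context}\n{step}"
    return call_llm(f"Given these steps:\n{context}\nFinal answer:")
```

Viewed this way, a "reasoning" model is the same next-token predictor run several times in a loop, which matches what you see when you watch each step locally.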

kvakerok 02/20/2025

> You’re still anthropomorphizing what these models are doing.

Didn't we build them to imitate humans? They're anthropomorphic by definition.

jmugan 02/20/2025

It's just shorthand.

alanbernstein 02/20/2025

Would you prefer it if we started using words like aiThinking and aiReasoning to differentiate? Or is it reasonable to figure it out from context?
