Hacker News

sulam · yesterday at 6:54 PM · 4 replies

The reality is that current models are simply nowhere near AGI. Next-token prediction has been pushed very far, and has proven applicable far beyond the domain it was originally designed for (reasoning models are an application I would not have predicted), but it is fundamentally not AGI. It has no real world model and no ability to learn in any but superficial ways, and without extensive scaffolding this is all very obvious when you use these models.


Replies

onlyrealcuzzo · yesterday at 11:12 PM

How many months has it been since we were told there would be zero software engineers left in the world in 12 months?

ACCount37 · yesterday at 9:22 PM

Given the mechanistic interpretability findings? I'm not sure how people still say shit like "no real world model" seriously.

show 3 replies
hintymad · today at 1:36 AM

> It has no real world model, no ability to learn in any but superficial ways

I think so too, and at the same time I have to admit that a lot of people don't learn deeply either. Take math, for example: how many STEM students from elite universities truly understood the definition of a limit, let alone calculus beyond simple calculation? Or how many data scientists can really claim an intuitive understanding of Bayesian statistics? Yet millions of them did their jobs reasonably well with the help of the StackExchange family of sites, and now with the help of AI.

show 1 reply