
LoganDark today at 12:48 PM

I think what they're saying is that the methods used today are faster but have a lower ceiling, and that's why they quickly took over but can only go so far.


Replies

jerf today at 1:32 PM

That would be a hypothesis, not a fact.

I'm not closed to it. You can check my comment history for frequent references to next-generation AIs that aren't architected like LLMs. But they're going to have to produce an AI of some sort that is better than the current ones, not just hypothesize that it may be possible. We've got about 50 years of hypotheses about how wonderful such techniques may be and, by the standards of 2026, precious few demonstrations of it.

Quoting from the article:

"Within five years, deep learning had consumed machine learning almost entirely. Not because the methods it displaced had stopped working, but because the money, the talent, and the prestige had moved elsewhere."

That one jumped right out at me because there's a sleight of hand there. A more accurate quote would be "Not because the methods it displaced had stopped working as well as they ever had, ..." Without that phrase, the implication that other techniques were doing just as well as our transformer-based LLMs is slipped in there, but it's manifestly false once brought up to conscious examination. Of course they haven't, unless they're in the form of some probably-beyond-top-secret AI in a government lab somewhere. Decades have been poured into them and they have not produced high-quality AIs.

Anyone who wants to produce that next-gen leap had probably better have some clear eyes about what the competition is.
