Hacker News

timewizard | yesterday at 9:05 AM

> you’ve got some really exciting research and learning ahead of you

I've done the research. Which is why I made the point I did. You're being dismissive and rude instead of putting forth any sort of argument. It's the paper hat of fake intellect. Yawn.

> At the moment it reads very “computers are just addition and multiplication and we’ve had that for thousands of years!”

Let's be specific then. The problem with these models is that they require exponential growth in the cost of building a model for only linear increases in output performance. That cost curve is currently a factor or two steeper than the curve of improving hardware performance. Absent any fundamental algorithmic improvements, which do /not/ seem forthcoming despite billions in speculative funding, that puts the technology into a strict coffin corner. In short: AI winter 2.0.
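
For concreteness, here's a rough numerical sketch of that shape, assuming loss falls as a power law in training compute (roughly the curve shape reported in the scaling-law literature). The constants are illustrative, not fitted to any real model:

    # Illustrative only: if loss follows a power law in compute, L(C) = a * C**(-alpha),
    # then each equal-sized improvement in loss requires a multiplicatively larger
    # compute budget -- exponential cost growth for roughly linear output gains.
    a, alpha = 10.0, 0.05          # hypothetical constants, not measured values

    def loss(compute):
        return a * compute ** -alpha

    def compute_needed(target_loss):
        # invert L(C) = a * C**(-alpha)  =>  C = (a / L)**(1/alpha)
        return (a / target_loss) ** (1 / alpha)

    for target in [3.0, 2.8, 2.6, 2.4, 2.2]:   # equal linear steps in loss
        print(f"loss {target:.1f} -> compute {compute_needed(target):.2e}")
    # Each successive 0.2 drop in loss costs several times more compute than the
    # one before it, which is the cost/performance mismatch described above.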

Got any plans for that? Any specific research that deals with that? Any thoughts of your own on this matter?


Replies

petesergeant | yesterday at 10:02 AM

> I've done the research

Great. What's the 1970s equivalent of word2vec or embeddings, that we've simply scaled up? Where are the papers about the transformer architecture or attention from the 1970s? Sure feels like you think LLMs are just big perceptrons.
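For what it's worth, the architectural gap being pointed at is easy to see side by side. A minimal NumPy sketch (illustrative shapes only, nobody's production code) of a perceptron-style layer next to scaled dot-product attention from the 2017 transformer paper:

    import numpy as np

    def perceptron_layer(x, W, b):
        # 1950s/60s-style building block: fixed weights, no interaction between positions
        return np.maximum(0, x @ W + b)

    def scaled_dot_product_attention(Q, K, V):
        # Transformer building block (Vaswani et al., 2017): the mixing weights are
        # computed from the data itself at run time, for every pair of positions
        scores = Q @ K.T / np.sqrt(K.shape[-1])
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)   # softmax over positions
        return weights @ V

    # Illustrative shapes: 4 tokens, 8-dimensional embeddings
    rng = np.random.default_rng(0)
    X = rng.standard_normal((4, 8))
    print(perceptron_layer(X, rng.standard_normal((8, 8)), np.zeros(8)).shape)  # (4, 8)
    print(scaled_dot_product_attention(X, X, X).shape)                          # (4, 8)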

> The problem with the models is they require exponential cost growth

Let's stick to the assertion I was disputing instead.