
petesergeant · yesterday at 10:02 AM

> I've done the research

Great. What's the 1970s equivalent of word2vec or embeddings that we've simply scaled up? Where are the 1970s papers on the transformer architecture or attention? It sure sounds like you think LLMs are just big perceptrons.
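
The difference isn't just scale. A rough toy sketch of the contrast (illustrative NumPy only, not anyone's actual model): a perceptron applies one fixed linear map to each input independently, while attention computes input-dependent mixing weights across all tokens at once. There's no 1970s analogue of the latter.

```python
import numpy as np

def perceptron(x, W, b):
    # 1960s-style unit: a fixed linear map plus a threshold,
    # applied to each input vector independently.
    return (x @ W + b > 0).astype(float)

def attention(X, Wq, Wk, Wv):
    # Scaled dot-product attention (Vaswani et al., 2017):
    # each token's output is a mixture of every token's value,
    # and the mixing weights themselves depend on the input.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ V
```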

> The problem with the models is they require exponential cost growth

That's a different claim; let's stick to the assertion I was disputing.