willmadden | yesterday at 7:38 PM

No one is looking at this issue correctly. Staying out of the AI "talent war" is a smart move. AI is due to collapse under its own weight.

1) High-quality training data is effectively exhausted. Compute-optimal training tokens scale roughly linearly with parameter count, so the next 10× scale model would need 10× more tokens than exist (a back-of-envelope sketch follows this list).

2) The Chinchilla rule. Hardware gets 2× cheaper every 18 months, but compute budgets rise 4× in that span, so every flagship LLM costs roughly 2× as much to train as the last (see the arithmetic sketch below), while knock-off models appear years later for pennies. Benchmark gains shrink and regulation piles on. Net result: each new dollar spent on the next big LLM buys far less payoff, and the "wait-and-copy" option gets cheaper every day.
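
A back-of-envelope sketch of point 1 in Python. The ~20-tokens-per-parameter ratio is Chinchilla's rough compute-optimal rule of thumb, and the 1T-parameter flagship is a hypothetical starting point, not a figure from the comment above:

    # Chinchilla's rough compute-optimal ratio: ~20 training tokens per parameter,
    # so token requirements scale roughly linearly with model size.
    TOKENS_PER_PARAM = 20

    def tokens_needed(params: float) -> float:
        return params * TOKENS_PER_PARAM

    current = 1e12  # hypothetical 1T-parameter flagship, for illustration only
    print(f"current flagship: {tokens_needed(current):.1e} tokens")
    print(f"10x scale model:  {tokens_needed(10 * current):.1e} tokens")
    # A 10x larger model needs ~10x the training tokens -- the crunch point 1 describes.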
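
And a sketch of the cost arithmetic in point 2, with everything normalized to today's flagship (illustrative numbers only):

    # Normalized units: today's flagship has compute budget 1.0 and hardware efficiency 1.0.
    budget_flops = 1.0      # compute budget, assumed to grow ~4x per 18-month generation
    flops_per_dollar = 1.0  # hardware efficiency, assumed to grow ~2x per 18 months

    for gen in range(1, 4):
        budget_flops *= 4
        flops_per_dollar *= 2
        cost = budget_flops / flops_per_dollar
        print(f"generation {gen}: relative training cost = {cost:.0f}x")
    # -> 2x, 4x, 8x: each flagship costs roughly twice as much to train as the last.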


Replies

pvtmert | yesterday at 11:11 PM

I usually do not agree with Amazon leadership (well, recently they haven't been "Right, A Lot"!).

But I agree with the following statement Matt Garman made recently:

    Amazon Web Services CEO Matt Garman said that using AI tools in place of junior employees was "one of the dumbest things I've ever heard" because these employees are "the least expensive" and "the most leaned into your AI tools."
It's because AI usually creates slop, and without review that slop builds up. We don't have an infinite context window to absorb it anyway (and even if we did, context rot is a confirmed problem).

Also, on average, Indian non-tech employees who manage thousands of spreadsheets or manually monitor your in-store cameras are much cheaper than the "tokens" and NVIDIA GPUs you could throw at the problem, at least for now and the foreseeable future.

liquidpele | yesterday at 8:03 PM

I agree, but it doesn’t seem to be intentional on their part.