Hacker News

eru · yesterday at 7:30 PM

> Foundation-wise, observed improvements are incremental, not exponential.

Incremental gains are fine. I suspect model capability scales roughly as the logarithm of training effort (compute), so each constant step up in capability costs a multiplicative increase in resources.
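A toy sketch of what that scaling law would imply (the constant k and the compute figures are made up for illustration, not measurements): if capability goes as k·log(compute), every doubling of training compute buys the same small additive gain, no matter the scale.

```python
import math

def capability(compute, k=1.0):
    # Hypothetical model: capability = k * log10(training compute).
    # k is an arbitrary illustrative constant, not an empirical fit.
    return k * math.log10(compute)

# Doubling compute yields the same additive gain at any scale.
gain_small = capability(2e21) - capability(1e21)
gain_large = capability(2e24) - capability(1e24)
print(gain_small, gain_large)  # both equal log10(2), about 0.301
```

So "incremental" is exactly what a log curve predicts: progress continues, but at a cost that grows geometrically.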

> (read: drinking water and energy)

Water is not much of a concern in most of the world, and you can cool data centers without consuming water if you need to. (It doesn't have to be drinking water in any case.)

Yes, energy is a limiting factor. But the big sink is training, and we are still getting more energy efficient, at least for reaching any given capability level; in total, of course, we will keep spending more and more energy to reach ever higher levels.