But isn’t training models a forever task, like iterating in tech, where you can never take a day off? And bringing humans into the equation: don’t humans train/teach themselves new skills over a lifetime? Isn’t one of the future selling points of this AI slop that your AI never goes to sleep and can keep being trained forever? The price of entry for AI will only increase as time goes on.
Just keeping a model up to date with competitors is much cheaper, e.g. by copying better ones the way Qwen did with Claude. Also, a lot of research is trickling into open source / arXiv, so catching up should keep getting cheaper, at least as a fraction of the cost of training from scratch.
I agree that training is a forever task, and that the current rate of training is probably not sustainable. But all that means is that once the current investment mania ends, the market will most likely settle into a new equilibrium where continuous training still happens, just at a slower rate that inference revenue can sustain.