Constraints can lead to innovation. Two things I think will get dramatically better now that companies have an incentive to focus on them:
* harness design
* small models (both local and not)
I think there is tremendous low hanging fruit in both areas still.
Harness design is a big one: Claude Code still has trouble editing files that contain tabs. I wonder how many tokens per day are wasted on Claude retrying failed edits.
Yep.
As a recent example from the AI space itself: China had scarce GPU resources (it's obvious why) => the DeepSeek training team had to reinvent some wheels and jump through some hoops => some of those methods have since become 'industry standard' and been adopted by Western labs, who now jump through the same hoops for the sake of added efficiency despite enjoying massive compute resources.
Absolutely. Anyone working at the inference token level knows how wasteful it all is, especially with multimodal tokens.
Could not agree more. My hunch is this will spur innovation across all aspects of local models.
China already operates like this. Low-cost specialized models are the name of the game: cheaper to train, easier to deploy.
The US has a problem of too much money leading to wasteful spending.
If we go back to the '80s/'90s, remember OS/2 vs. Windows: OS/2 had more resources, more money, and more developers behind it, and the team built a bigger system that took more resources to run.
Or Mac vs. Lisa: the Mac team had constraints; the Lisa team didn't.
Unlimited budgets are dangerous.