No one is looking at this issue correctly. Sitting out the AI "talent war" is a smart move. AI is due to collapse under its own weight.
1) High-quality training data is effectively exhausted. The next 10×-scale model would need roughly 10× more tokens, which is more high-quality text than exists.
2) The Chinchilla rule: parameters and training tokens should scale together, so compute budgets grow much faster than model size. Hardware gets 2× cheaper every 18 months, but training budgets rise 4× in that span, so every flagship LLM costs roughly 2× more than the last, while knock-off models appear years later for pennies. Benchmark gains shrink and regulation piles on. Net result: each new dollar spent on the next big LLM buys far less payoff, and the "wait-and-copy" option gets cheaper every day (rough numbers in the sketch below).
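For scale, here's a back-of-envelope sketch of the Chinchilla arithmetic in Python, assuming the commonly cited ~20 tokens-per-parameter ratio and the standard C ≈ 6·N·D training-FLOPs approximation; the dollar figures are made-up unit prices, not anyone's actual budget:

    # Back-of-envelope Chinchilla arithmetic (all prices illustrative).
    TOKENS_PER_PARAM = 20      # ~20 tokens per parameter, the commonly cited Chinchilla ratio
    FLOPS_PER_PARAM_TOKEN = 6  # C ~= 6 * N * D, the standard training-FLOPs approximation

    def training_run(params: float, dollars_per_exaflop: float) -> tuple[float, float]:
        """Return (compute-optimal token count, rough training cost in dollars)."""
        tokens = TOKENS_PER_PARAM * params
        flops = FLOPS_PER_PARAM_TOKEN * params * tokens
        return tokens, flops / 1e18 * dollars_per_exaflop

    # A 70B-param model, then a 10x scale-up after hardware got 2x cheaper.
    for params, price in [(70e9, 100.0), (700e9, 50.0)]:
        tokens, cost = training_run(params, price)
        print(f"{params / 1e9:.0f}B params -> {tokens / 1e12:.1f}T tokens, ~${cost / 1e6:.0f}M of compute")

Ten times the parameters means a hundred times the compute, so even with hardware at half price the bill grows 50×, and the 700B run already wants ~14T high-quality tokens, which is where point 1 bites.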
It is interesting, but I think they're doing the right thing. AWS Bedrock works pretty well, and you can access frontier models plus everything open source on it. I was initially skeptical that Graviton would be any good, but the latest r8g series is great for compute, so I imagine they'll master GPU compute in time as well.
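For anyone who hasn't touched it: a minimal sketch of what calling a model through Bedrock looks like with boto3's Converse API (the model ID and region are illustrative, and the model has to be enabled for your account first):

    # Minimal Bedrock call via boto3's Converse API.
    # Model ID and region are illustrative; enable the model in your account first.
    import boto3

    client = boto3.client("bedrock-runtime", region_name="us-east-1")

    response = client.converse(
        modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",
        messages=[{"role": "user", "content": [{"text": "One sentence on what Bedrock does."}]}],
        inferenceConfig={"maxTokens": 256, "temperature": 0.2},
    )

    print(response["output"]["message"]["content"][0]["text"])

The nice part is that the same call shape works across the hosted open-weight models too; you just swap the modelId instead of wiring up each vendor's own SDK.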
I had totally forgotten that I signed up for the Kiro waitlist. It seems Amazon has also sat out the wave of interest in its own AI offering.
Has anyone had a chance to use Kiro at all? At this point I'm not even interested in it anymore, even if I got an invite.
The top researchers have published enough detail on how to build what works well. Amazon can copy what's useful. They'll probably do it in a way that actually turns a profit, too. Neither talent wars nor AI-startup business models contribute to that.
Um, Amazon has invested $8B in Anthropic.
I think they learned some hard lessons from Alexa.
So what's the tl;dr: are they just too cheap to pay for top AI scientist talent, which I imagine they would need in order to enter the fray?
> "GenAI hiring faces challenges like location
No! Really? With RTO? Unbelievable /s
I've been having great success with Bottles (https://usebottles.com/). The fact that you can have multiple tweakable prefixes helps a lot.
I find AWS extremely difficult to use compared to GCP. Even though we received startup credits—which are essentially free money—we’re letting them go to waste because the platform is so much harder to work with.
It’s no surprise that AWS’s revenue growth is lagging behind GCP and Azure.
Beyond the AI talent gap, Amazon seems to be making serious missteps in its own core business.
It reminds me of Apple. At first, people thought Apple was being strategic by staying out of the AI race and waiting to pick the winner. But in reality, it turned out to be an inability to adapt to the new trend. I expect the same pattern from Amazon.
Amazon just has to host Llama and Qwen themselves, just as they host so many other packages developed by others, and charge for the AWS compute. Why do they need "AI talent"?
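To illustrate how little is involved, here's a minimal sketch of self-hosting an open-weight model with vLLM (the model name is illustrative; Llama weights are gated behind a license acceptance on Hugging Face, so this assumes you're already logged in):

    # Minimal open-weight serving sketch with vLLM.
    # Model name is illustrative; Llama weights are gated on Hugging Face.
    from vllm import LLM, SamplingParams

    llm = LLM(model="meta-llama/Llama-3.1-8B-Instruct")  # downloads weights on first run
    params = SamplingParams(temperature=0.2, max_tokens=256)

    outputs = llm.generate(["Why would a cloud sell hosted open models?"], params)
    print(outputs[0].outputs[0].text)

vLLM can also expose an OpenAI-compatible HTTP server (vllm serve <model>), which is more or less the product once you wrap billing and autoscaling around it.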