Hacker News

whatever1 · today at 4:10 AM

The evidence shows that there is no methodological moat for LLMs. The moat of the frontier labs is just compute. xAI went from nothing to competing with the top dogs in months. DeepSeek too. So why bother splurging billions on talent when you can buy GPUs and energy instead and serve everyone's compute needs?

Also, Amazon is in another capital-intensive business: retail. Spending billions on dubious AWS moonshots, versus just buying more widgets and positioning them closer to US customers' houses for even faster deliveries, does not make sense.


Replies

cedws · today at 7:40 AM

A lot of C-suite people seem to have the idea that if they just throw enough compute at LLMs, AGI will eventually emerge, even though it's pretty clear at this point that LLMs are never going to lead to general intelligence. In their view it makes sense to invest massive amounts of capital because it's like a lottery ticket to being the future AGI company that dominates the world.

I recall Zuckerberg saying something about early signs of AI "improving itself." I don't know what he was talking about, but if he really believes that's true and that we're at the bottom of an exponential curve, then Meta's rabid hiring and datacenter buildout makes sense.

abtinf · today at 8:04 AM

The idea that models are copyrightable is also extremely dubious.

So there probably isn’t even a legal moat.

Lyapunov_Lover · today at 7:38 AM

> The evidence shows that there is no methodological moat for LLMs.

Does it? Then how come Meta hasn't been able to release a SOTA model? It's not for a lack of trying. Or compute. And it's not like DeepSeek had access to vastly more compute than other Chinese AI companies. Alibaba and Baidu have been working on AI for a long time and have way more money and compute, but they haven't been able to do what DeepSeek did.

bhl · today at 6:44 AM

The moat is people, data, and compute in that order.

It’s not just compute; that has mostly plateaued. What matters now is the quality of the data, what kinds of experiments to run, and which environments to build.

karterk · today at 5:06 AM

> The moat of the frontier folks is just compute.

This is not really true. Google has all the compute, but in many dimensions its models lag behind the GPT-5 class (it's catching up, but that was not a given).

Amazon itself tried to train a model (so did Meta), with limited success.

jojobas · today at 5:15 AM

Amazon retail runs on ridiculously low margins compared to AWS. Revenue-wise, retail dwarfs AWS; profit-wise, it's the other way around.

VirusNewbie · today at 6:36 AM

Are you arguing that Anthropic has more compute than Amazon?

Are you saying the only reason Meta is behind everyone else is compute????
