Hacker News

b112, last Sunday at 6:09 AM

> They're definitely not subsidizing API pricing

The cost of a thing is relative to its input costs. They are subsidizing API pricing if you count everything it takes to provide the service, including model creation, training, and so on.

But that doesn't mean it will be more expensive in the long term. The cost of compute goes down as time goes on; each year it gets cheaper. The same holds for power requirements, computing density, cooling, and so on.

I remember trying to store and play mp3 files on older computers. I could typically hold a few on a disk, and if I wasn't doing anything else I could play one. Barely. Now you'd be hard-pressed to see an mp3 playing even register as load in top or the like.

The same will be true of AI in 20 years.


Replies

deadbabe, last Sunday at 4:17 PM

If the cost of compute keeps going down, then eventually it will drop enough that we will all run our LLMs locally and Anthropic will go out of business.
