Hacker News

eru · today at 7:49 AM

> [...] if I had to guess it's break even as the compute is most likely going idle otherwise.

Why would it go idle? It would go to their next best use. At least they could help with model training or let their researchers run experiments etc.


Replies

himata4113 · today at 7:51 AM

Inference compute is vastly different from training compute, and the model also has to stay hot in VRAM, which probably takes up most of it. There is limited use for THAT much compute as well; they are running things like the Claude Code compiler, and even then they're scratching the surface of the amount of compute they have.
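To illustrate the "stay hot in VRAM" point, here is a back-of-the-envelope sketch. The parameter count, precision, and per-GPU memory below are hypothetical assumptions for illustration, not any vendor's actual figures:

```python
# Rough VRAM needed just to keep model weights resident for serving.
# All numbers are illustrative assumptions, not real deployment figures.

def weights_vram_gb(params_billions: float, bytes_per_param: float) -> float:
    """GB of memory to hold the weights alone (no KV cache, no activations)."""
    return params_billions * 1e9 * bytes_per_param / 1e9

# A hypothetical 400B-parameter model served at 1 byte/param (e.g. fp8):
weights = weights_vram_gb(400, 1.0)   # 400 GB of weights

# A hypothetical 80 GB accelerator can't hold that, so the weights get
# sharded across several GPUs that then sit pinned to this one model.
gpus_needed = -(-weights // 80)       # ceiling division

print(weights, gpus_needed)
```

The point of the sketch: once a fleet of GPUs is loaded with serving weights (plus KV cache on top), repurposing it for training experiments means evicting all of that state, which is why idle inference capacity isn't trivially fungible.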

Training currently requires Nvidia's latest and greatest for the best models (they also use Google TPUs now, which are also technically the latest and greatest? However, those are more dual-purpose than anything AFAIK, so that would be a correct assessment in that case).

Inference can run on a hot potato if you really put your mind to it.
