Hacker News

simonw · today at 1:30 AM

I think they mean that the DeepSeek API charges are less than it would cost for the electricity to run a local model.

Local model enthusiasts often assume that running locally is more energy efficient than running in a data center, but fail to take economies of scale into account.


Replies

croes · today at 5:28 AM

Local enthusiasts don’t have to fear account banning.

jacquesm · today at 3:50 AM

Some of those local model enthusiasts can actually afford solar panels.

littlestymaar · today at 3:15 AM

I guess it mostly comes from using the model at batch size 1 locally, vs a high batch size in a data center, since GPU power consumption doesn't grow much with batch size.

Note that while a local chatbot user will mostly be running at batch size 1, that's no longer true if they are running an agentic framework that issues many requests in parallel, so the gap is going to narrow or even reverse.
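
The batching argument above can be sketched with a toy energy-per-token calculation. All numbers here are illustrative assumptions, not measurements: a roughly flat ~300 W GPU power draw (decode is largely memory-bound, so power stays similar while throughput scales with batching) and 50 tokens/s at batch size 1.

```python
# Back-of-envelope: energy per generated token at different batch sizes.
# All constants below are hypothetical, for illustration only.

GPU_POWER_W = 300.0        # assumed steady-state GPU power draw
SINGLE_STREAM_TPS = 50.0   # assumed tokens/s at batch size 1

def joules_per_token(batch_size: int, scaling: float = 0.9) -> float:
    """Energy per token, assuming throughput scales near-linearly
    with batch size (`scaling` models the imperfect speedup)."""
    throughput = SINGLE_STREAM_TPS * batch_size * scaling
    return GPU_POWER_W / throughput

print(f"batch  1: {joules_per_token(1, scaling=1.0):.2f} J/token")
print(f"batch 32: {joules_per_token(32):.2f} J/token")
```

Under these assumptions the data-center GPU spends well over an order of magnitude less energy per token, which is the economies-of-scale effect being discussed.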