Hacker News

kelipso · 08/08/2025 · 2 replies

For $10k, you too can get the power of a $2k desktop, and enjoy burning your lap every day, or something like that. If I were to do local compute and wanted to use my laptop, I would only consider a setup where I ssh into my desktop. So I guess the only differences from a SaaS LLM would be privacy and the cool factor. And rate limits, and paying more if you go over, etc.


Replies

com2kid · 08/08/2025

$2k laptops nowadays come with 16 cores. They are thermally limited, but they will get you 60-80% of the perf of their desktop counterparts.

The real limit is the Nvidia cards. The mobile parts are cut down a fair bit, often with less VRAM, until you really go up in price point.

They also come with NPUs, but the docs are bad and none of the local LLM inference engines seem to use the NPU, even though they could in theory happily run smaller models on it.

EagnaIonat · 08/09/2025

> For $10k, you too can get the power of a $2k desktop,

Even an M1 MBP with 32GB is pretty impressive for its age, and you can get one for well under $1K second hand.

I have one.

I use these models: gpt-oss, llama3.2, deepseek, granite3.3.

They all work fine and speed is not an issue. The recent Ollama app means I can do document/image processing with the LLM as well.
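For anyone curious what this workflow looks like in code: a minimal sketch of calling a local Ollama server from Python via its standard REST endpoint (`POST /api/generate` on `localhost:11434`). The model name and prompt are illustrative; it assumes `ollama serve` is running and the model has been pulled.

```python
# Sketch: prompt a locally running Ollama server over its REST API.
# Assumes `ollama serve` is up and the model (e.g. llama3.2) is pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def build_generate_payload(model: str, prompt: str) -> dict:
    """Build the JSON body Ollama's /api/generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the response text."""
    body = json.dumps(build_generate_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Hypothetical usage, e.g. after `ollama pull llama3.2`:
    print(generate("llama3.2", "Summarize local LLM inference in one sentence."))
```

Since everything stays on localhost, nothing leaves the machine, which is the privacy upside the thread is comparing against SaaS LLMs.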