
roygbiv2 today at 9:07 AM

And how much does the hardware cost to run said models?


Replies

Lerc today at 1:22 PM

It can be quite expensive to get the models and machines to do this.

That's what the money pays for when the comment above mentions 'that you might have to eventually pay an AI company a large amount of money to ask ChatGPT such a question'.

Putting aside that it won't be a large amount of money for any particular query, that's how the AI companies see themselves: not as providers of information, but as providers of mechanisms that provide information. They aren't selling the information of others; they aren't selling information at all. They are selling the service of running the mechanism.

dboreham today at 9:45 AM

You can run them slowly on any machine that has enough memory.
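To put a number on "enough memory": at n-bit quantization a model's weights take roughly params × bits / 8 bytes, with KV cache and runtime overhead on top. A quick back-of-envelope sketch in Python (the model sizes below are illustrative, not tied to any specific release):

```python
def weight_gb(params_billion: float, bits: int) -> float:
    """Approximate weight footprint in GB for an n-bit quantized model.

    Ignores KV cache, activations, and runtime overhead, which add more
    on top of this baseline.
    """
    # params_billion * 1e9 params * (bits / 8) bytes each, converted to GB
    return params_billion * bits / 8

print(weight_gb(7, 4))    # 7B at 4-bit -> 3.5 GB: fits on a laptop or phone
print(weight_gb(70, 4))   # 70B at 4-bit -> 35.0 GB: high-memory workstation
print(weight_gb(405, 8))  # 405B at 8-bit -> 405.0 GB: multi-GPU server territory
```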

fragmede today at 9:51 AM

How good do you want it to be? For something close to ChatGPT today (April 2026), you're still looking at a system with 7x H200 GPUs plus chassis, which will run you on the order of $300k, or a GB200 NVL72, which is $2-3 million. OTOH, a quantized Qwen3.6 model can run on $10,000 (high-end Mac) or $1,000 (Mac mini) worth of hardware. Even a Pixel 10 Pro phone ($1,000) can run useful models locally.
