Hacker News

hypercube33 · yesterday at 4:46 PM

All we need is something like Qwen3-coder-next but with Kimi K2.6-level ability, so it runs on laptop/workstation hardware, and we're set... soon?


Replies

wolttam · yesterday at 4:58 PM

In 2023 GPT-4 was allegedly 1.8T parameters. In 2026 we have ~100x smaller models (10-20B) that handily outperform it, and can indeed run on a laptop.
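The scale difference matters mostly because of memory: weight storage is roughly parameter count times bytes per parameter. A rough back-of-envelope sketch (parameter counts taken from the comment above; KV cache and activation overhead are ignored for simplicity):

```python
# Approximate weight memory for running an LLM locally.
# weights ≈ parameter_count × bytes_per_parameter; KV cache and
# activations add overhead on top, which this sketch ignores.

def weights_gb(params_billion: float, bits_per_param: float) -> float:
    """Approximate weight memory in GB at a given quantization level."""
    return params_billion * 1e9 * (bits_per_param / 8) / 1e9

for params in (18, 1800):   # ~18B model vs. the rumored ~1.8T GPT-4
    for bits in (16, 4):    # fp16 vs. 4-bit quantization
        print(f"{params}B params @ {bits}-bit: ~{weights_gb(params, bits):.0f} GB")
```

At 4-bit quantization an ~18B model needs on the order of 9 GB of weights, well within a laptop's unified memory, while a 1.8T model would need hundreds of gigabytes, which is why the 100x shrink is what made local inference practical.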

show 2 replies
unshavedyak · yesterday at 5:42 PM

I am eagerly awaiting being able to run a strong local model. I'd hand Apple $5k right now for a Claude in a box. I know the cost might not be there yet; just saying that's around my ideal price point.

$10k might even be worth it, but I'm assuming that the more expensive it is, the beefier it is too, which also means more electricity... and I already run ~6 computers/servers in my house. If a power surge happens I'm going to go live in the woods lol.

show 3 replies