Hacker News

patrickk · yesterday at 4:27 PM

In parallel, local models are getting better and better, so eventually they’ll be “good enough” to run fairly cheaply at a level close to the current Sonnet/Opus models (what I run Claudeclaw with) on Groq, OpenRouter, or whatever commodity provider. Perhaps even on mid- to high-end consumer PCs, once the current RAM madness subsides.

There are loads of good discussions about local LLMs in this thread:

https://news.ycombinator.com/item?id=47190997

