Hacker News

simonw · yesterday at 3:49 AM

It's rare to find a local model that's capable of running tools in a loop well enough to power a coding agent.

I don't think gpt-oss:20b is strong enough, to be honest, but gpt-oss:120b can do an OK job.

Nowhere NEAR as good as the big hosted models, though.
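
(A rough sketch of the "tools in a loop" pattern, assuming an Ollama server on localhost serving a tool-calling model such as gpt-oss:20b; the run_shell tool, its schema, and the endpoint are illustrative assumptions, not a fixed recipe.)

    import json
    import subprocess
    import requests

    OLLAMA_URL = "http://localhost:11434/api/chat"  # assumed local Ollama endpoint
    MODEL = "gpt-oss:20b"                           # or gpt-oss:120b

    # One illustrative (and deliberately unsafe) tool: run a shell command.
    TOOLS = [{
        "type": "function",
        "function": {
            "name": "run_shell",
            "description": "Run a shell command and return its output",
            "parameters": {
                "type": "object",
                "properties": {"command": {"type": "string"}},
                "required": ["command"],
            },
        },
    }]

    def run_shell(command: str) -> str:
        result = subprocess.run(command, shell=True, capture_output=True, text=True)
        return result.stdout + result.stderr

    def agent_loop(task: str, max_turns: int = 10) -> str:
        messages = [{"role": "user", "content": task}]
        for _ in range(max_turns):
            resp = requests.post(OLLAMA_URL, json={
                "model": MODEL,
                "messages": messages,
                "tools": TOOLS,
                "stream": False,
            }).json()
            msg = resp["message"]
            messages.append(msg)
            calls = msg.get("tool_calls") or []
            if not calls:
                return msg.get("content", "")  # no more tool calls: model is done
            for call in calls:
                name = call["function"]["name"]
                args = call["function"]["arguments"]
                if isinstance(args, str):
                    args = json.loads(args)
                output = run_shell(**args) if name == "run_shell" else f"unknown tool: {name}"
                messages.append({"role": "tool", "content": output})
        return "stopped: hit max_turns"

    if __name__ == "__main__":
        print(agent_loop("List the Python files in the current directory."))

The loop is the whole trick: keep feeding tool output back until the model stops asking for tools, and that is exactly where weaker local models tend to lose the thread.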


Replies

ontouchstart · yesterday at 5:03 AM

Think of it as the early years of UNIX and the PC. Running inference and tools locally and offline opens doors to new industries. We might not even need the client/server paradigm locally; an LLM is just a probabilistic library we can call.
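
(A minimal sketch of that "library call" idea, assuming llama-cpp-python and a locally downloaded GGUF file; the model file name is a placeholder.)

    from llama_cpp import Llama

    # Load the model in-process: no server, no client/server round trip.
    llm = Llama(model_path="./some-local-model.Q4_K_M.gguf", n_ctx=4096, verbose=False)

    def complete(prompt: str) -> str:
        # One synchronous call, shaped like any other library function.
        out = llm.create_chat_completion(
            messages=[{"role": "user", "content": prompt}],
            max_tokens=256,
        )
        return out["choices"][0]["message"]["content"]

    print(complete("In one sentence, what does a coding agent's tool loop do?"))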

AlexCoventry · yesterday at 5:17 AM

Thanks.