Hacker News

danmaz74 · last Friday at 10:49 AM · 6 replies

> More importantly, Anthropic should have open sourced their Claude Code CLI a year ago. (They can and should just open source it now.)

"Should have" for what reason? I would be happy if they open sourced Claude Code, but the reality is that Claude Code is what makes Anthropic so relevant in the programming more, much more than the Claude models themselves. Asking them to give it away for free to their competitors seems a bit much.


Replies

mrbungie · last Friday at 11:11 AM

Well, OpenCode already exists and you can connect it to multiple providers, so you could say that the agentic CLI harness as a service/billable feature is no longer a viable business model. In hindsight, I'd say it never made sense in the first place.

tom_m · yesterday at 2:46 AM

Nah, I think Opus is fantastic, but not Claude Code. Their models are way better than their tooling.

udntwanthesmok · yesterday at 10:46 AM

I have the rehydrated version, should I publish it?

reactordev · last Friday at 12:22 PM

Claude Code is nothing more than a loop around Opus.
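
Roughly: call the model with tool definitions, run whatever tool calls come back, append the results, and repeat until it stops asking for tools. Here's a minimal sketch of that loop using the official anthropic Python SDK; the run_shell tool and the model id are illustrative assumptions, not Claude Code's actual internals:

    # Minimal agent loop sketch (NOT Claude Code's real internals).
    # Assumes the official `anthropic` Python SDK and ANTHROPIC_API_KEY
    # set in the environment; run_shell is a toy tool for illustration.
    import subprocess
    import anthropic

    client = anthropic.Anthropic()

    TOOLS = [{
        "name": "run_shell",
        "description": "Run a shell command and return its output.",
        "input_schema": {
            "type": "object",
            "properties": {"command": {"type": "string"}},
            "required": ["command"],
        },
    }]

    def agent_loop(prompt: str) -> str:
        messages = [{"role": "user", "content": prompt}]
        while True:
            response = client.messages.create(
                model="claude-opus-4-1",  # assumed model id
                max_tokens=4096,
                tools=TOOLS,
                messages=messages,
            )
            messages.append({"role": "assistant", "content": response.content})
            if response.stop_reason != "tool_use":
                # Model is done calling tools; return its final text.
                return "".join(b.text for b in response.content
                               if b.type == "text")
            results = []
            for block in response.content:
                if block.type == "tool_use":
                    out = subprocess.run(
                        block.input["command"], shell=True,
                        capture_output=True, text=True,
                    )
                    results.append({
                        "type": "tool_result",
                        "tool_use_id": block.id,
                        "content": out.stdout + out.stderr,
                    })
            messages.append({"role": "user", "content": results})

The real product obviously layers a lot on top (permission prompts, context management, and so on), but the inner loop really is about this simple.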

altmanaltman · last Friday at 2:12 PM

> the reality is that Claude Code is what makes Anthropic so relevant in the programming world, much more than the Claude models themselves

But Claude Code cannot run without Claude models? What do you mean?

giancarlostoro · last Friday at 2:24 PM

Yeah, I've heard of people swapping out the model that Claude Code calls, and apparently it's not THAT much of a difference. What I'd love to see from Anthropic instead is smaller LLM models. I don't even care if they're "open source" or not; just let me pull down a model that takes maybe 4 or 6 GB of VRAM onto my local box and use those for the coding agents. You can direct and guide them with Opus anyway, so why not cut down on costs for everyone (consumers and Anthropic themselves!) by letting users run some of the compute locally? I've got about 16 GB of VRAM I can juice out of my MacBook Pro, and I'm okay running a few smaller models locally with the guiding hand of Opus or Sonnet for less compute on the API front. A sketch of the swap is below.
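
For what it's worth, the swapping people describe is usually done by pointing Claude Code at an Anthropic-compatible endpoint through environment variables. A minimal sketch, assuming a local proxy on port 4000 fronting a small model; the proxy URL, model id, and token are made up for illustration, though ANTHROPIC_BASE_URL and ANTHROPIC_MODEL are, as far as I know, real Claude Code settings:

    # Hedged sketch: route Claude Code to a local Anthropic-compatible
    # proxy (e.g. LiteLLM fronting a small local model). The port,
    # model id, and token below are illustrative assumptions.
    import os
    import subprocess

    env = dict(os.environ)
    env["ANTHROPIC_BASE_URL"] = "http://localhost:4000"  # assumed local proxy
    env["ANTHROPIC_MODEL"] = "local-small-coder"         # hypothetical model id
    env["ANTHROPIC_AUTH_TOKEN"] = "placeholder"          # proxy decides auth

    # `claude -p <prompt>` runs Claude Code non-interactively.
    subprocess.run(["claude", "-p", "summarize this repo"], env=env)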
