Hacker News

mogoman · yesterday at 8:40 PM

Can you recommend a setup with Ollama and a CLI tool? Do you know if I need a licence for Claude if I only use my own local LLM?


Replies

alexhans · yesterday at 8:57 PM

What are your needs/constraints (hardware constraints definitely a big one)?

The one I mentioned, continue.dev [1], is easy to try out to see if it meets your needs.

Pointing it at local models should be very easy (it just calls an HTTP API on a local port).
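For context, Ollama's server listens on port 11434 by default, so any tool that speaks its HTTP API can drive a local model. A quick sanity check from the shell might look like this (a sketch: it assumes `ollama serve` is running and that you've already pulled the model named here, which is only an example):

```shell
# Ask a locally running Ollama server for a one-shot completion.
# Assumes the server is up on its default port (11434) and that
# "llama3.2" (an example tag) has been pulled with `ollama pull`.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

If that returns a JSON response, any editor tool configured against the same port should work too.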

[1] - https://github.com/continuedev/continue

drifkin · yesterday at 9:42 PM

we recently added a `launch` command to Ollama, so you can set up tools like Claude Code easily: https://ollama.com/blog/launch

tl;dr: `ollama launch claude`

glm-4.7-flash is a nice local model for this sort of thing, if you have a machine that can run it.
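Putting the two steps together, a minimal session might look like this (a hedged sketch: the `launch` command is as described in the linked blog post, and the model tag is the one suggested above; swap in whatever your hardware can handle):

```shell
# Pull a local model first so it's ready to serve.
ollama pull glm-4.7-flash

# Per the linked blog post, this wires up Claude Code
# to use the local Ollama server instead of a hosted API.
ollama launch claude
```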
