Can you recommend a setup with Ollama and a CLI tool? Do you know if I need a licence for Claude if I only use my own local LLM?
We recently added a `launch` command to Ollama, so you can set up tools like Claude Code easily: https://ollama.com/blog/launch

tl;dr: `ollama launch claude`
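If it helps, here's a minimal sketch of the whole flow, assuming a recent Ollama install. The model tag is just the one suggested in the next comment; check the registry for exact tags and swap in whatever your machine can run:

```sh
# pull a local model first (run `ollama list` to see what you already have)
ollama pull glm-4.7-flash

# wire a coding tool like Claude Code up to the local model
ollama launch claude
```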
glm-4.7-flash is a nice local model for this sort of thing if you have a machine that can run it
What are your needs/constraints (hardware is definitely a big one)?
The one I mentioned, continue.dev [1], is easy to try out to see if it meets your needs.
Hitting local models with it should be very easy: it just calls the model's HTTP API on a local port (Ollama's default is 11434), as in the sketch below.
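To sanity-check that the local endpoint is up before wiring anything in, you can hit Ollama's REST API directly; this assumes the default port 11434 and reuses the model tag from upthread:

```sh
# quick smoke test against Ollama's generate endpoint on its default port
curl http://localhost:11434/api/generate \
  -d '{"model": "glm-4.7-flash", "prompt": "Say hello", "stream": false}'
```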
[1] - https://github.com/continuedev/continue