Hacker News

giancarlostoro · today at 3:48 PM

Not sure about LocalLlama, but have you tried LM Studio? If you use Zed, it will automatically pick up whatever model you enable in LM Studio. I keep meaning to write a blog post about this for people who aren't aware that you can pair the two pretty easily on a Mac. I mostly use CC, but I like to test offline models now and then to see how far they've come along.
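For anyone curious, the pairing is roughly a settings tweak on the Zed side once LM Studio's local server is running. This is a hedged sketch, not from the comment: the exact settings keys come from my reading of Zed's configuration, and the `api_url` assumes LM Studio's default local server port of 1234 — check both apps' docs if it doesn't work as-is.

```json
// Sketch of Zed's settings.json pointing at a local LM Studio server.
// Assumes LM Studio's server is started (it defaults to http://localhost:1234);
// key names are my best understanding of Zed's config and may differ by version.
{
  "language_models": {
    "lmstudio": {
      "api_url": "http://localhost:1234/api/v0"
    }
  }
}
```

With the server running, any model you've loaded in LM Studio should then show up in Zed's model picker.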


Replies

ridiculous_leke · today at 4:10 PM

I use Ollama to run local models but have never used one with CC. Curious which models work best for people.
