Hacker News

ridiculous_leke · today at 3:46 PM

Can people drop a good LocalLlama setup that I can run on M4?


Replies

giancarlostoro · today at 3:48 PM

Not sure about LocalLlama, but have you tried LM Studio? If you use Zed, it will automatically pick up whatever model you enable in LM Studio. I keep meaning to write a blog post about this for people who aren't aware that you can pair the two pretty easily on a Mac. I mostly use CC, but I like to test offline models now and then to see how far they've come.
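(For anyone trying this: Zed finds LM Studio through its local server, which you have to enable in the LM Studio app first. A quick sketch of the check, assuming the default port 1234 and LM Studio's OpenAI-compatible endpoints; adjust if you've changed the port:)

```shell
# Enable the local server in LM Studio (Developer tab), then verify it's up.
# By default it serves an OpenAI-compatible API on localhost:1234.
curl http://localhost:1234/v1/models
# Should return a JSON list of the models you've loaded in LM Studio;
# Zed talks to this same endpoint.
```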
