Hacker News

vunderba · yesterday at 10:22 PM · 1 reply

FWIW, Ollama already does most of this:

- Cross-platform

- Sets up a local API server

The tradeoff is a somewhat higher learning curve, since you need to manually browse the model library and choose the model/quantization that best fits your workflow and hardware. OTOH, it's also open source, unlike LM Studio, which is proprietary.
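
For example, once Ollama is running you can talk to the local server from any HTTP client. A minimal sketch, assuming the default port (11434) and that a model such as llama3 has already been pulled with `ollama pull llama3`:

```python
import json
import urllib.request

# Assumption: Ollama is running locally on its default port 11434
# and the "llama3" model has already been pulled.
OLLAMA_URL = "http://localhost:11434"

def generate(prompt: str, model: str = "llama3") -> str:
    """Send a single non-streaming generation request to the local Ollama API."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one JSON object instead of a streamed response
    }).encode("utf-8")
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(generate("Why is the sky blue?"))
```

The same server also exposes an OpenAI-compatible chat endpoint, so most existing client libraries can be pointed at it with just a base-URL change.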


Replies

randallsquared · yesterday at 10:50 PM

I assumed from the name that it only ran llama-derived models, rather than whatever is available on Hugging Face. Is that not the case?
