Hacker News

ekianjo — today at 6:11 AM

I guess the parallel is `ollama serve`, which provides a direct REST API for interacting with an LLM.
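As a sketch of what that looks like, assuming Ollama is installed, `ollama serve` is running on its default port (11434), and a model such as `llama3` has already been pulled:

```shell
# Start the Ollama server (usually already running as a background service)
ollama serve &

# Hit the REST API directly; /api/generate is Ollama's text-completion endpoint.
# "stream": false returns a single JSON object instead of a token stream.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

The response is a JSON object whose `response` field contains the generated text.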


Replies

sieve — today at 8:47 AM

llama.cpp provides an API server as well via `llama-server` (and a competent web GUI, too).
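For comparison, a minimal `llama-server` invocation, assuming a local GGUF model file (the path here is just a placeholder):

```shell
# Serve a local GGUF model; the web GUI is then available at http://localhost:8080/
llama-server -m ./model.gguf --port 8080

# llama-server exposes an OpenAI-compatible chat endpoint:
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {"role": "user", "content": "Why is the sky blue?"}
    ]
  }'
```

Because the endpoint is OpenAI-compatible, existing OpenAI client libraries can be pointed at it by overriding the base URL.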