Hacker News

mehdibl · today at 3:28 AM

Ollama is quite a bad example here. Despite its popularity, it's a simple wrapper, and it is increasingly pushed back against by llama.cpp, the app it wraps.

I don't understand the parallel here.


Replies

kossisoroyce · today at 9:44 AM

TBVH I didn't think too much about naming it. I defaulted to Ollama because of its perceived simplicity, and I wanted that same perceived simplicity to help adoption.

eleventyseven · today at 8:47 AM

This is the vLLM of classic ML, not Ollama.

ekianjo · today at 6:11 AM

I guess the parallel is "ollama serve", which provides you with a direct REST API for interacting with an LLM.
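To illustrate the point about "ollama serve": a minimal sketch of a request against Ollama's documented REST API, which by default listens on localhost port 11434 and exposes endpoints such as /api/generate. The model name "llama3" is just an example; it assumes you have pulled that model locally. The sketch only builds and prints the request payload; the actual HTTP call (which requires a running server) is shown commented out.

```python
import json

# Ollama's local server ("ollama serve") listens on port 11434 by
# default and exposes a simple REST API, e.g. POST /api/generate.
url = "http://localhost:11434/api/generate"

# "llama3" is an example model name, not a requirement; use any
# model you have pulled with `ollama pull`.
payload = {
    "model": "llama3",
    "prompt": "Why is the sky blue?",
    "stream": False,  # ask for a single JSON response instead of a stream
}

body = json.dumps(payload)
print(body)

# To actually send the request (requires a running `ollama serve`):
# import urllib.request
# req = urllib.request.Request(
#     url,
#     data=body.encode(),
#     headers={"Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read().decode())
```

The same endpoint is what many third-party UIs and libraries talk to, which is the "direct REST API" aspect being compared here.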
