Hacker News

max-privatevoid | yesterday at 7:45 PM | 1 reply

The online documentation does not suggest that using a generic OpenAI-compatible server is an option, and it once again lists the non-local option first.

https://atomicapp.ai/getting-started/ai-providers/

> OpenAI-compatible is indeed one of the provider options for Atomic. Ollama and OpenRouter are separate options to allow for easier selection of models from these specific providers.

Why is this necessary over just presenting the result of `/v1/models`?
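As a minimal sketch of what that would mean: any OpenAI-compatible server (local or hosted) exposes its available models at `GET /v1/models`, so a client could populate its model picker from that response alone. The base URL below assumes a local Ollama instance on its default port; substitute whatever endpoint your server actually uses.

```python
# Minimal sketch: list models from any OpenAI-compatible server via GET /v1/models.
import requests

BASE_URL = "http://localhost:11434/v1"  # assumption: local Ollama default endpoint

resp = requests.get(f"{BASE_URL}/models", timeout=5)
resp.raise_for_status()

# The OpenAI-compatible response wraps the model list in a "data" array,
# where each entry carries the model name in its "id" field.
for model in resp.json()["data"]:
    print(model["id"])
```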

You can say it's just the ordering of a dropdown, but to me it seems pretty clear that this thing is developed with the idea that you'll most likely use a SaaS provider.


Replies

kenforthewin | yesterday at 8:04 PM

It has supported local LLMs from the beginning; it was not something that was just tacked on. I don't know what else to tell you. Your assumptions are just wrong.