Hacker News

drcongo · 07/31/2025 · 2 replies

This looks pretty neat. Just spotted in the docs that it has an MCP server too; however, I haven't found anything in the docs about using a locally hosted model. Running this on a box in the corner of the office would be great, but external AI providers would be a deal breaker.


Replies

bshzzle · 08/05/2025

Hey, just following up on this - we just shipped support for any model that supports the OpenAI Chat Completions API (1), including Ollama and Llama.cpp. You can check out the docs here: https://docs.sourcebot.dev/docs/configuration/language-model...

[1] https://platform.openai.com/docs/api-reference/chat
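For anyone wondering what "any model that supports the Chat Completions API" buys you in practice: Ollama exposes an OpenAI-compatible endpoint at `/v1/chat/completions` on its default port (11434), so any client that speaks that API can point at a local box instead of an external provider. A minimal stdlib-only sketch of the request shape (the model name `llama3` is just an example - substitute whatever you've pulled locally):

```python
import json
import urllib.request

# Ollama's OpenAI-compatible Chat Completions endpoint on its default port.
# Llama.cpp's built-in server exposes the same /v1/chat/completions path.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style Chat Completions request for a local model."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("llama3", "Summarize what this repo does.")
# To actually send it (requires Ollama running locally):
#   resp = json.load(urllib.request.urlopen(req))
#   print(resp["choices"][0]["message"]["content"])
```

No API key required for a local Ollama instance; tools that insist on one usually accept any placeholder string.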

bshzzle · 07/31/2025

Running Sourcebot with a self-hosted LLM is something we plan to support and document as part of the golden path very soon, so stay tuned.

We are using the Vercel AI SDK, which supports Ollama via a community provider, but that provider doesn't support v5 yet (which Sourcebot is on): https://v5.ai-sdk.dev/providers/community-providers/ollama