Hacker News

applesauce004 · today at 3:07 AM · 4 replies

Can someone explain to me why this needs to connect to LLM providers like OpenAI or Anthropic? I thought it was meant to be a local GPT. Sorry if I misunderstood what this project is trying to do.

Does this mean the inference is remote and only context is local?


Replies

atmanactive · today at 3:17 AM

It doesn't. It has to connect to SOME LLM provider, but that can also be a local Ollama server (a running instance). The choice always needs to be present: depending on your use case, Ollama (an LLM on your local machine) could be just right, or it could be completely unusable, in which case you can switch to data-center-scale LLMs.

The README gives only an Anthropic example, but, judging by the source code [1], you can use other providers, including Ollama, just by changing that one line in the config file (see the sketch below).

[1] https://github.com/localgpt-app/localgpt/blob/main/src%2Fage...
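For illustration only (this is not localgpt's actual code, and the model name is an assumption): Ollama serves an OpenAI-compatible API under /v1, so the same client can point at either a hosted provider or a local server just by changing the base URL:

    # Minimal sketch: the same OpenAI-style client can target either a
    # hosted provider or a local Ollama server, which exposes an
    # OpenAI-compatible API under /v1.
    from openai import OpenAI

    # Local: Ollama ignores the API key, but the client library requires one.
    client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

    response = client.chat.completions.create(
        model="llama3.2",  # any model already pulled into the local Ollama
        messages=[{"role": "user", "content": "Hello"}],
    )
    print(response.choices[0].message.content)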

schobi · today at 6:49 AM

I applaud the effort of tinkering, re-creating, and sharing, but I think the name is misleading: it is not at all a "local GPT". The contribution doesn't do anything local, and it is not a GPT model.

It is more like a rusty OpenClaw clone.

vgb2k18 · today at 3:16 AM

If a local provider isn't configured, it falls back to the online providers:

https://github.com/localgpt-app/localgpt/blob/main/src%2Fage...
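As a hypothetical sketch of that fallback pattern (the environment variable names here are made up, not taken from the project's source):

    import os
    from openai import OpenAI

    def make_client() -> OpenAI:
        # Prefer a locally configured endpoint, e.g. an Ollama server.
        local_url = os.environ.get("LOCAL_LLM_URL")  # e.g. "http://localhost:11434/v1"
        if local_url:
            return OpenAI(base_url=local_url, api_key="ollama")
        # Otherwise fall back to a hosted provider.
        return OpenAI(api_key=os.environ["OPENAI_API_KEY"])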

halJordan · today at 3:13 AM

It doesn't need to.