Can the base URL be overridden so I can point it at, e.g., Ollama or any other OpenAI-compatible endpoint? I'd love to use this with local LLMs, for the speed and privacy boost.
https://github.com/chr15m/runprompt/blob/main/runprompt#L9
Seems like it would be: just swap the OpenAI URL here, or add a new one.
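A minimal sketch of how an override might look, assuming an `OPENAI_API_BASE`-style environment variable (the variable name and helper functions here are hypothetical, not part of runprompt). Ollama exposes an OpenAI-compatible API at `http://localhost:11434/v1` by default:

```python
import os

# Default endpoint; Ollama's OpenAI-compatible API is served at
# http://localhost:11434/v1 out of the box.
DEFAULT_API_BASE = "https://api.openai.com/v1"

def resolve_api_base():
    # OPENAI_API_BASE is a hypothetical override variable for this sketch.
    # Trailing slashes are stripped so path joining stays predictable.
    return os.environ.get("OPENAI_API_BASE", DEFAULT_API_BASE).rstrip("/")

def chat_url():
    # Build the chat-completions URL from whichever base is in effect.
    return resolve_api_base() + "/chat/completions"
```

With that in place, `OPENAI_API_BASE=http://localhost:11434/v1 runprompt myprompt.txt` would route requests to a local model with no other changes.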
Good idea. Will figure out a way to do this.