Hacker News

stuaxo today at 3:22 PM

If I can use this with a local LLM it could be useful.


Replies

kay_o today at 6:05 PM

Ollama support is included by default; you just add the endpoint URL yourself.
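A minimal sketch of what "add the endpoint URL yourself" can look like, assuming a default local Ollama install serving its OpenAI-compatible API at `http://localhost:11434/v1` (the model name `llama3.2` is just an example):

```python
import json
from urllib import request

# Assumption: default Ollama install, OpenAI-compatible endpoint, no auth.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_request(prompt: str, model: str = "llama3.2") -> request.Request:
    """Build a POST request for Ollama's OpenAI-compatible chat endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("List the tables in this schema.")
# To actually send it (requires a running Ollama server):
# with request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Any tool that lets you override the base URL of an OpenAI-style client can be pointed at the same endpoint.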

zbentley today at 3:32 PM

Yeah. This seems like an area where a “tiny” (2-4GB) local model would be more than sufficient to generate very high quality queries and schema answers to the vast majority of questions. To the point that it feels outright wasteful to pay a frontier model for it.