> but I'm not really sure about calling it "local-first" as it's still reliant on an `ANTHROPIC_API_KEY`.
See here:
https://github.com/localgpt-app/localgpt/blob/main/src%2Fage...
What reasonably comparable model can be run locally on, say, 16GB of video memory, compared to Opus 4.6? As far as I know, Kimi (while good) needs serious GPUs: an RTX 6000 Ada at minimum, more likely an H100 or H200.