Hacker News

halJordan · today at 3:12 AM

You absolutely do not have to use a third-party LLM. You can point it at any OpenAI/Anthropic-compatible endpoint; it can even be on localhost.
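A minimal sketch of what "any OpenAI-compatible endpoint on localhost" looks like in practice, using only the Python standard library. The port, path, and model name are illustrative assumptions (llama.cpp, Ollama, and vLLM all expose this shape of API, though their defaults differ); the request is only constructed here, not sent.

```python
import json
import urllib.request

# Assumed local server address -- swap in whatever your local
# OpenAI-compatible server (llama.cpp, Ollama, vLLM, ...) listens on.
BASE_URL = "http://localhost:8080/v1"

def build_chat_request(prompt: str, model: str = "local-model") -> urllib.request.Request:
    """Build (but do not send) a chat-completions request for a local endpoint."""
    payload = {
        "model": model,  # hypothetical model name; local servers often ignore or map it
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Many local servers accept any token; the header just mirrors the OpenAI API shape.
            "Authorization": "Bearer not-needed-locally",
        },
        method="POST",
    )

req = build_chat_request("Hello from localhost")
print(req.full_url)  # http://localhost:8080/v1/chat/completions
```

The same shape works with any OpenAI SDK that accepts a custom base URL, which is why tools that speak this protocol don't strictly need a third-party provider.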


Replies

dvt · today at 3:21 AM

Ah true, missed that! Still a bit cumbersome & lazy, imo; I'm a fan of just shipping with that capability out of the box (Hugging Face's Candle is fantastic for downloading/syncing/running models locally).
