> Changing your LLM inference provider is the easiest switch in technology I can think of.
That's true right up until you're working with confidential information in a corporate context. Then it's a multi-month, cross-discipline, cross-jurisdiction project, not an edit in a config file.
L O C A L M O D E L S
All data stays on computers that you control.
Same API. Localhost.
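A minimal sketch of the "same API" point: local servers such as Ollama or llama.cpp's server expose an OpenAI-compatible endpoint, so the request shape is identical and only the base URL changes. The port and model name below are assumptions (Ollama's defaults), not a prescription.

```python
import json
import urllib.request

# Assumption: an OpenAI-compatible server on localhost (Ollama's default port).
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completion request against any base URL.

    The body and path are the same whether base_url points at a cloud
    provider or at a server on your own machine -- that's the whole switch.
    """
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request(BASE_URL, "llama3", "Summarize this contract.")
# Sending it is urllib.request.urlopen(req); nothing leaves localhost.
```

For the cloud-to-local switch, only `BASE_URL` changes; the confidential prompt never crosses the network boundary.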