This is why you should have local models. Local models are good enough for private chats; they may not match cloud models for precise technical work, but for general sensitive conversation you should definitely stick to local.
Yes, local for anything that can run locally. For higher-end model needs there are privacy platforms like Venice (https://venice.ai/privacy) with ZDR legal contracts and multiple E2EE options for their open-weight models. The OpenAI/Anthropic/Google models are also available through them, and at least your identity is anonymized, though the contents of your prompt could still be stored by the destination company.