Hacker News

bgwalter · last Sunday at 4:38 PM · 1 reply

Because you have Cloudflare (MITM 1), OpenRouter (MITM 2), and finally the "AI" provider, all of whom can read, store, analyze, and resell your queries.

EDIT: Thanks for downvoting what is literally one of the most important reasons for people to use local models. Denying and censoring reality does not keep the bubble from bursting.


Replies

irthomasthomas · last Monday at 12:22 AM

You can use a chutes.ai TEE (Trusted Execution Environment), and Kimi K2 is running at about 100 t/s right now.