Hacker News

Workaccount2 · today at 2:58 AM

Never. Local models are for hobby use and (extreme) privacy concerns.

A less paranoid and far more economical approach would be to lease a server and run the models on that.


Replies

g947o · today at 5:23 AM

This.

I've spent quite some time on r/LocalLLaMA and have yet to see a convincing success story of someone productively using local models to replace GPT, Claude, etc.
