Hacker News

poszlem · today at 12:06 AM · 4 replies

Between this and whatever Claude has been doing lately, like giving the AI the ability to just disconnect if it dislikes your prompt, I really hope more people realize that local LLMs are where it's at.


Replies

nacs · today at 12:28 AM

> I really hope more people realize that local LLMs are where it's at

No worries, the AI companies thought ahead: by sending GPU, RAM, and now even hard drive prices through the roof, you won't have a computer to run a local model on.

usef- · today at 1:18 AM

Have you hit that? I thought it only happened in extreme cases where Claude felt uncomfortable, like severe psychological coercion. They wanted Claude not to be forced to reply endlessly.

bakugo · today at 12:37 AM

> I really hope more people realize that local LLMs are where it's at.

Maybe if you have the tens of thousands of dollars' worth of hardware required to run models like DeepSeek, GLM, or Kimi locally. Most people don't, though.
