Hacker News

teaearlgraycold, last Sunday at 10:21 PM (7 replies)

The nice thing is that, unlike Cloudflare or AWS, you can actually host good LLMs locally. I see a future where a non-trivial percentage of devs have an expensive workstation that runs all of their AI locally.


Replies

breatheoften, last Sunday at 10:37 PM

I'm more and more convinced of the importance of this.

There is a very interesting dynamic right now: the LLM over-promisers are incentivized to over-promise for all the normal reasons, but ALSO to create the perception that the next breakthrough will only be applicable when run on huge cloud infra, so that running locally will never be all that useful. I tend to think that will prove wildly wrong. We will very soon arrive at a world where state-of-the-art LLM workloads run massively more efficiently than they do today, to the point of not even being the bottleneck of the workflows that use them, and where those workloads are viable on common current-year consumer hardware.

"LLMs are about to be general intelligence, and a sufficient LLM can never run locally" is a highly temporary state of affairs that should soon be falsifiable, imo. I don't think the LLM part of the "AI computation" will be the perf bottleneck for long.

PunchyHamster, yesterday at 1:26 AM

I'd imagine at some point the companies will just... stop publishing open models, precisely to prevent that and keep people paying the subscription.

lxgr, last Sunday at 10:44 PM

I’m fairly sure you can also still run computers locally and connect them to the Internet.

colordrops, last Sunday at 10:42 PM

What's the best you can do hosting an LLM locally for under $X? Let's say $5,000. Is there a reference guide online for this? Is there a straight answer, or does it depend? I've looked at the Nvidia Spark and high-end professional GPUs, but they all seem to have serious drawbacks.
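There's no single straight answer, but a common first-pass check when sizing hardware for this is raw weight memory: parameters times bits-per-weight divided by 8, plus some headroom for the KV cache and activations. A minimal sketch (the function name and the 15% overhead figure are illustrative assumptions, not from this thread):

```python
# Back-of-envelope memory estimate for running a quantized LLM locally.
# Weights take roughly (params * bits_per_weight / 8) bytes; the 15%
# overhead for KV cache and activations is a rough assumption.

def vram_gb(params_billions: float, bits_per_weight: float,
            overhead: float = 0.15) -> float:
    """Approximate footprint in GiB for weights plus runtime overhead."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 2**30

# A 70B model at 4-bit quantization lands around 37 GiB: out of reach
# for a single consumer GPU, feasible on a unified-memory workstation.
print(f"{vram_gb(70, 4):.1f} GiB")
```

By this estimate a 7B model at 8-bit fits in roughly 8 GiB, which is why the practical answer really does depend on which model sizes you care about.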

exe34, last Sunday at 10:41 PM

I think it's possible, but the current trend is that by the time you can run a given level of model at home, the frontier models are 10-100x beyond it. So if you can run today's Claude.ai at home, then software engineering as a career is already over.

cft, last Sunday at 10:59 PM

That's the only future of open source that I can see.
