
Maxion · last Thursday at 2:09 PM · 5 replies

If LLMs turn out to be such a force multiplier, the way to fight it is to ensure that there are open source LLMs.


Replies

captainbland · last Thursday at 2:36 PM

I think the issue is that LLMs are a cash problem as much as they are a technical problem. Consumer hardware is still pretty unfriendly to running genuinely competitive models: if you want to do inference on a model that will reliably give you decent results, you're basically in enterprise territory. Unless you want to do it really slowly.
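A rough way to see why consumer-hardware inference is slow (illustrative back-of-envelope only, not benchmarks): autoregressive decoding is memory-bandwidth bound, since each generated token requires streaming roughly the full set of weights through the processor once. All numbers below are hypothetical ballpark figures.

```python
def decode_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Crude upper bound on decode speed: each token reads ~all weights once,
    so throughput is roughly memory bandwidth / model footprint."""
    return bandwidth_gb_s / model_size_gb

# Hypothetical example: a 70B-parameter model quantized to ~40 GB.
model_gb = 40

# Dual-channel desktop DDR5, ~50 GB/s: about 1.25 tokens/s.
print(decode_tokens_per_sec(50, model_gb))

# Datacenter-class HBM, ~3350 GB/s: about 84 tokens/s.
print(decode_tokens_per_sec(3350, model_gb))
```

The ratio between those two numbers is the "really slowly" part: the same model that feels interactive on enterprise GPUs produces roughly one token a second on a typical desktop.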

The issue that I see is that Nvidia etc. are incentivised to perpetuate that so the open source community gets the table scraps of distills, fine-tunes etc.

nunez · last Thursday at 6:49 PM

Open-source models will never be _truly_ competitive as long as obtaining quality datasets and training on them remains prohibitively expensive.

Plus, most users don't want to host their own models. Most users don't care that OpenAI, Anthropic and Google have a monopoly on LLMs. ChatGPT is a household name, and most of the big businesses are forcing Copilot and/or Claude onto their employees for "real work."

This is "everyone will have an email server/web server/Diaspora node/lemmy instance/Mastodon server" all over again.

fnordpiglet · last Thursday at 2:29 PM

The problem is that even if an OSS project had the resources (massive data centers the size of NYC packed with top-end custom GPU kit) to produce the weights, you'd still need enormous VRAM-laden farms of GPUs to do inference on a model like Opus 4.6. Unless the very math of frontier LLMs changes, don't expect a frontier OSS model on par to be practical.
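To make the VRAM point concrete, here is a hedged sketch of the weights-only memory footprint at frontier scale (the parameter count and precisions are hypothetical; KV cache and activations would add more on top):

```python
def weights_vram_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate GPU memory (GB) needed just to hold the model weights.
    Ignores KV cache, activations, and framework overhead."""
    return params_billions * 1e9 * bytes_per_param / 1e9

# A hypothetical 405B-parameter frontier-scale model:
print(weights_vram_gb(405, 2.0))  # fp16/bf16: ~810 GB
print(weights_vram_gb(405, 0.5))  # 4-bit quantized: ~202.5 GB
```

Even aggressively quantized, that's an order of magnitude beyond a single consumer GPU (typically 8-24 GB of VRAM), which is why serving such a model means a multi-GPU farm rather than a home machine.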

runarberg · last Thursday at 2:24 PM

That would be accepting the framing of your class enemy; there is no reason to do that.

metalliqaz · last Thursday at 2:26 PM

Unless they are also pirate LLMs, I don't see how any open source project could have pockets deep enough for the datacenters needed to seriously contend.