Hacker News

throw310822 · yesterday at 9:47 PM

No, I don't think so. We can already create LLMs that are highly efficient and far more knowledgeable than any single human being, completely tuned to the task, without ego or distractions, and cheap enough that you can run dozens of them in parallel for a few hundred dollars per month. They are also far faster than any human being. And we're only three or four years into this. Imagine 50 years from now.


Replies

gambiting · yesterday at 10:16 PM

>>Imagine 50 years from now.

That's the whole point though - I can't, and I don't think anyone can. Right now LLMs are just getting bigger and bigger; we're brute-forcing our way out of their stupidity by feeding them ever larger datasets - unless something fundamental changes soon, that tech hits an actual dead end. Hence my (joke-ish) prediction that you'll eventually need a 16PB GPU to run a basic Gemini model, and such a thing will always be very expensive no matter how much our tech advances (especially since we are already hitting some technical limits). Human brains won't get any more expensive with time - they already contain all the hardware they are ever going to get - but what might get cheaper is the plumbing to make them "run" and interact with other systems.
