Hacker News

miroljub | today at 3:06 PM

> Do you have a reason to believe the gap between AI (not LLMs specifically, but AI generally) and human intelligence will peak near the difference between human competitors (what... 20-30 IQ points)?

So we established that competing human civilizations differ by 20-30 IQ points? Sounds reasonable.

> If so, please share why you believe this.

Basically two reasons:

1. There's no AI. There are LLMs, which basically do pattern matching on an increasingly LLM-generated data set. That inevitably leads to a local maximum, where every advance becomes more difficult for a diminishing gain in "intelligence".

2. The energy required to reach an ever-increasing level of "intelligence" (or let's just call it pattern-matching performance) quickly becomes so huge that it's simply not sustainable.

I think the current LLM approach is a dead end, bound to plateau not much higher than the current level.

I'm not saying it's impossible to reach AI, but it would require a paradigm shift that I'm not even able to imagine with the current level of available technology.


Replies

estearum | today at 4:01 PM

> there's no AI. There are LLMs

Obviously AI is physically possible, unless you think there's something universally special about the earthbound naked ape's brain-goo that imbues it with special intelligence-stuff.

> the energy required to reach an ever increasing level of "intelligence" (or let us just call it pattern matching performance) quickly becomes so huge that it's simply not sustainable.

Every single human being carries an existence (dis)proof inside their skull.

> I think the current LLM approach is a dead-end bound to plateau not much higher than the current level. I'm not saying it's impossible to reach AI, but it would require a paradigm shift that I'm not even able to imagine at this level of available technology.

That's explicitly not relevant to the question I posed.