Hacker News

estearum · today at 2:49 PM

1. "Less advanced civilization" != less intelligent people

2. The outcome of near-peer competition is surely highly dependent on factors like brutality, luck, tactics, etc... the competition between the defenders of crops (i.e. the makers of pesticides) and insects is not. Not only are the insects successfully destroyed en masse, but neither side even recognizes itself as party to a competition. The insect has no conception of a crop, even when he walks in it, much less of a pesticide, even when he tastes it. The pesticide sprayer assigns zero moral valence to his daily genocide.

Do you have a reason to believe the gap between AI (not LLMs specifically, but AI generally) and human intelligence will peak near the difference between human competitors (what... 20-30 IQ points)?

If so, please share why you believe this.


Replies

miroljub · today at 3:06 PM

> Do you have a reason to believe the gap between AI (not LLMs specifically, but AI generally) and human intelligence will peak near the difference between human competitors (what... 20-30 IQ points)?

So we established that competing human civilizations differ by 20-30 IQ points? Sounds reasonable.

> If so, please share why you believe this.

Basically two reasons:

1. There's no AI. There are LLMs, which basically do pattern matching on increasingly LLM-generated data sets. That inevitably leads to a local maximum, where every advance becomes more difficult for a decreasing gain in "intelligence".

2. The energy required to reach an ever-increasing level of "intelligence" (or let's just call it pattern-matching performance) quickly becomes so huge that it's simply not sustainable.
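The diminishing-returns argument above can be illustrated with a minimal sketch. Assuming (hypothetically, with made-up constants) that model loss follows a Chinchilla-style power law in training compute, L(C) = a·C^(-b), each 10x increase in compute buys a strictly smaller absolute improvement:

```python
def loss(compute, a=10.0, b=0.05):
    """Hypothetical power-law loss as a function of training compute.

    The constants a and b are illustrative assumptions, not fitted values.
    """
    return a * compute ** (-b)

# Loss improvement from each successive 10x jump in compute shrinks,
# even as the compute (and energy) cost of each jump grows 10x.
for exp in range(1, 5):
    c_lo, c_hi = 10 ** exp, 10 ** (exp + 1)
    gain = loss(c_lo) - loss(c_hi)
    print(f"{c_lo:>6} -> {c_hi:<7} compute: loss drops by {gain:.3f}")
```

Under these assumptions the per-step gain decreases monotonically while the per-step cost grows geometrically, which is the shape of the plateau the commenter describes; whether real LLM scaling actually follows such a law that far out is exactly the point under dispute.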

I think the current LLM approach is a dead-end bound to plateau not much higher than the current level.

I'm not saying it's impossible to reach AI, but it would require a paradigm shift that I can't even imagine at the current level of available technology.
