Hacker News

bananaflag · yesterday at 11:23 AM

> If someone wrote a definition of AGI 20 years ago, we would probably have met that.

No. As long as people can do work that a robot cannot, we don't have AGI. That was always, if not the definition itself, at least implied by it.

I don't know why the meme that AGI is not well defined has had such success over the past few years.


Replies

bonplan23 · yesterday at 12:45 PM

"Someone" literally did that (+/- 2 years): https://link.springer.com/book/10.1007/978-3-540-68677-4

I think it was intended to be a more useful term than the earlier and more common "Strong AI". For strong AI there was a widely accepted definition - passing the Turing Test - and we are way past that point already (see https://arxiv.org/pdf/2503.23674).

Closi · yesterday at 11:30 AM

Completely disagree - your definition (in my opinion) is more aligned with the concept of Artificial Superintelligence.

Surely the 'General Intelligence' definition has to be consistent between 'Artificial General Intelligence' and 'Human General Intelligence', and humans can be generally intelligent even if they can't solve calculus equations or protein folding problems. My bar for general intelligence is much lower than most people's - I think a dog is probably generally intelligent, although obviously in a different way (dogs are clearly better at learning how to run and catch a ball, and worse at programming in Python).
