Hacker News

TheAceOfHearts · 11/04/2025

> At the core of most definitions you’ll find the idea of a machine that can match humans on a wide range of cognitive tasks.

I expect this definition will eventually be proven incorrect. It would be better described as "human-level AGI" rather than AGI. AGI is a system that matches a core set of properties, but it isn't necessarily tied to capabilities. Theoretically, one could create a very small, resource-limited AGI. The amount of computational resources available to the AGI will probably be one of the factors that determine whether it's, e.g., cat level vs. human level.


Replies

pants2 · 11/05/2025

While we're posting our favorite definitions of AGI, I like what Francois Chollet once said:

"AGI is reached when it’s no longer easy to come up with problems that regular people can solve … and AIs can’t."

dr_dshiv · 11/04/2025

That’s like Peter Norvig’s definition of AGI [1], which is defined with respect to general-purpose digital computers. The "general" intelligence refers to a foundation model that can be repurposed for many different contexts. I like that definition because it is clear.

Currently, AGI is defined in a way where it is truly indistinguishable from superintelligence. I don’t find that helpful.

[1] https://www.noemamag.com/artificial-general-intelligence-is-...

Dylan16807 · 11/05/2025

That definition gives me a headache. If it's not up to the level of a human, then it's not "general". And if you cut down the resources so much that it drops to cat level, then it's just a cut-down model derived from an AGI model, nothing more.

turtletontine · 11/04/2025

What does this even mean? How can we say a definition of “AGI” is “correct” or “incorrect” when the only thing people can agree on is that we don’t have AGI yet?
