Hacker News

afpx · yesterday at 2:38 AM · 2 replies

I can't see how AGI can happen without someone making a groundbreaking discovery that allows extrapolating way outside of the training data. But, to do that wouldn't you need to understand how the latent structure emerges and evolves?


Replies

YZF · yesterday at 2:49 AM

We don't understand how the human brain works, so it's not inconceivable that we could evolve an intelligent machine whose workings we don't understand either. Arguably we don't really understand how large language models work, either.

LLMs are also not necessarily the path to AGI. We could get there with models that more closely approximate the human brain. Humans need far less "training data" than LLMs do. Human brains and their evolution are constrained by biology and physics, but computer models of those brains could accelerate evolution without the same biological constraints.

I think it's a given that we will have artificial intelligence at some point that's as smart as or smarter than the smartest humans. Who knows when exactly, but it's bound to happen within, let's say, the next few hundred years. What that means isn't clear. Just because some people are smarter than others (and some are much smarter than others) doesn't mean as much as you'd think. There are many other constraints. We don't need to be super smart to kill each other and destroy the planet.

weregiraffe · yesterday at 7:04 AM

Aka: a magic spell that grants you infinite knowledge.

Why do people believe this is even theoretically possible?