Well... if something being AGI means it's at least on par with a human or a team of humans, then having access to an additional team of humans for 6 months isn't that big of a deal. It's useful, yes, but would you consider that world-changing? Not really, right? ASI is slightly more interesting, but I doubt ASI will come from a single model; rather, it'll emerge from the coordinated deployment of millions of AGIs. Just like how, as individuals, as great as we are, we're pretty limited, but the entire collective of humanity is pretty insane. To my mind, a frontier lab might hit AGI, but it won't be a frontier lab that hits ASI; that'll be a natural byproduct of mass deployment of AGI over a certain window of time. There will be no controlling it either. No one controls all of earth. You just can't. ASI will be a distributed system.