This is meant to be some kind of Chinese room argument? Surely a 1e18 context window model running at 1e6 tokens per second could be AGI.
"Surely a 1e18 context window model running at 1e6 tokens per second could be AGI."
And why?
This argument works better for state space models. A transformer would still step through context one token at a time, not maintain an internal 1e18 state.
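For what it's worth, the distinction can be sketched as a toy (hypothetical numbers, not any real architecture): a transformer's per-token work scales with how much context it has accumulated, while a state space model folds each token into a fixed-size state.

```python
import numpy as np

d = 4  # hidden size (arbitrary, for illustration only)

def transformer_step(query, context):
    """Attend over the entire stored context: work grows with len(context)."""
    keys = np.stack(context)               # (t, d) — grows every step
    scores = keys @ query                  # one score per past token
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ keys                  # weighted mix of the whole history

def ssm_step(state, x, A=0.9, B=0.1):
    """Fold the new token into a fixed-size state: constant work per token."""
    return A * state + B * x

rng = np.random.default_rng(0)
context, state = [], np.zeros(d)
for _ in range(8):
    x = rng.normal(size=d)
    context.append(x)
    out = transformer_step(x, context)  # touches every past token so far
    state = ssm_step(state, x)          # always touches exactly d numbers

print(len(context))   # transformer's working set: 8 and growing
print(state.shape)    # SSM's state: still (4,), no matter the history
```

So at a 1e18 context the transformer variant is attending over 1e18 tokens per step, whereas the SSM just keeps updating the same compressed state.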
Personally I'm hoping for advancements that will eventually allow us to build vehicles capable of reaching the moon, but do keep me posted on those tree-growing endeavors.