What rapid acceleration?
I look at the trajectory of LLMs, and the shape I see is one of diminishing returns.
The improvements in the first few generations came fast, and they were impressive. Then subsequent generations took longer, improved less over the previous generation, and required more and more (and more and more) resources to achieve.
I'm not interested in one guy's take that LLMs are AGI, regardless of his computer science bona fides. I can look at what they do myself and see that they aren't, by most reasonable definitions of AGI.
If you really believe the singularity is happening now...well, then, shouldn't the effects become painfully obvious in very short order? Like, massive improvements in all kinds of technology arriving within months? Come back in a few months and show me the amazing new technologies this supposed AGI has created...or consider that maybe the one in denial isn't me.
> I look at the trajectory of LLMs, and the shape I see is one of diminishing returns
It seems even more true if you compare OpenAI's funding up through the initial public release in 2022 with how exponentially spending has increased since then to deliver improvements. We're now talking upwards of $600B/yr of spending on LLM-based AI infrastructure across the industry in 2026.