> ... and has an idea of how they work shouldn't think it's going to lead to "AGI"
I'm not sure what level of understanding you're referring to, but having learned about and researched pretty much all LLM internals, it has led me to exactly the opposite conclusion. To me, it's unbelievable what we have today.