I agree, but I think there's a second relevant question as well: does generative AI produce original creative works of its own? Obviously the goal here should be maximizing societal benefit, specifically encouraging the creation of works that we (as a group) find directly useful or otherwise desirable. At least to my mind, human exceptionalism is an explicit non-goal.
I'm already finding the ability of LLMs to synthesize coherent descriptions across disparate sources of raw data immensely useful. If that puts (for example) scientific textbook authors out of a job, I'm not at all sure that would prove to be a detriment to society on the whole. I'm fairly certain that LLMs are already doing a better job of meeting the needs of the reader than most of the predatory electronic textbook models I was exposed to in university.
> If the answer is "no", which it clearly is,
Why are you so certain of this? It clearly breaks many (most?) of the existing revenue models to at least some extent. But we don't care about the existing revenue models per se; what we care about is long-term, sustainable creation across society as a whole. So the real question is twofold: are consumer needs being met, and is that happening in a sustainable manner? Generative AI is clearly (and ever increasingly) capable of the former; it's the latter that requires examination.