You're sort of sidestepping the issue? If the generated content were good and interesting enough on its own, we would already have AI publishing houses pushing out entire trilogies, and each of those would be a top seller.
Generative content right now is OK. OK isn't really the goal, or what anyone wants.
I feel like this is missing the point of GenAI. I read fewer books than I did a year ago, primarily because Claude will provide dynamic content that is exactly tailored for me. I don't read many instructional books any more, because I can tell Claude what I already know about a topic and what I'd like to know and it'll create a personalised learning plan. If I don't understand something, it can re-phrase things or find different metaphors until I do. I don't read self-help books written for a general audience, because I can get personalised advice based on my specific circumstances and preferences.
The idea of a "book" is really just an artifact of a particular means of production and distribution. LLM-generated text is a categorically different thing from a book, in the same way that a bardic poem or hypertext is.
First it was AI articles; jumping from that to entire successful book trilogies seems like a much bigger leap. Even the largest context windows couldn't fit a trilogy directly, and there is far less fiction available to train on at that length than there is for essays and articles.
I don't think it is there yet for articles either.
My point with the Claude-generated comment was that it could maybe get pretty close to something like an HN comment.