Hacker News

FuriouslyAdrift, last Wednesday at 6:09 PM

LLMs are lossy compression of a corpus with a really good parser as a front end. As human-made content dries up (due to LLM use), AI products will plateau.

I see inference as the much bigger technology, although much better RAG loops for local customization could be a very lucrative product for a few years.
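The RAG loop mentioned above boils down to: retrieve the most relevant local documents for a query, then prepend them as context to the prompt handed to the model. A minimal sketch, assuming a toy corpus and a bag-of-words similarity stand-in for real embeddings (everything named here is illustrative, not any particular product's API):

```python
# Sketch of a retrieve-then-prompt RAG loop for local customization.
# The corpus and the bag-of-words "embedding" are illustrative
# assumptions; a real system would use a vector embedding model
# and an actual LLM call for the final answer.
from collections import Counter
import math

CORPUS = [
    "Our VPN config lives in /etc/wireguard and rotates keys monthly.",
    "The build server runs nightly jobs at 02:00 UTC.",
    "Support tickets are triaged in the #helpdesk channel.",
]

def embed(text: str) -> Counter:
    """Toy term-frequency 'embedding'; stands in for a vector model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, k: int = 2) -> list[str]:
    """Rank local documents by similarity to the query, keep top k."""
    q = embed(query)
    ranked = sorted(CORPUS, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str) -> str:
    """Stuff retrieved context ahead of the question; the model call
    itself would go after this step."""
    context = "\n".join(retrieve(query))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

print(build_prompt("When do nightly jobs run on the build server?"))
```

The "loop" part of a better RAG loop would feed the model's answer quality back into retrieval (re-ranking, query rewriting), which is where the local-customization value sits.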