I bookmarked this comment a few months ago because I thought it was hilarious and increasingly accurate:
> It's approaching a very strange situation where people make overly wordy and bloated AI generated content and other people try to use AI to compress it back into useful pellets vaguely corresponding to the actual prompts used to generate the initial content. Which were the only bits anybody cared about in the first place. One guy pays the AI to dig a hole, the other guy pays the AI to fill in the hole. Back and forth they go, raising the BNP but otherwise not accomplishing anything.
https://news.ycombinator.com/item?id=41635079
More seriously, though: I wonder if/when we will reach a point at which asking for a Neuromancer-esque précis video of a topic will replace the experience of browsing and reading various sources of information. My gut feeling is that it will in many, but not all, scenarios, because the act of browsing is itself desirable and informative. For example, searching for books on Amazon is efficient, but it doesn't quite replace the experience of walking through a bookstore.
This is really interesting, because I have been using LLMs a lot for reading legalese documents. At least in the realm of legal matters we already have this dynamic of "people make overly wordy and bloated LAWYER-generated content and other people try to use LAWYERS to compress it back into useful pellets."
So at least for legal documents this LLM craze is a big improvement! It is much harder to out-spend other people on LAWYER stuff now.
> One guy pays the AI to dig a hole, the other guy pays the AI to fill in the hole. Back and forth they go, raising the BNP but otherwise not accomplishing anything.
Not accomplishing anything would be better than what is actually happening. To stick with the hole example: once you fill it back in, there's a good chance you can still tell a hole was dug in that place.
What does “BNP” stand for in this context?
Reminds me of this comic: https://preview.redd.it/v1ylid5639ra1.png?width=1024&auto=we...
In a culture where pointless busywork is seen as mandatory to appear proper, people will eventually automate it.
I feel the real problem comes when people stop publishing on the (open) web because 1) no one is reading it directly and 2) they know their hard work will just get slurped up and regurgitated by LLMs.
For things like reviews I usually get a lot more value from a quick manual scan than from an AI overview of any given review. Summaries of many reviews could be useful, but if there are, e.g., thousands of reviews, I find myself skeptical of how truly "thorough" or well-executed that AI summary is anyway.
For "how do I write a bash script that will do X" the AI summary currently is way better than scanning a handful of StackOverflow tabs, already.
It will be interesting to see how "fresh" answers like that stay in the world of newer or evolving programming languages. This is one of the areas where I already see the most issues (methods that no longer exist, etc.).
> It's approaching a very strange situation where people make overly wordy and bloated AI generated content and other people try to use AI to compress it back into useful pellets vaguely corresponding to the actual prompts used to generate the initial content. Which were the only bits anybody cared about in the first place. One guy pays the AI to dig a hole, the other guy pays the AI to fill in the hole. Back and forth they go, raising the BNP but otherwise not accomplishing anything.
Dead Internet Theory in a nutshell.
https://en.wikipedia.org/wiki/Dead_Internet_theory#:~:text=T....
W. Gibson and N. Stephenson have been proven right about so many things; it's as if we live in their books. I wouldn't be surprised if they got this one right as well, and it sounds plausible.
Buuut then, still, my significant other loves watching Friends, which was released before she was born, and she is not rewinding. So it depends.
And this state of things may not be that bad in and of itself, but it will make the airbrushing of various topics for propaganda or therapeutic purposes very easy.
Saying it only raises the BNP is incorrect - it also generates waste heat and carbon dioxide.
For shits and giggles, I passed that quoted paragraph into ChatGPT to expand it into something overly wordy and bloated (but limited to three paragraphs, because I have limited patience), and also to compress it down to a single sentence:
> We are increasingly finding ourselves in a peculiar situation where the use of artificial intelligence is creating an ironic cycle of excess and reduction. On one side, AI is being employed to generate content that is often overly verbose and bloated, as algorithms churn out text that fills space rather than conveying concise or meaningful information. This output, while perhaps technically impressive in its sheer volume, often fails to serve the core purpose of clear and direct communication. It may contain a great deal of data, but much of it is irrelevant or overly embellished, making it difficult for the reader to extract anything of value. Essentially, the AI is tasked with expanding ideas into sprawling narratives or articles that only obscure the original intent.
> On the other hand, there are those who are now turning to AI to reverse this inflation, trying to distill these bloated pieces of content into more digestible, efficient versions. These AI-driven tools aim to compress the original text, stripping away the superfluous language and presenting a more focused, streamlined summary that more closely reflects the essence of the original prompt. However, this approach often feels like a futile exercise in trimming down something that was never necessary in the first place. The irony lies in the fact that the only parts people ever truly cared about—the core ideas, the relevant insights, the key messages—were buried under layers of unnecessary verbiage in the first place, only to be painstakingly uncovered and reorganized by another layer of AI intervention.
> In a sense, this back-and-forth process resembles an endless cycle of creation and destruction, where one person pays an AI to dig a hole, and another pays it to fill the hole back in. The end result may look like progress on paper—content is created, then refined, revised, and streamlined—but in reality, very little of substance is actually achieved. The net effect is often minimal, with people endlessly tweaking and refining information, but ultimately not advancing the core objective of clear communication or meaningful progress. This cycle may inflate the BNP (bloat-and-purge narrative process) without producing any tangible results, leaving us with more text, more noise, and less clarity than we had before.
And reduced again:
> The current trend sees AI generating bloated, verbose content that others then compress back into useful summaries, creating an endless cycle of inflation and reduction that accomplishes little beyond adding noise and complexity to what was originally a simple idea.
It's going to be the "melons rot in the warehouse because the central committee cannot come up with a distribution plan" moment of capitalism. Which, of course, is not a pure, real ism, while being as pure and real as an ism gets, with humanity as the executing virtual machine.
> It's approaching a very strange situation where people make overly wordy and bloated AI generated content and other people try to use AI to compress it back into useful pellets vaguely corresponding to the actual prompts used to generate the initial content.
ah yes the reverse autoencoder
So basically, lossy compression at a huge energy expense. Thanks, AI geniuses!
Isn't this an advancement in communication, though? People can put out a message in whatever language or style they prefer, the machines translate it into overly verbose AI vomit, and readers condense it back down into exactly the kind of personalized language they want to consume.
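Here's a toy sketch of that round trip, purely as an illustration (the expand/condense helpers are made-up stand-ins, not any real LLM API): the "author" side inflates terse notes into boilerplate prose, the "reader" side strips it back down, and part of the signal is lost along the way.

    # Toy sketch of the expand-then-condense cycle described above.
    # expand() and condense() are made-up stand-ins, not real LLM calls;
    # the point is only that the round trip is lossy by construction.

    FILLER = (
        "It is worth noting that ",
        "In today's fast-paced environment, ",
        "As we move forward, ",
    )

    def expand(points):
        """'Author' side: wrap each terse point in boilerplate prose."""
        return " ".join(
            f"{FILLER[i % len(FILLER)]}{p}." for i, p in enumerate(points)
        )

    def condense(prose, max_points=2):
        """'Reader' side: strip the boilerplate, keep the first few points."""
        sentences = [s.strip() for s in prose.split(".") if s.strip()]
        stripped = [
            s[len(f):] for s in sentences for f in FILLER if s.startswith(f)
        ]
        return stripped[:max_points]  # lossy: anything past max_points is dropped

    notes = ["ship date slips one week", "legal sign-off still pending", "budget unchanged"]
    print(condense(expand(notes)))
    # -> ['ship date slips one week', 'legal sign-off still pending']  (third point lost)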
What that comment is describing is already here. A colleague sent an email that was obviously AI-generated (bloated, repetitive, low signal-to-noise ratio). I guess he's quite new to the team and it's a sign of formality, but I really don't mind if you send me just the bullet-point notes... Why are we going through this encoding-decoding process? I think succinctness and low-noise writing will be treasured in the age of AI.