Hacker News

pyuser583 · yesterday at 10:39 PM · 1 reply

Here’s another take: passing off LLM output as human-written makes it harder to train future LLMs on that text.

It makes human-generated text much less valuable as training data.


Replies

8cvor6j844qw_d6 · yesterday at 11:17 PM

There's a flip side to this though.

LLM output mixed into someone's writing is great for hampering stylometric analysis. Running your text through an LLM before publishing can muddy the stylistic fingerprint that would otherwise link it back to you.
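To see why LLM rewriting muddies the fingerprint, here's a toy sketch of one common stylometric feature: relative frequencies of function words, which tend to be stable per author. The word list and cosine-similarity comparison below are illustrative assumptions, not a real forensic feature set.

```python
from collections import Counter
import math

# Illustrative function-word list; real stylometric systems use
# hundreds of features (word lengths, n-grams, punctuation, etc.).
FUNCTION_WORDS = ["the", "of", "and", "to", "a", "in", "that", "is",
                  "it", "but", "for", "with", "as", "on", "this"]

def fingerprint(text: str) -> list[float]:
    """Relative frequency of each function word in the text."""
    words = text.lower().split()
    counts = Counter(words)
    total = max(len(words), 1)
    return [counts[w] / total for w in FUNCTION_WORDS]

def similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two fingerprints (1.0 = same style)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    if na == 0 or nb == 0:
        return 0.0
    return dot / (na * nb)
```

An LLM rewrite tends to pull these frequencies toward the model's own defaults, so the rewritten text scores lower similarity against the author's other writing, which is exactly the de-anonymizing signal the analysis relies on.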

The "pollution" of human text is an issue for some, but a feature from a privacy perspective.