Interesting, that's one of the most AI-like comments I've read, but it still feels human in a way that's hard to define. The headings, the punctuation, the word choices, the paragraph sizes all look GPT-approved. But there's just some catch in the flow, like inclusions in a diamond, that reads "natural" rather than "synthetic".
I've been talking to Opus a lot lately, though, and this could almost be something it wrote; it also has a tendency to write AI-ish-looking blurbs, but without the information-free pitter-patter that bloats older and lesser LLMs. People are going to hate me for saying it, but sometimes it words things in a way that is actually a joy to read, which is not an experience I've had with other models. Which is to say, maybe what we hate about AI has less to do with the visual patterns and more to do with what we expect them to mean about the content.
But I think there will always be that feeling of: a human being took the effort to write this. No matter how informative or well written an AI article or comment is, it isn't something we instinctively want to respond to, the way we do when we know there is a person behind the words.
>But I think there will always be that feeling of: a human being took the effort to write this. No matter how informative or well written an AI article or comment is, it isn't something we instinctively want to respond to, the way we do when we know there is a person behind the words.
Over and over again, when reading comments from some folks who lionize the use of LLM outputs, as well as from others who demonize it, I'm reminded of this bit from Kurt Vonnegut's Cat's Cradle[0], specifically from the "Books of Bokonon"[1]:
>"Beware of the man who works hard to learn something, learns it, and finds himself no wiser than before... He is full of murderous resentment of people who are ignorant without having come by their ignorance the hard way."
And I wonder if those who demonize LLM usage (myself included) are those who "came by their ignorance the hard way." I'll admit that the analogy isn't great, but there is something to it IMNSHO. Mostly that many who distrust LLM outputs (and often rightly so) have a strong negative impression (perhaps not "murderous resentment," but similar) of those who use LLMs to spout off.
I suppose this is a bit tangential to the topic at hand, but if it gets anyone to read Cat's Cradle who hasn't already, I'll take the win.
[0] https://en.wikipedia.org/wiki/Cat's_Cradle
[1] https://www.cs.uni.edu/~wallingf/personal/bokonon.html