
A_D_E_P_T · today at 9:30 AM

If it weren't the New Yorker, I'd swear up and down that Claude wrote this:

> Turbulence is rarely that simple. It’s too scattered, too mercurial, too easily triggered by weather patterns that trigger other patterns in an endless cascade. “It’s not just one thing that’s going on,” Bob Sharman, an atmospheric scientist at NCAR, told me. “It’s not just atmospheric convection. It’s not just wind flowing over mountains. It’s everything going on all the time and interacting.”

> “It’s not a piece of farm equipment,” Larson said. “It’s a life-support system. At thirty-five thousand feet, you can’t pull over.”

The funny thing is that the passages that feel the most "AI-generated" come in quotes, when the author is quoting others. It could be that the author was communicating with those experts via email, and they used AI to generate their responses.

Otherwise, I think that AI language patterns are diffusing into common use. Being so aware of them is a curse...


Replies

notarobot123 · today at 10:15 AM

It isn't only LLMs that use rhetorical constructs like these; humans use them too.

birdsongs · today at 11:55 AM

Not arguing; it's just interesting that I read it the opposite way. It felt human to me, and I usually have a good spidey sense here. Maybe it's a combo of handwritten text and LLM polishing? Or just a case of a good writer whose typical output was the training input for most of these models. Good writing, in the form of novels, articles, and short stories, made up the high-value training sets.

"For a moment, the plane quivered around them like a greyhound straining on a leash." - I don't think a LLM would write this.

But hell, maybe I'm just being naive. I think we're past the point of ambiguity; we just can't know anymore. Which feels poignant to me.

forgetfreeman · today at 10:41 AM

You're reversing causality here. LLMs train on massive bodies of human-generated content. Constructs like the ones mentioned are an entirely unremarkable staple of long-form text content produced for audiences who are accustomed to consuming long-form text content.

FreakLegion · today at 10:24 AM

People point to the basic structure of "It's not X, it's Y" as the hallmark of AI, but I find it's more the incongruity between X and Y, especially when figures of speech (invariably strained) are involved[1]. That first quote reads like a real interaction that's been tightened up for print, but the second, the 'farm equipment' <> 'life-support system' pairing, does smell like AI, even though the article implies it's from an in-person conversation.

1. These are all from a single 850-word op-ed I saw the other day: "Presidents do not usually lose power because of a single speech. They lose power when a speech reveals something structural." "But the most important part of the speech was not the applause lines. It was the compression." "Markets can rise. But voters do not live inside charts. They live inside grocery stores and mortgage payments." "The issue is not whether a statistic was stretched. The issue is that the presidency becomes reactive instead of agenda-setting." "That friction is not theoretical — it is baked into the constitutional design." "Trump’s address was not a pivot to persuasion — it was a doubling down on confrontation as strategy." "They are not just another campaign cycle. They are leverage."
