Out of curiosity, what are you basing this on?
The text has few of the obvious AI tells. The only thing that, to me, looks characteristic of LLM-generated text is the short and terse sentence structure, but this has been a "prestigious" way to write in English since Hemingway.
What are the obvious tells? List them, because I think our sense of the tells may not overlap.
This article is clearly LLM-generated, even the title. A key indicator is that it only almost makes sense: we forgot how to manufacture because manufacturing was offshored to another nation, but coding isn't being sent anywhere, so it would have to be humanity as a whole forgetting how to code. That distinction undermines a lot of the emotional baggage about offshoring that the article wants you to bring along.
The blog post reads nothing like Hemingway. Here's a classic example: https://anthology.lib.virginia.edu/work/Hemingway/hemingway-...
Hemingway writes simple sentences with a kind of detachment to make the emotional flow of his stories as transparent as possible.
LLM slop reads more like slide bullet points extrapolated into prose-length text.
https://awnist.com/slop-cop (via https://news.ycombinator.com/item?id=47806845) points out Staccato Burst, Dramatic Fragment, Colon Elaboration, and Short-Hook Paragraph. To me, those define the tone of this article.
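For what it's worth, the first of those patterns is mechanically checkable. Here's a toy sketch of a "staccato burst" detector — just runs of consecutive short sentences — with arbitrary thresholds I picked myself; it is not how slop-cop actually works:

```python
import re

def staccato_bursts(text, max_words=8, min_run=3):
    """Toy heuristic: flag runs of >= min_run consecutive sentences
    of <= max_words words each. Thresholds are arbitrary guesses."""
    # Naive sentence split on terminal punctuation followed by whitespace.
    sentences = [s for s in re.split(r'(?<=[.!?])\s+', text.strip()) if s]
    lengths = [len(s.split()) for s in sentences]
    bursts, run_start = [], None
    # Append a sentinel "long sentence" so a trailing run is closed out.
    for i, n in enumerate(lengths + [max_words + 1]):
        if n <= max_words:
            if run_start is None:
                run_start = i
        else:
            if run_start is not None and i - run_start >= min_run:
                bursts.append(sentences[run_start:i])
            run_start = None
    return bursts

bursts = staccato_bursts(
    "It works. It ships. It scales. That is why enterprises love a "
    "platform that quietly does everything for them."
)
# The three two-word sentences come back as one burst.
```

Obviously nothing this crude distinguishes Hemingway from GPT; the point is only that the rhythm being described is a measurable surface feature, not a vibe.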
Blog posts aren't typically written like Hemingway.
Find some pre-2020 blog posts that are, and you'd have a point.
Sort of a taste receptor I’m sure many have developed now.
The most obvious patterns here are: antithesis constructions, word choice and distribution, an attempt at profundity in every paragraph that instead produces runs of text that don't say anything, and even the perfect use of compound hyphenation. I can appreciate that there is definitely an attempt at personalization and guidance to make it less LLM-y and not just a default prompt, but it's still kind of obvious. You could use a detector tool too, of course.