AI slop is ruining the current internet, including forums, email, blogs, announcements, and much of the remaining content. I say "current internet" because we will adapt as we always have, but many things that were formerly useful or interesting will be buried in so much crap that it will stop being something that people use the internet for.
At the dawn of email, I could and did cold email professors, and they would respond based on whether my query was worth responding to. I put effort into my messages (and had a reason; I wasn't just trying to elicit responses), and my success rate was very high. It wasn't scale that killed that, it was spam and greed. (There's overlap, but by spam I mean unsolicited commercial email, and by greed I mean people blasting out large numbers of low-effort messages in an attempt to gain something.) Professors are still interested in meaningful correspondence, but email is no longer a usable communication medium unless they already know their correspondent.
AI applies the same dynamic to many more forms of content. Individually, it doesn't do much harm. In aggregate, the meaning and value are rapidly being destroyed.
It's kind of ironic -- in the early days of online communication, there was endless hand-wringing over all the cues and subtext that we've lost from face-to-face communication. Now we take that loss as a given, and have collectively decided to attenuate the signal even more.
I wouldn't advocate for AI to just go away in all domains. It's a cool and useful technology. But I personally would prefer if representing AI output as your own writing were looked upon roughly the same way as having a secretary write all of your correspondence. Well, a little worse -- it's like having an arbitrarily chosen secretary from a worldwide pool write each item of correspondence. If I ruled the internet, that's where I would set social norms and expectations. People could still use it for translation, but it would be a major faux pas not to divulge your use of AI when readers have reason to believe you wrote it yourself. Sure, there would have to be many judgement calls -- if you get an AI's advice on how to say something and then reprocess it into your own words, for me that'd depend on how real that reprocessing is. But that's nothing new; it's just another form of the plagiarism slippery slope.
Sadly, I do not rule the internet, and it's a lost cause.
Whether it's the person using AI or the AI itself that is responsible? That's a non-sequitur. I don't care. Describe it however you like. I'm describing the effect, not assigning blame.