There was value in leaded gas and asbestos insulation too; nobody denies that.
You're blind to all the negative side effects: AI-generated slop ads, engagement traps, political propaganda, scams, &c. The amount of pollution is incredible. Search engines are dead, blogs are dead, YouTube is dead, social media is dead; it's virtually impossible to find non-slop content. The ratio is probably already 50:1 by now.
And these are only the most visible things. I know a few companies losing hundreds of hours every month replying to support tickets that are fully LLM-generated and, more often than not, don't make any sense. Another big topic is education.
> You're blind to all the negative side effects
I didn't comment on negative effects one way or the other.
Most of the problems you point out are enabled and motivated by conflict-of-interest business models, i.e. surveillance that enables and incentivizes targeted ads, targeted manipulation, and addictive media.
Parasitic, privacy-violating, dark-pattern business models are like bad security: we can't afford them anymore. AI is going to make that already very clear point clearer and clearer ... until maybe people wake up and legislate those types of businesses away.
Blaming AI (for those kinds of problems) won't achieve anything. As with any tool, AI is just getting more effective. Might as well blame faster processors too.
And it gives the actual culprits, the people who profit from poison, a pass and cover.
Profitable, scalable conflicts of interest are poison.
Ironically, if generative AI ends up killing social media, it might actually be a net positive. How do you avoid engaging with AI content? Why, go find an actual human IRL and speak with them.