I feel like this is general knowledge for the past 5 or so years, but the real question is "What do we do about it?". Personally, I put real effort into not spending time being outraged online, but this is a societal ill that's bigger than I am...
Tax and heavily regulate online advertising. The root of the problem is that it is very, very lucrative to drive engagement and until you get rid of the monetary incentive, the problem will never go away.
"Make the drug less good" likely isn't the answer. Nor is banning it.
What caused Gen Z to drink less than millennials? Maybe Gen Z has the answer.
I’m going to bet we do nothing and continue to complain instead.
The people who were voted to power (across the globe, not just the US) to do something about it are stuck getting their dopamine kicks posting garbage on the same platforms. It’s truly a terrible timeline we are in.
It’s like asking how do you get people to stop drinking alcohol
As long as there are people who don’t acknowledge or care about the health effects, it will exist. If that’s a plurality of your population, then you have a fundamental population problem IF you are in the group who thinks it’s bad.
Aka every minority-majority split on every issue ever.
So the answer is: live in a society governed by science. Unfortunately none exist
> "What do we do about it?"
I'd suggest something like banning algorithmic amplification - your feed is posts of people you follow and nothing else. But that's not what will happen. What will happen is there will be [1] vague laws about preventing vague "harm", written to give legal teeth to the Overton window. Not in those words, but companies that would go against it will be mired in lawfare, while those that comply will be allowed to grow.
And if you complain, they'll motte-and-bailey you - you're not in favor of "harm", are you? We're not an authoritarian speech police, we only seek to protect people from "harm".
[1] Or rather, are - see https://en.wikipedia.org/wiki/Online_Safety_Act_2023
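The "follows only" rule proposed above is easy to state precisely. Here's a minimal sketch (all names hypothetical, not any platform's real API) of the difference: the feed is just a time-sorted filter over accounts you follow, with no engagement-based reranking and no injected posts from strangers.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int        # higher = newer
    engagement: float     # signal a ranking algorithm would amplify

def chronological_feed(posts, following):
    """Feed under the proposed rule: only accounts you follow, newest first.
    Note what is absent: no engagement term, no recommended/injected posts."""
    return sorted(
        (p for p in posts if p.author in following),
        key=lambda p: p.timestamp,
        reverse=True,
    )

posts = [
    Post("alice", 100, engagement=0.2),
    Post("bob", 200, engagement=0.1),
    Post("stranger", 300, engagement=9.9),  # viral post from an unfollowed account
]
feed = chronological_feed(posts, following={"alice", "bob"})
# The viral post from the unfollowed account never appears, no matter
# how much engagement it drives.
```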
What do we do? We treat platforms with algorithmic news feeds as publishers not platforms in the Section 230 sense.
Think about it this way: imagine if you took a million random posts or videos. You would find a wide range of political views, conspiracy theories and so on. Whatever your position on any of those issues, you could find content pushing those views.
So if your algorithm selects and distributes content that fits your desired views and suppresses content that opposes your views, how are you different from a random publisher who posts content with those exact same views?
This is kind of like the "secret third thing" of Section 230 where you get all the protections of being a platform and all the flexibility of being a publisher and we need to close that loophole. Let platforms choose which one they are.
Another example: if I create a blog and write a post that accuses my local mayor of being a drug addict and a pedophile, I can be sued for defamation. I can try the journalism defense, but it won't shield me from defamation. Traditional media outlets are normally very careful about what they publish for this reason.
But what if I run Facebook or Twitter and one of my users says the exact same thing? Well I'm just a platform. I have a libel shield. But again, my algorithm can promote or suppress that claim. Even if I have processes to moderate that content, either by responding to a court order to take it down and/or allowing users to flag it and then take it down myself with human or AI moderation, the damage can't really be rolled back.
We've let tech companies get away with "the algorithm" being some kind of mysterious and neutral black box that just does stuff and we have no idea what. It's complete bullshit. Every behavior of such an algorithm reflects a choice made by people, period. And we need to start treating this as publishing.
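To make the "every behavior reflects a choice" point concrete, here is a toy ranking function (purely illustrative, not any platform's actual algorithm). The weights are arbitrary numbers I made up, but that's the point: in any real system, someone chose how much engagement is rewarded and how much a user flag demotes a post, and changing one such choice reorders what people see.

```python
from functools import partial

def rank_score(post, w_engagement=2.0, w_recency=1.0, w_flagged=-0.5):
    """Toy engagement-ranking score. Each weight is an editorial decision:
    someone chose to reward engagement twice as much as recency, and chose
    exactly how much a user flag should demote a post."""
    return (w_engagement * post["engagement"]
            + w_recency * post["recency"]
            + w_flagged * post["flags"])

posts = [
    {"id": "calm_news",    "engagement": 0.2, "recency": 0.9, "flags": 0},
    {"id": "outrage_bait", "engagement": 0.9, "recency": 0.3, "flags": 2},
]

# With the default weights, the flag penalty keeps the outrage bait down.
default_order = [p["id"] for p in sorted(posts, key=rank_score, reverse=True)]

# Flip one human-chosen number (remove the flag penalty) and the same
# data produces the opposite feed.
no_penalty = partial(rank_score, w_flagged=0.0)
alt_order = [p["id"] for p in sorted(posts, key=no_penalty, reverse=True)]
```

Nothing "neutral" happened in either case; the output followed directly from a number a person picked.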
>"What do we do about it?"
nothing. if it isn't illegal, it isn't illegal.
previous generations of neurotics objected to many then-current things we don't bat an eye at now. when was the last time you saw anyone campaign against satanic music, violent video games, or hardcore pornography?
"What do we do about it?"
Shut down the behavior with regulations or shut down the companies. Meta and TikTok have no natural right to exist if they are a net negative to society.