I heard someone on a podcast call social media algorithms "the modern-day cigarette" and that really resonated with me. These companies know their product is addictive and bad for users, but they keep pushing it anyway. Like cigarettes, it's bad for everyone, not just kids. I made an algorithm blocker for Safari because of that, and it's actually crazy how much more pleasant social media is when you don't have recommendation algorithms at all. I think the EU and other jurisdictions should really look beyond just limiting this stuff to kids, but I understand why it's starting there...
The modern-day cigarette is such a perfect metaphor for social media. A cabal of unfathomably wealthy companies spreading their harmful products across the world; making them as addictive as possible while actively burying the research which proves how harmful they are. I truly hope one day we'll look back on social media and smartphone use the same way we regard smoking.
I'm not the first person to notice this, but since I switched to Pixelfed and Mastodon, I've found that I just don't spend as much time on social media as I used to. It's not that I don't follow good people, but without the algo burrowing into my lizard brain to keep me swiping, I just don't think about it. When I do remember to check them, it's always pleasant. I check a few posts out, look at an interesting link, and 20 minutes later I'm back to the real world. That's great for me, the user, but I doubt you can build an ad-driven business off that. I wish I could say that I'm savvy enough to not get sucked into swiping through scores of "funny" videos, but if I give an hour to that crap, the hour is gone and I have nothing to show for it.
Look up images in Google with `eu cigarettes boxes`. Banning is a thin wedge, but I think we need something like these warning labels for social media.
Like the tobacco industry, they have confidential memos about how to target children whilst simultaneously claiming they would never, ever target children.
And then pushed hard for legislation to make it someone else’s problem (like when the tobacco industry astroturfed for laws to make it illegal to sell under-18 cigarettes, after their own research showed it wouldn’t make much of a difference on youth smoking rates and would also improve their image as a “rebellious” thing to do). Sound familiar with Meta’s big push to have your OS declare how old you are?
Funny, just heard an interview where the guest said that nowadays more people feel bad about scrolling than about smoking cigarettes.
This is still a recommendation algorithm, just a less enjoyable/addictive one. Any process by which you decide what to show to a user is an algorithm.
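To make that point concrete, here's a toy sketch (hypothetical `Post` records, not any real platform's data model): a "no algorithm" chronological feed and an engagement-ranked feed are the same selection process with different sort keys.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int   # seconds since some epoch
    engagement: int  # likes + shares, the signal ranked feeds optimize

posts = [
    Post("alice", 100, 5),
    Post("bob",   200, 1),
    Post("carol", 150, 9),
]

def chronological_feed(posts):
    # The "algorithm-free" feed still decides an order: newest first.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_feed(posts):
    # The contested kind: order by an engagement signal instead of time.
    return sorted(posts, key=lambda p: p.engagement, reverse=True)

print([p.author for p in chronological_feed(posts)])  # ['bob', 'carol', 'alice']
print([p.author for p in engagement_feed(posts)])     # ['carol', 'alice', 'bob']
```

Both functions pick an ordering for the user; only the objective differs, which is why "just show me a timeline" is still a (much simpler) recommendation algorithm.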
For me the problem is the amount of junk or already-seen stuff I have to filter through to get to the genuine posts. It's like going into the store to get a pack of cigs and being presented with an infinite number of unknown brands, flavours and quality levels, with no idea what to buy.
The problem is when the product becomes an optimization machine for attention.
I mean, I'd argue it's worse. Cigarettes don't run your communication networks, and aren't a functional necessity for businesses to advertise their services.
On that note, not that I think regulation is the entire solution in the first place (see ATProto for an example of something independent of government that gives me hope for the Internet), but I feel that where a lot of the "protect kids" Internet bills fail is that many of them treat that as a separate, special concern in a lot of areas where they could cover it anyway by just trying harder to protect users.
In the US, where I'm writing this, it's sort of like how our age discrimination laws are written just to protect elders, but didn't do anything to protect them from the lower floor that came from letting businesses keep spreading stereotypes about who the minimum wage is for or otherwise pushing hustle culture onto 20somethings.
The use of the Internet to astroturf political discourse is an example of this -- you can't fully protect kids from school shootings with an Internet safety bill if you're not also going after bot farms that exist to benefit the "thoughts and prayers" crowd. But you're also never going to see that in an Internet safety bill for kids, because that (and for that matter a lot of our discourse about addictive mechanics in general) explicitly leaves out voters.
(clarifying edit: I'm not saying there aren't valid concerns around this topic. I am saying that when we say things like "experimenting on users' mental health without their knowledge is bad," the baseline should be that you don't have to add anything to the sentence for it to be taken seriously.)
Agreed. Imagine if you had to open a pack of cigarettes every time you wanted to check the weather… then blamed people for being addicted to nicotine.
I think this is even more applicable, because many people younger than me do regulate their social media use, taking "detoxes" or having a more limited use of it altogether, and they are more likely to have a social circle that reinforces that
It reminds me of how I have never been tempted to use a cigarette or any nicotine product and view them as nasty, while as a little kid, my telling an addicted adult "you know, those are bad for you" was met with a shrug or "I can quit any time," since their social circle and support system was based on using it.
Makes me think my generation is cooked when it comes to social media use
Huh, and like the cigarette, even though I feel like I see what the appeal is supposed to be, I just cannot get over how gross it actually is to engage with, and feel like I’m already ‘over it’ and am just waiting for everyone else to figure it out.
Glad to hear a false comparison to something that's actually physically/chemically addictive really resonated with you (a.k.a. affirmed your already existing beliefs in this moral panic).
If we step back and look at this rationally though, can anybody point me to any peer reviewed studies (the actual studies, not clickbait articles written based off the studies) showing that social media is anywhere near as physically harmful or addictive as cigarettes?
I'm totally open to the idea that engagement algorithms are inflaming social division. I'm less convinced that the children are the ones being harmed, however. I think it's the adults who grew up in a media monoculture, where the default was trust, who are more susceptible to negative outcomes.
When things change, the young are the ones more likely to adapt.
I agree with the cigarette analogy up to a point, but the UX consequences are easy to understate.
A lot of what makes these products feel “good” in the moment is exactly what regulators may end up targeting: no stopping points, instant continuation, algorithmic relevance, autoplay, low-friction notifications. If you remove or weaken those things, many users will probably experience the result as worse UX, even if the policy goal is reasonable.
So the hard part is not just “ban addictive design”. It is deciding which kinds of friction are legitimate product safety, and which ones become the digital equivalent of cookie banners: technically protective, but broadly annoying, ignored, and eventually hostile to normal use.
Starting with kids makes sense politically and morally. But if the regulatory logic is "this is bad for everyone, not just minors", then adult UX probably will get pulled into it too.
That rationale never convinced me.
Smoking has definite physiological effects. Molecules bind to receptors or neurons and initiate cascades/responses.
I don't see this with a user interface in a browser at all. If you follow that line of reasoning, why are regular ads allowed? They piss me off. Why do I have to see them? They train my brain into an addiction to buying crappy products. So why is there no ban here?
Let's face it - the EU is on a path of "Minority Report" here.
> I think the EU and other jurisdictions should really look beyond just limiting this stuff to kids
Yeah, they try to restrict what we can do. We old-school people call this fascism. See the EU trying to destroy VPNs. And there's a meta-strategy at work here: many lobbyists are activated and try to "sync" laws that never made any sense across as many countries as possible. I see where corruption happens. And I don't buy the "we protect kids" lie for a moment.
If you didn’t notice, this comment is an ad for a paid app trying to capitalize on social media anger. I respect the hustle, but this is not a neutral comment on the topic, given the financial interest. There are many free alternative plugins for filtering social media feeds if someone wants that.