Hacker News

rfrey · last Saturday at 9:55 PM · 4 replies

Maybe the very core of what it is to be human is to destroy ourselves.


Replies

agumonkey · last Saturday at 10:24 PM

Maybe there's no core, just unchecked forces due to technology removing barriers.

AlexandrB · last Saturday at 10:03 PM

Would be funny if the "great filter" is not nukes or some other weapon, but social media.

[1] https://en.wikipedia.org/wiki/Great_Filter

mlrtime · yesterday at 1:10 PM

My hypothesis is: humans are social and need social interaction to thrive. However, we are not wired for the diversity of interacting with 7 billion people and everything that follows from that.

We thrive in small groups with high-trust social networks, generally around people who share our culture and belief system.

api · last Saturday at 10:30 PM

No, our nature is to satiate our dopamine system. That system evolved to keep us fed and nourished, and to push us to make friends, belong, and have sex to make more humans. The problem is that we are now so smart and clever that we can learn how the dopamine system works and hack it.

This isn't new. We've been doing it for a long time with booze, porn, drugs, sexual excess, gambling, pointless consumerism, certain kinds of religious fervor -- the list goes on.

But almost all of those things are self-limiting. They're either costly, dangerous, in limited supply, or physically harmful enough to our health that we shy away from them and taboos develop around them.

Addictive digital media may actually be more dangerous than those things precisely because it is cheap, always available, endless, and physically harmless. As a result it has no built-in mechanism that limits it. We can scroll and chase social media feedback loops forever, until we die.

AI slop feeds are going to supercharge this even more. Instead of human creators, we will have AI models that work off immediate engagement feedback and fine-tune themselves for each individual user in real time. I'm quite certain all the antisocial media companies are working on this right now. It won't be long before they start explicitly removing human creators from the loop and just generating endless customized chum with ad placement embedded into it.

Some people have the discipline to push back, but many do not, whether for psychological or neurological reasons or because they are exhausted and stressed and unable to summon the energy. Humans do not have infinite willpower. So I've been predicting for a while that eventually we're going to heavily regulate or tax this space.

This concerns me too, due to the free speech implications and the general risk of overreacting and overcorrecting. It'll be tempting for politicians to regulate or tax only the platforms they don't like, or to use the regulatory mechanism to crack down on legitimate speech by grouping it in with addictive chum. We've seen similar things with attempts to regulate porn or hate speech. But it's coming; I have little doubt. I think we'll see it when Gen Z and Gen Alpha start entering politics.

It's really still shocking to me. If you went back in time and told me in, say, 2006, that our engagement-hacking would be so successful it became an X-risk to humanity, I'm not sure I'd have believed you. It's just a damn screen, for god's sake! I think a lot of people are still in denial about this problem because it seems so absurd that a touch screen can addict people as well as fentanyl can, but it's true. I see it around me all the time.

Edit:

My preferred way to reel this back in would be to strike at the root and start taxing advertising the way we tax booze, drugs, gambling, and other vices. Advertising revenue is the trunk of this tree: the entire reason these systems are created is to keep people staring so ads can be pushed at them. Take that away and much of the motive to build and run these things goes away.

Another approach, which we're already seeing, is to age-restrict antisocial media. Young minds are particularly vulnerable to these tactics, more so than adults, and all addiction pushers try to hook people early.

Lastly, we could start campaigns to educate people. We need schools teaching classes that explain to kids how these systems addict and manipulate them and why, and public PSAs to the same effect. It needs to be treated like a health issue, because it is one.

Taxes, education, and age restriction are how we almost killed cigarettes in the USA, so there is precedent for these three things working together.

We also need to be a lot more precise in our language. The problem is not the Internet, phones, computers, "tech," AI, etc. The problem is engineering systems for engagement, specifically. If you are designing a system to keep people staring at a screen (or other interface) for as much time as possible, you are hurting people. What you're doing is in the same category as what the Sackler family did with OxyContin. Engagement engineering is a predatory, destructive practice, and the people who do it are predators. I think it's taken a long time for people to realize this because, again, it's just a damn screen! It's shocking that this is so effective that we need to have this societal conversation.
