Hacker News

sph · today at 2:45 PM · 3 replies

This is exactly why an artificial super-intelligence is scary. Not necessarily because of its potential actions, but because humans are stupid and would readily sell their souls and release it into the wild for an ounce of greed or popularity.

And people who don't see it as an existential problem either don't know how deep human stupidity can run, or are exactly those who would greedily seek a quick profit before the earth is turned into a paperclip factory.


Replies

xrd · today at 3:27 PM

I love this.

Another way of saying it: the problem we should be focused on is not how smart the AI is getting. The problem we should be focused on is how dumb people are getting (or have been for all of eternity), and how they will facilitate the threat and block their own chance of survival.

That seems uniquely human, but I'm not an ethnobiologist.

A corollary to that is that the only real chance of survival is for a plurality of humans to have a baseline understanding of these threats; otherwise the dumb majority will enable the eradication of humanity entirely.

Seems like a variation of Darwin's law, but I always thought that applied to individual specimens. This applies it to the entirety of humanity.

bckr · today at 4:00 PM

Look, we’ve had nukes for almost 100 years now. Do you really think our ancient alien zookeepers are gonna let us wipe ourselves out with AI? Semi /j

GistNoesis · today at 4:35 PM

It's even worse than that.

The positive outcomes are structurally being closed off. The race to the bottom means you can't even profit from it.

Even if you release something that has plenty of positive aspects, it can be and is immediately corrupted and turned against you.

At the same time, you have created desperate people and companies, given them huge capabilities at very low cost, and left them with the necessity to stir things up.

So for every good door someone opens, ten other companies or people are pushed to either open random, potentially bad doors or die.

Regulating is also out of the question: either people who don't respect regulations get ahead, or the regulators win and we end up under their control.

If you still see some positive doors, I don't think sharing them would lead to good outcomes. But at the same time, the bad doors are being shared and therefore enjoy network effects. There is some silent threshold, probably already crossed, that drastically changes the sign of the expected return of the technology.