matthewdgreen, yesterday at 6:40 PM (1 reply)

You have a population of relatively wealthy, scientifically educated people who believe that AI risk is real and existential -- that if they/we don't act, humanity itself might go extinct, and that this is an unacceptable outcome. Then you have Yudkowsky mooting the possibility that this outcome is basically inevitable (absent global coordination that seems highly unlikely), and suggesting that hyper-violent action might be literally the only way our species survives.

What I am not saying: Yudkowsky intends to exterminate most of humanity.

What I am saying: this is a dangerous environment, and these kinds of statements will be read as a call to action by a certain kind of person. TFA is literal proof of that. Moreover, within the community there exist trained experts who might be able, at the cost of millions of lives, to plan an attack that could plausibly delay AI by many years.

The danger of this argument is that someone who reveres Yudkowsky might take his arguments to the logical conclusion, and actually do something to act on them. (Although I can't prove it, I also think Yudkowsky knows this, and his decision to speak publicly should be viewed as an indicator of his preferences.) That's why these conversations are so dangerous, and why I'm not going to give Yudkowsky and his folks a lot of credit for "just having an intellectual argument." I think this is like having an intellectual discussion about a theater being on fire, while sitting in a crowded theater.


Replies

janalsncm, yesterday at 8:15 PM

I said something to the same effect in a sibling comment to yours.

> someone who reveres Yudkowsky might take his arguments to the logical conclusion

What about Eliezer himself? Does he not believe his own rhetoric? Certainly, if he believes the future of the human race is at stake, it demands more action than writing a book about it and going on a few podcasts.

I think the whole thing is a bit like the dog that finally caught the car. It's easy to use this strident rhetoric on an Internet forum, but LessWrong isn't real life.
