Hacker News

matthewdgreen | last Monday at 6:54 PM

Eliezer Yudkowsky has gone so far as to say that it might be ok to kill most of humanity (excepting a "viable reproduction population") to stop AI. If that's not just talk, then this line of reasoning only gives you a few possible modes of action. I would not be worried about the people with Molotov cocktails, but I'd be very worried about bioterrorism.


Replies

hollerith | last Monday at 7:48 PM

>Eliezer Yudkowsky has gone so far as to say that it might be ok to kill most of humanity (excepting a "viable reproduction population") to stop AI

That doesn't sound like an accurate summary of anything he would say. Do you have a quote or a link?
