Hacker News

atleastoptimal today at 6:22 PM · 6 replies

There really are only 3 options that don't involve human destruction:

1. AI becomes a highly protected technology; a totalitarian world government retains a monopoly on its powers, enforces restrictions on its use, and offers access only to those with preexisting connections: permanent underclass outcome

2. Somehow the world agrees to stop building AI and keep tech in many fields at a permanent pre-2026 level: soft butlerian jihad

3. Futurama: somehow we get ASI, and a magical balance of weirdness and a dance of continual disruption keep apocalypse in check; we accept a constant steady-state transformation without a paperclipocalypse


Replies

bigfishrunning today at 8:10 PM

Scenario 2 assumes that no technological development can happen without AI, which seems like a stretch to me. Honestly, the worst scenario I can think of is 40-ish years of AI-assisted development followed by a technological crash, because there are no competent engineers left to fix the slop.

tomjen3 today at 6:38 PM

This assumes that AI will lead to the apocalypse. That claim is unfalsifiable, has been predicted about plenty of technologies in the past, and is frankly annoying to keep seeing pop up.

It's like listening to Christians talking about the rapture.

nyc_data_geek1 today at 6:38 PM

Cool story, bro!


raincole today at 6:24 PM

In other words, only one option.