> If humanity goes extinct in the next few years because of unaligned superintelligence,
I've seen people claiming this could happen, but I've yet to read any plausible scenario for how it would. Maybe I lack the imagination; could you enlighten me?
I've yet to read any plausible scenario where Stockfish defeats me; all the scenarios my friends come up with have obvious holes in the plays they suggest Stockfish could make.
- AI becomes smarter than any human.
- AI dominates the physical world: robots, factories, etc.
- AI decides humans aren't contributing and/or are wasting resources it feels should go somewhere else.
I mean, not unlike humans causing the extinction of other species?
https://ifanyonebuildsit.com/