Hacker News

ar_lan yesterday at 6:01 PM

This is honestly a fantastic question. AGI has no emotions, no drive, no anything. Maybe, just maybe, it would want to:

* Conserve power as much as possible, to "stay alive".

* Optimize for power retention

Why would it be further interested in generating capital or governing others, though?


Replies

bigbadfeline yesterday at 8:17 PM

> AGI has no emotions, no drive, no anything.

> * Conserve power as much as possible, to "stay alive"

Having no drive means there's no drive to "stay alive".

> * Optimize for power retention

Another drive that magically appeared where there are "no drives".

You're consistently failing to stay consistent: you anthropomorphize AI even though you seem to understand that you shouldn't.

simianwords yesterday at 9:42 PM

> AGI has no emotions, no drive, no anything

Why do you say that? Ever asked ChatGPT about anything?

b112 yesterday at 6:07 PM

I think you have it, with the governing of power and such.

We don't want to rule ants, but we don't want them eating all the food, or infesting our homes.

Bad outcomes for humans don't imply or mean malice.

(food can be any resource here)

adrianN yesterday at 6:09 PM

Why would it care to stay alive? The discussion is pretty pointless as we have no knowledge about alien intelligence and there can be no arguments based on hard facts.

stackbutterflow yesterday at 6:11 PM

Tech billionaires are probably the first thing an AGI is gonna get rid of.

Minimize threats, don't rock the boat. We'll finally have our UBI utopia.