Hacker News

geremiiah · last Monday at 7:14 PM · 1 reply

LLMs are dangerous in other ways (LLM psychosis and false confidence have probably already caused negligent deaths). However, I don't think we are close to a Terminator scenario.

At the same time, if we ever do create an AGI, and eventually an ASI, I think it would only be a matter of time before the machines take over entirely, and they would probably be the ones that continue the legacy of our species. Is that bad? Idk.


Replies

IAmGraydon · yesterday at 3:13 AM

>Is that bad? Idk.

There's no such thing as bad. It is necessary, though.