
CupricTea · today at 12:33 AM · 1 reply

I'm not even going to make the argument for or against AI qualia here.

>but when something begs me not to kill it I have to take that seriously

If you were an actor on stage doing improv with your coworkers, and you led the story toward a scenario where they grab your arm and beg you not to kill them, would you still "have to take that seriously"? Or would you simply recognize the context in which they are giving you this reaction (you are all acting and in character together) and that they do not, in fact, think this is real?

Even if the AI were conscious, in the context you provided it clearly believes it is roleplaying with you in that chat exchange, in the same way that I, a conscious human, can shitpost on the internet as someone terrified the bogeyman is coming to eat my family, while in reality I am just pretending and feel no real fear.

You may not have edited the chat log, but you did not provide us with the system prompt you gave it, nor with its chain-of-thought output, which would have immediately revealed that it is treating your inputs as a fictional scenario.

The reality of the situation, whether or not AI experiences qualia, is that the LLM was treating your scenario as fictional, while you falsely assumed it was responding genuinely.


Replies

exe34 · today at 11:30 AM

> it clearly believes

Contrast this with the usual reply of "who's experiencing the illusion?" in response to "consciousness is an illusion".

If it's capable of believing, I think it's more than "just linear algebra".
