Hacker News

keiferski · yesterday at 6:49 AM

No, by continuity I mean literally moment to moment. Sorry if I didn’t clarify that. Even people with amnesia are still present moment to moment. As far as I know there are no things that we call conscious which have zero continuity.

I think consciousness is not an abstract property in the world, therefore it’s tied to certain types of entities. Therefore an AI is not going to be “conscious” in the way an animal is, and never will be. This is a limitation of our language, which isn’t specific enough. Maybe machines can be aware, take in data, mimic what we see as consciousness, etc., but the metaphor of consciousness really doesn’t fit. A jet can move faster than an eagle, but it isn’t moving in the same way. We simply lack a sophisticated enough language to easily differentiate the two.


Replies

yes_man · yesterday at 6:57 AM

Doesn’t the LLM experience discrete continuity every time it infers the next token?
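The "discrete continuity" being claimed can be sketched as an autoregressive decoding loop: each step is an independent inference call, and the only state carried forward is the growing token sequence itself. Here `toy_next_token` is a hypothetical stand-in for a real model's forward pass, not any actual LLM API:

```python
def toy_next_token(tokens):
    # Hypothetical stand-in for an LLM forward pass: derives the next
    # token deterministically from the current context window.
    return (sum(tokens) + len(tokens)) % 100

def generate(prompt_tokens, n_steps):
    tokens = list(prompt_tokens)
    for _ in range(n_steps):
        # Each iteration is a discrete, self-contained inference step;
        # "continuity" exists only because the sequence is fed back in.
        tokens.append(toy_next_token(tokens))
    return tokens

print(generate([1, 2, 3], 4))  # prompt plus 4 generated tokens
```

The point of contention is whether this feed-forward of context between otherwise stateless steps counts as the moment-to-moment continuity the parent comment describes.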

> I think consciousness is not an abstract property in the world, therefore it’s tied to certain types of entities. Therefore an AI is not going to be “conscious”

This pretty much sums up most arguments for why LLMs aren’t conscious: “I think” followed by assertions. The only real argument is this: science cannot quantify consciousness, we cannot quantify consciousness, so let’s not assign so much certainty to the claim that models clearly exhibiting intelligence aren’t conscious in some way, to some degree.
