Hacker News

vidarh · last Monday at 6:26 PM · 3 replies

> or possessed of subjective experience in any measurable way

We don't know how to measure subjective experience even in other people, other than via self-reporting, so this is a meaningless statement. Of course we don't know whether they are, and of course we can't measure it.

I also don't know for sure whether or not you are "possessed of subjective experience" as I can't measure it.

> What they are not is conscious

And this is equally meaningless without your definition of "conscious".

> It's possible those latter two parts can be solved, or approximated, by an LLM, but I am skeptical.

Unless we can find indications that humans can exceed the Turing computable - something we as yet have no indication is even theoretically possible - there is no rational reason to think it can't.


Replies

ivraatiems · last Monday at 6:35 PM

> Unless we can find indications that humans can exceed the Turing computable - something we as yet have no indication is even theoretically possible - there is no rational reason to think it can't.

But doesn't this rely on the very thing you suggest we don't have: a workable definition of consciousness?

I think a lot of the 'well, we can't define consciousness so we don't know what it is so it's worthless to think about' argument - not only from you but from others - is hiding the ball. The heuristic, human consideration of whether something is conscious is an okay approximation so long as we avoid the trap of 'well, it has natural language, so it must be conscious.'

There's a huge challenge in the way LLMs can seem to be speaking out of intellect rather than just predicting patterns, but there is very little meaningful argument that they are actually thinking in any way similar to what you or I do in writing these comments. The fact that we don't have a perfect, rigorous definition, and tend to rely on 'I know it when I see it,' does not mean LLMs do have it, or that it will be trivial for them to get there.

All that is to say that when you say:

> I also don't know for sure whether or not you are "possessed of subjective experience" as I can't measure it.

"Knowing for sure" is not required. A reasonable suspicion one way or the other based on experience is a good place to start. I also identified two specific things LLMs don't do - they are not self-motivated or goal-directed without prompting, and there is no evidence they possess a sense of self, even with the challenge of lack of definition that we face.

prmph · last Monday at 6:36 PM

> I also don't know for sure whether or not you are "possessed of subjective experience" as I can't measure it.

Then why make an argument based on what you do not know?

nprateem · last Monday at 7:19 PM

Anyone who believes an algorithm could be conscious needs to take mushrooms.
