What you’re missing is a “self” to have the “experience”.
LLMs do not have a self. This is like arguing that the algorithm responsible for converting ripped YouTube music videos to MP3s has a consciousness.
> the algorithm responsible for converting ripped YouTube music videos to MP3s has a consciousness.
Can such an algorithm reason about itself in relation to others?
How do I know you have this "self"?
How do you know other humans do?
The sense of self may be an emergent property of the grammatical structure of language and the operations of memory. If an LLM, by necessity, operates with the linguistics of "you" and "me" and "others", documents that in a memory system, and can reliably identify itself as a discrete entity distinct from you and others, then on what basis would we say it doesn't have a sense of self?