> If we can't define, recognize or measure them, how exactly do we know that AI doesn't have them?
In the same way my digital thermometer doesn't have qualia. LLMs do not either. I really tire of this handwaving of 'magic' concepts into LLMs.
Qualia being difficult to define, yet being such an immediate experience that we humans all know intimately and directly, is quite literally the problem. Attempted definitions fall short, and humans have tried, and I mean really tried hard, to solve this.
Please see Hard problem of consciousness https://en.wikipedia.org/wiki/Hard_problem_of_consciousness
> In the same way my digital thermometer doesn't have qualia
And I repeat the question: how do you know your thermometer doesn't? You don't; you're just declaring a fact you have no basis for knowing. That's fine if you want a job in a philosophy faculty, but it's worthless to people trying to understand AI. Again, cf. furffle. Thermometers have that, you agree, right? Because you can't prove they don't.
The problem is that just like your digital thermometer, 50 human brain neurons in a petri dish "obviously" don't have qualia either.
So you end up either needing to draw a line somewhere between mechanical computation and qualia computation, or relegating it to the supernatural (a soul) or to grey areas (quantum magic).