Hacker News

knollimar · yesterday at 7:32 PM

How is it measured?

Can someone who's never seen red hallucinate something and assume it to be red? And what if that hallucinated red happens to be exactly the red they would see if they actually saw red?

Can you reproduce this feeling in someone by doing something to their physical body without showing them red?

If so, how does it differ from the latent encoding you'd get by uploading an all-red PDF to your favorite multimodal model?

Instead of doing the Socratic bs you see a lot here, I'll be more direct:

Until there are some useful lines that can be drawn to predict things, I won't accept using a fuzzy concept to make statements about classification; it's an ever-shifting goalpost.

There are answers to my legitimate questions above that would make me consider qualia useful, but when I first learned about them, they seemed fuzzy to the point of being empirically useless. It seems like a secular attempt at a soul.

Now, obviously, if you're trying to describe something with experience, it needs some actual memory and processing of sensory input. Current generative AI doesn't have a continuity of experience that would imply whatever qualia could mean, but I find it hard to definitively say that their encodings for image-related stuff aren't qualia if we don't have hard lines for what qualia are.


Replies

FrustratedMonky · yesterday at 7:42 PM

I can feel an object and say 'it's hot' on a scale of 1-10. The temperature is known. And I can do that multiple times, with the same 1-10 scale, to get a sample. Then do that with multiple people.

You can then get a distribution of what people think is 'hot' versus 'cold'. What is icy versus bearable.

When you go to a doctors office and they ask you on a scale to rate pain, do you think that is completely bogus?

It isn't exact, but you can correlate between people. Yes, redheads feel more pain; there are outliers.

But that's a far cry from metaphysical.

The problem here is the word 'qualia'. It's just too fuzzy a term.