Hacker News

knollimar · yesterday at 7:36 PM · 2 replies

If they're entirely physical, what's the argument that multimodal models don't have them? Is it continuity of experience? Do they not encode their input into something that has a latent space? What makes this differ from experience?


Replies

soulofmischief · yesterday at 7:51 PM

They may be physical, but I'm not claiming to know definitively. The lines are extremely blurry, and I'll agree that current models have at least some of the necessary components for qualia, though they still lack a sensory feedback loop. In another comment [0] I quote myself:

  As an independent organism, my system is a culmination of a great many different kinds of kins, which can usually be broken down into simple rules, such as the activation potential of a neuron in my brain being a straightforward non-linear response to the amount of voltage it is receiving from other neurons, as well as non-kins, such as a protein "walking" across a cell, a.k.a. continuously "falling" into the lowest energy state. Thus I do not gain any conscious perception from such proteins, but I do gain it from the total network effect of all my brain's neuronal structures making simple calculations based on sensory input.
which attempts to address why physically based qualia don't entail panpsychism.

[0] https://news.ycombinator.com/item?id=46109999

FrustratedMonky · yesterday at 8:05 PM

I do think AI will have them. Nothing says they can't. And we'll have just as hard a time defining them as we do with humans, and we'll argue about how to measure them, and about whether they're real, just like with humans.

I don't know if LLMs will. But there are lots of AI models, and once someone puts one in a continuous learning loop with goals, it will be hard to argue it isn't experiencing something.