These LLMs don’t have senses; they have a token stream. They have no experience of the world outside of the language tokens they operate on.
I’m not sure I believe that consciousness emerges from sensory experience, but if it does, LLMs won’t get it.
Neural networks can have senses. Hook an LLM up to a thermometer, feed the readings into its context, and it will respond to temperature changes.
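Concretely, "hooking up" just means turning sensor readings into tokens the model can see. A minimal sketch, assuming a hypothetical `read_thermometer` sensor driver and a hypothetical `query_llm` endpoint (neither is a real API):

```python
import random
import time


def read_thermometer() -> float:
    """Hypothetical sensor read; swap in a real driver for your hardware."""
    return 21.0 + random.uniform(-3.0, 3.0)  # simulated room temperature in °C


def query_llm(prompt: str) -> str:
    """Hypothetical stand-in for whatever LLM endpoint you actually call."""
    return f"(model reply to: {prompt!r})"


def sense_loop(poll_seconds: float = 5.0, threshold_c: float = 0.5) -> None:
    """Poll the sensor and hand significant changes to the model as tokens."""
    last = None
    while True:
        temp_c = read_thermometer()
        if last is None or abs(temp_c - last) >= threshold_c:
            # The model never touches the sensor; all it ever sees is the
            # token stream built from the reading.
            print(query_llm(
                f"The room temperature is now {temp_c:.1f} °C. "
                "React to the change."
            ))
            last = temp_c
        time.sleep(poll_seconds)


if __name__ == "__main__":
    sense_loop(poll_seconds=2.0)
```

The polling interval and change threshold are arbitrary; any event-driven trigger that injects readings into the context would do the same job.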
How do you know the sensation of a red photon hitting a cone cell, transduced through ion channels to the optic nerve and processed by pyramidal neurons, is any more or less real than the excitation of electrons in a doped silicon junction activating the latent space of the "red" thought vector? Because we are made of meat?