I'm fairly sure we can measure human "sensation", in the sense of detecting physiological activity in the body: someone under anesthesia still reacts in different ways to touch or pain.
The "feelings" part is probably harder though.
You can measure a model's activity even more directly.
How do you know that a model processing text or image input doesn't go through a feeling of confusion or excitement, or that a corrupted image doesn't "smell" right to it?
The mere fact that you can pause and restart it doesn't mean such experience can't emerge.
We can measure the physiological activity, but not whether it gives rise to the same sensations that we experience ourselves. We can reasonably project and guess that they are the same, but we cannot know.
In practical terms it does not matter: it is reasonable for us to act as if others experience the same things we do. But if we are to talk about the nature of consciousness and sentience, it does matter that the only basis we have for knowing about other sentient beings is their self-reported experience.