Hacker News

mrob last Monday at 7:57 PM | 3 replies

I don't believe LLMs can be conscious during inference, because LLM inference is just repeated evaluation of a deterministic [0] pure function. It takes a list of tokens and outputs a probability distribution over the next token. Any randomness comes from the sampler that selects a token from that distribution, not from the LLM itself.
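To make that concrete, here's a toy sketch of the loop (the llm, sample, and generate functions below are made-up stand-ins for illustration, not any real implementation):

    import random

    def llm(tokens):
        # Stand-in for a real forward pass: a pure function from a
        # token list to a probability distribution over the next token.
        # Same input always yields the same output; nothing persists.
        rng = random.Random(hash(tuple(tokens)))  # fake "weights"
        weights = [rng.random() for _ in range(10)]
        total = sum(weights)
        return {t: w / total for t, w in enumerate(weights)}

    def sample(probs, rng):
        # All randomness lives here, in the sampler, not in the model.
        tokens, weights = zip(*probs.items())
        return rng.choices(tokens, weights=weights, k=1)[0]

    def generate(prompt, n_tokens, seed=0):
        tokens = list(prompt)
        rng = random.Random(seed)
        for _ in range(n_tokens):
            probs = llm(tokens)                # deterministic evaluation
            tokens.append(sample(probs, rng))  # stochastic choice
        return tokens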

There is no internal state that persists between tokens [1], so there can be no continuity of consciousness. If it's "alive" in some way, it's effectively killed after each token and replaced by a new lifeform. I don't see how consciousness can exist without the possibility of change over time. The input tokens (context) can't be enough to give it consciousness, because it has no way of knowing whether they were generated by itself or by a third party. The sampler mechanism guarantees this: it's always possible that an unlikely token was selected by the sampler, so to detect "thought tampering" it would have to simulate itself evaluating every possible partial context. That alone takes an unreasonable amount of compute, but it's actually worse: the introspection process would also affect the probabilities it generates, so it would have to simulate itself simulating itself, and so on recursively without bound.
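Using the same toy functions as above, the provenance problem looks like this: the only input is the token list, so a context the model wrote and a context someone else wrote (or edited) are processed identically:

    own = generate([1, 2, 3], 5, seed=42)  # context the "model" produced
    forged = list(own)                     # same tokens retyped by a third party
    forged[-1] = 9                         # ...or with one "thought" swapped

    assert llm(own) == llm(list(own))      # identical list -> identical output
    llm(forged)                            # evaluated exactly the same way;
                                           # nothing marks the edit as foreign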

It's conceivable that an LLM is conscious during training, but in that case the final weights are effectively its dead body, and inference is like Luigi Galvani poking the frog's legs with electrodes and watching them twitch.

[0] Assuming no race conditions in parallel implementations. llama.cpp is deterministic.

[1] Excluding the KV cache, which is only a speed optimization and doesn't affect results.


Replies

dagss yesterday at 1:34 PM

Thinking != consciousness

lbrandy last Monday at 9:24 PM

I have no idea how you can assert what is necessary/sufficient for consciousness in this way. Your comment reads like you believe you understand consciousness far more than I believe anyone actually does.

jdauriemma last Monday at 9:14 PM

I don't think the author is saying that LLMs are conscious or alive.
