Hacker News

xyzsparetimexyz · today at 12:58 AM

One thing I haven't seen brought up much is that LLMs are basically stateless. Being conscious requires the ability for internal state to change. The weights don't change at all; only the RNG seed and the input/output text do. We're not seriously arguing that the text itself is the conscious part, are we?
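(A toy sketch of that claim, if it helps: treat one inference step as a pure function of frozen weights, the context so far, and a seed. The next_token function and vocabulary here are made up for illustration, not a real model API.)

    import random

    # Toy model, not a real LLM: one sampling step as a pure function of
    # frozen "weights", the current context, and an RNG seed.
    VOCAB = ["the", "cat", "sat", "on", "mat"]

    def next_token(weights, context, seed):
        # weights are never mutated here, mirroring frozen model parameters;
        # the only varying inputs are the context and the seed
        rng = random.Random(str((seed, context)))
        scores = [weights.get(tok, 1.0) for tok in VOCAB]
        return rng.choices(VOCAB, weights=scores, k=1)[0]

    weights = {"cat": 3.0, "mat": 3.0}  # fixed once "training" is done
    context = ["the"]
    # Identical inputs give identical outputs: no hidden internal state
    # survives between calls.
    assert next_token(weights, context, 42) == next_token(weights, context, 42)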


Replies

Serenacula · today at 1:59 AM

Why exactly should consciousness require the ability for internal state to change? That seems like a fairly arbitrary requirement to me.

Even if we grant the requirement, from a certain perspective the state does change; otherwise each token output would be identical, and they are not.
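(Concretely, the state that does change between tokens is the growing context itself. A self-contained toy loop, with a made-up next_token standing in for one forward pass plus sampling:)

    import random

    # Toy decode loop: the weights are frozen, but each step is fed a
    # longer context, so the sampled tokens vary even with a fixed seed.
    def next_token(context, seed):
        # hypothetical stand-in for one LLM forward pass plus sampling
        rng = random.Random(str((seed, context)))
        return rng.choice(["the", "cat", "sat", "on", "mat"])

    context = ["the"]
    for _ in range(4):
        context.append(next_token(context, seed=0))  # the mutable state is the text
    print(context)  # deterministic for a fixed seed, yet tokens differ per step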

garciasn · today at 1:03 AM

LLMs are stateless with respect to recent interactions, but they do have long-term memory from their training, and thus act very much like someone suffering from Alzheimer's.

So, folks who suffer from some level of brain damage that leaves them without short-term memory are then not conscious?

I’m not arguing that LLMs are conscious, mind you; I just disagree that short-term memory loss outside of their context window should be the line.

E: double negatives are bad; my 8th grade English teacher would be disappointed.
