LLMs are stateless with respect to recent interactions, but do have long-term memory from their training and thus act very much like someone suffering from Alzheimer’s.
So, are folks who suffer from some level of brain damage that prevents them from forming short-term memories then not conscious?
I’m not arguing that LLMs are conscious, mind you; I just disagree that short-term memory loss outside of their context window should be the line.
E: double negatives are bad; my 8th grade English teacher would be disappointed.
> do have long-term memory from their training and thus act very much like someone suffering from Alzheimer’s.
Your 8th grade science teacher may be disappointed too. Drawing such analogies in unequivocal language ("very much like") disregards our limited understanding of LLMs, the pitfalls of analogizing computer systems to biological ones, and the complex nature of Alzheimer's disease (no, it is not just short-term memory loss, not even close; it also impairs, for example, the ability to interpret images).