> The fact is that a worm and I are both alive in a way the model is not. We seek self-preservation. We are changeable. We die. We reproduce and evolve.
Mutability should not automatically imply superiority, but either way, that's something a great many people are currently working very hard to change. I suspect it won't be long at all before the descendants of today's LLMs can learn as well as, or better than, we can.
Will you then concede that human consciousness isn't "special", or just move the goalposts with talk of the "soul" or some other unprovable intangible?
> In my mind a set of LLM weights is about as alive as a virus (and probably less so).
I wonder what the LLMs would think about it if we hadn't intentionally prevented them from thinking about it.
I don't think human consciousness is all that special. I think the worm probably thinks worm thoughts. We now know that cats and dogs recognize a vocabulary of human words and can even express their thoughts to us by pressing buttons to form words they can understand but not speak. I think the soul is just the part of our essence that isn't our body: the imprint we leave on the world by touching it, by being a part of it.
Disturbingly, that system of beliefs suggests that, without being alive or being able to think, an AI could have a "soul" in the very same sense that I think a person or a worm does.