
gehsty · today at 8:51 AM · 3 replies

LLMs are word prediction engines.

They clearly are not conscious; they are just guessing what words should come next.
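
A minimal sketch of what "word prediction" means here, assuming a toy vocabulary and a made-up scoring function standing in for a real neural network (none of these names come from the thread):

```python
import math
import random

# Toy next-token predictor: score each candidate word given the context,
# turn the scores into probabilities with a softmax, and sample the next word.
VOCAB = ["the", "cat", "sat", "on", "mat", "."]

def score(context, candidate):
    # Hypothetical scorer: a real LLM replaces this with a network that maps
    # the whole context to one logit per vocabulary word.
    return len(set(context) & {candidate}) + random.random()

def softmax(logits):
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def generate(prompt, n_tokens=5):
    context = prompt.split()
    for _ in range(n_tokens):
        logits = [score(context, w) for w in VOCAB]
        probs = softmax(logits)
        # "Guessing what word should come next": sample from the distribution.
        context.append(random.choices(VOCAB, weights=probs, k=1)[0])
    return " ".join(context)

print(generate("the cat"))
```

The loop is the whole trick: the model only ever answers "given these words, which word is likely next?", and generation is just repeating that question.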


Replies

thebruce87m · today at 10:54 AM

> They clearly are not conscious

Consciousness is emergent. A human is not conscious by our definition until the moment they are. How will we be able to identify the singularity when it comes? I feel like this is what the article is really addressing.

> LLMs are word prediction engines

Humans can do this too, so what are the missing parts for consciousness? Close a few loops on the learning pipeline and we might be there.

charlie90 · today at 10:34 AM

The human brain is an electrical signal prediction machine.

Anything that looks like intelligence will look like a prediction machine, because the alternative is logic being hardcoded a priori.

zwischenzug · today at 10:02 AM

How do we know that that isn't essentially how our minds work?