
tracerbulletx last Thursday at 7:18 PM

Obviously LLMs are missing many important properties of the brain, like spatial, temporal, and chemical factors, as well as many interconnected feedback loops between different types of neural networks that go well beyond what LLMs do.

Beyond that, they are the same thing: signal input -> signal output.

I do not know what consciousness actually is, so I will not speak to what it would take for a simulated intelligence to have one.

Also, I never used the word "believes"; I said "convinced." If it helps, I can say "acted in a way as if it had high confidence in its output."


Replies

cratermoon last Thursday at 9:31 PM

Obviously sand is missing many important properties of integrated circuits, like semiconductivity, electrical interconnectivity, transistors, and p-n junctions.

Beyond that, they are the same thing.