Hacker News

myrmidon · yesterday at 5:36 PM

I'm not suggesting we pursue AGI via Excel; this is a hypothetical for a reason. The (low) technical feasibility does not really matter, and if you want to base your argument on it, you are basically playing the "god of the gaps" game, which is a weak position IMO.

My point is that dismissing possible machine consciousness with "it's just a spreadsheet/statistics/linear algebra" misses a critical step: those dismissals don't demonstrate that human consciousness is anything more than an emergent property achievable by linear algebra.

If you want human minds to be "unsimulatable", then you need some essential core logic that cannot be simulated on a Turing machine, and physics is not helping with that.

> I've done that evaluation with LLMs and they're definitely not conscious.

What is your definition of "consciousness" here? Are you confident that you are not gatekeeping current machine intelligence by demanding somewhat arbitrary capabilities in your definition of consciousness that are somewhat unimportant? E.g. memory or online learning: if a human were unable to form long-term memories or learn anything new, could you confidently call them "non-conscious" as well?


Replies

miyoji · yesterday at 6:13 PM

I'm not dismissing possible machine consciousness. I'm saying that no current machines have consciousness.

> If you want human minds to be "unsimulatable", then you need some essential core logic that cannot be simulated on a Turing machine, and physics is not helping with that.

You don't have a proof of possibility either: you have no idea how a brain works, and you're just postulating that, in principle, a computer can do the same thing. Okay, in principle, I agree. What about in practice?

> Are you confident that you are not gatekeeping current machine intelligence by demanding somewhat arbitrary capabilities in your definition of consciousness that are somewhat unimportant?

Yes, I'm quite sure. Are you trying to argue that current LLMs have consciousness?