Hacker News

smokedetector1 10/01/2024 · 3 replies

How does this explain away the mystery of consciousness?


Replies

visarga 10/01/2024

We have a dual search loop: outside, we act on our experience to gain new data; inside, we compress that data and update our experience. We search for experience and we search for understanding. Acting is a search for new insights; learning is a search for error minimization.
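
To make the framing concrete, here is a minimal, purely illustrative sketch of that dual loop (a toy of my own, not anyone's actual model): the outer loop acts on a noisy environment to gather data, the inner loop updates a compressed estimate by minimizing prediction error.

    import random

    class Environment:
        """A hidden quantity the agent can only sample noisily."""
        def __init__(self, true_value=3.0):
            self.true_value = true_value

        def observe(self):
            return self.true_value + random.gauss(0, 0.5)

    class Agent:
        def __init__(self):
            self.estimate = 0.0  # the agent's compressed "experience"

        def act(self, env):
            # Outer search: interact with the world to obtain new data.
            return env.observe()

        def learn(self, observation, lr=0.1):
            # Inner search: reduce prediction error by updating the estimate.
            error = observation - self.estimate
            self.estimate += lr * error
            return error

    env, agent = Environment(), Agent()
    for _ in range(200):
        obs = agent.act(env)   # search for new experience
        agent.learn(obs)       # search for understanding (error minimization)
    print(round(agent.estimate, 2))  # converges near the hidden value, ~3.0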

I think we encode our experiences relationally, like neural networks do. We relate new experiences to past experiences, and this creates a high-dimensional semantic space. Any concept is a point or a region in this space. The space has consistent semantics, which leads to the unified experience. We can relate anything to anything in this space without a central understander. Encoding your own experiences creates a first-person perspective from third-person data, which has always been a "hard" problem to explain in philosophy.
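
As a toy illustration of "concepts as points in a relational space" (the vectors below are invented for the example, not learned from anything), relatedness can fall out of the geometry alone, with no central understander inspecting the points:

    import math

    # Hand-made embeddings: each concept is just a point in a small vector space.
    concepts = {
        "dog":    [0.9, 0.8, 0.1],
        "wolf":   [0.8, 0.9, 0.2],
        "banana": [0.1, 0.2, 0.9],
    }

    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
        return dot / norm

    def most_related(query):
        # Similarity between points does the relating; nothing "reads" the vectors.
        others = {k: v for k, v in concepts.items() if k != query}
        return max(others, key=lambda k: cosine(concepts[query], others[k]))

    print(most_related("dog"))  # -> "wolf"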

The serial action bottleneck adds to the illusion of centralization. But it is still a distributed process; no neuron is conscious or understands anything by itself. And even in society, no human could individually recreate even 1% of human culture. We are not that smart on our own. We should always look at the larger context in which we develop, not just the brain.

Search has the virtue of not hiding the environment; it is social and distributed, unlike more personal concepts like consciousness, intelligence, and understanding. But as I said above, even inside the brain there is nothing but distributed processing, no homunculus.

I think the core of my argument is this: there is no centralized consciousness, understanding, or intelligence; they are distributed processes, acting across neurons in the brain and across people in society. It seems like a hard pill to swallow: if that is true, then there is also no centralized understanding or truth.

patcon 10/01/2024

One take: Chasing the Rainbow: The Non-conscious Nature of Being (2017) https://www.frontiersin.org/articles/10.3389/fpsyg.2017.0192...

You might get a kick out of this paper (though some may find its proposal a bit bleak, I think there's a way to integrate it without losing any of the sense of wonder in the experience of being alive :) )

It analogizes conscious experience to a rainbow, "which accompanies physical processes in the atmosphere but exerts no influence over them".

> Though it is an end-product created by non-conscious executive systems, the personal narrative serves the powerful evolutionary function of enabling individuals to communicate (externally broadcast) the contents of internal broadcasting. This in turn allows recipients to generate potentially adaptive strategies, such as predicting the behavior of others, and underlies the development of social and cultural structures that promote species survival. Consequently, it is the capacity to communicate to others the contents of the personal narrative that confers an evolutionary advantage—not the experience of consciousness (personal awareness) itself.

082349872349872 10/01/2024

Consciousness is just the result of a search for "a more or less linear story that makes sense of the way I act and react".

We're good at predicting the states of mind of others (helpful when trying to exploit limited resources, and very helpful for either predator or prey), and we can cheaply gather a lot of data on ourselves. So why should the capability for inferring states of mind not, as a side effect, also provide us with our own inferred state of mind: our own "I"?
