
stevenhuang · yesterday at 10:34 PM

We may not need to go down to that level.

For the qualities we care about, it may turn out that we don't need to simulate matter perfectly. We may not need to concern ourselves with the fractal complexity of reality if we identify the right higher-level abstractions to operate with. This phenomenon is known as causal emergence.

> That is, a macroscale description of a system (a map) can be more informative than a fully detailed microscale description of the system (the territory). This has been called “causal emergence.”

https://www.mdpi.com/1099-4300/19/5/188
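
To make the "map more informative than the territory" claim concrete, here is a minimal sketch of the effective-information (EI) measure the linked paper uses: EI is the average KL divergence (in bits) between each state's transition distribution and the system's average transition distribution, equivalently the mutual information between a uniform intervention and the next state. The toy 4-state chain below is my own illustration, not an example taken from the paper.

```python
import numpy as np

def effective_information(W):
    """EI of a Markov chain: the mean KL divergence (bits) of each
    state's transition row from the average row, i.e. the mutual
    information between a uniform intervention and the next state."""
    avg = W.mean(axis=0)
    kl = lambda p, q: sum(pi * np.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return np.mean([kl(row, avg) for row in W])

# Micro level: states A, B, C scatter uniformly among themselves;
# state D maps deterministically to itself.
micro = np.array([
    [1/3, 1/3, 1/3, 0.0],
    [1/3, 1/3, 1/3, 0.0],
    [1/3, 1/3, 1/3, 0.0],
    [0.0, 0.0, 0.0, 1.0],
])

# Macro level: coarse-grain {A,B,C} -> alpha and {D} -> beta.
# Both macro transitions are deterministic.
macro = np.array([
    [1.0, 0.0],
    [0.0, 1.0],
])

print(f"EI(micro) = {effective_information(micro):.3f} bits")  # ~0.811
print(f"EI(macro) = {effective_information(macro):.3f} bits")  # 1.000
```

The 2-state macro description carries more effective information than the 4-state micro description it was built from, which is causal emergence in miniature.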

From an HN discussion a while ago:

https://www.quantamagazine.org/the-new-math-of-how-large-sca...

> A highly compressed description of the system then emerges at the macro level that captures those dynamics of the micro level that matter to the macroscale behavior — filtered, as it were, through the nested web of intermediate ε-machines. In that case, the behavior of the macro level can be predicted as fully as possible using only macroscale information — there is no need to refer to finer-scale information. It is, in other words, fully emergent. The key characteristic of this emergence, the researchers say, is this hierarchical structure of “strongly lumpable causal states.”
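
The "strongly lumpable" condition mentioned above is, at base, the classic lumpability criterion for Markov chains (Kemeny and Snell): a partition of states yields a well-defined macro Markov chain exactly when every state in a block has the same total transition probability into each block. A minimal check, reusing the toy matrix from the sketch above (again my own example, not the article's):

```python
import numpy as np

def is_strongly_lumpable(W, partition):
    """True iff every state in each block of the partition has the same
    aggregate transition probability into every block, so the
    coarse-grained process is itself a Markov chain."""
    for block in partition:
        lumped = [[W[s, other].sum() for other in partition] for s in block]
        if not all(np.allclose(row, lumped[0]) for row in lumped):
            return False
    return True

micro = np.array([
    [1/3, 1/3, 1/3, 0.0],
    [1/3, 1/3, 1/3, 0.0],
    [1/3, 1/3, 1/3, 0.0],
    [0.0, 0.0, 0.0, 1.0],
])

print(is_strongly_lumpable(micro, [[0, 1, 2], [3]]))  # True: {A,B,C}/{D} works
print(is_strongly_lumpable(micro, [[0, 3], [1, 2]]))  # False: grouping A with D fails
```

When the condition holds, the macro level is self-contained: its dynamics can be predicted from macro-state information alone, which is the "fully emergent" property the quote describes.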


Replies

cess11 · today at 11:22 AM

Who are "we", and why would I care about them here?

There are situations where approximations are good enough for simulations, sure, but that's not the subject here.

I reject the idea that chatbots have feelings or intellect because they output text similar to what a human might write in some hypothetical situation or other. To the extent that they can have those properties, it is to the same extent that Clark Kent can, if one were to accept such a conflationary and confused discourse.