I wonder if there’s a concept akin to Shannon entropy that dictates the level of detail a simulation can provide given some ratio of bits to something. Then again, presumably any number of bits could be simulated given enough time.
An explanation of the observer effect may be that the universe is lazily evaluated at the moment of observation. Outside of that experienced reality, it might as well all be a cloud of latent possibilities: rough outlines and low-res detail, just enough for a plausible simulation.
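As a loose illustration only (a toy sketch with hypothetical names, not a claim about physics), Haskell’s lazy evaluation behaves this way: a binding is just a deferred thunk until something actually inspects it, and only at that point does the value get computed.

```haskell
-- Toy sketch of lazy evaluation (hypothetical names; not a physics model).
-- `distantGalaxy` remains an unevaluated thunk until the final print forces it.

expensiveDetail :: Int -> Integer
expensiveDetail n = fibs !! n            -- stand-in for arbitrarily costly "detail"
  where fibs = 0 : 1 : zipWith (+) fibs (drop 1 fibs)

main :: IO ()
main = do
  let distantGalaxy = expensiveDetail 100000   -- defined, but nothing computed yet
  putStrLn "No observation yet; no detail has been rendered."
  print distantGalaxy                          -- "observation" forces evaluation here
```

Until that last line runs, the runtime holds only a recipe for the value, which is roughly the sense of “latent possibilities” above.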