
New mathematical framework reshapes debate over simulation hypothesis

76 points by Gooblebrai · yesterday at 11:21 AM · 97 comments

Comments

A_D_E_P_T · yesterday at 12:05 PM

Oh man, Stephen Wolfram and Jürgen Schmidhuber are probably fuming at the fact that this is called a "new" mathematical framework. It's all very old, and quite conventional, even popular -- not exactly the road not taken.

What the author did was use the Physical Church-Turing thesis and Kleene's second recursion theorem to show that: (1) if a universe’s dynamics are computable (PCT), and (2) the universe can implement universal computation (RPCT), then (3) the universe can simulate itself, including the computer doing the simulating.

That's basically all. And thus "there would be two identical instances of us, both equally 'real'." (Two numerically distinct processes are empirically identical if they are indistinguishable. You might remember this sort of thing from late 20th c. philosophy coursework.)
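
(Toy illustration of the Kleene step; my own sketch, not anything from the paper. The second recursion theorem is the same trick that makes quines possible: a program can be constructed with access to its own description.

    # A quine: a program whose output is its own source code.
    # Kleene's second recursion theorem guarantees such self-referential
    # programs exist for any computable transformation of "own source".
    s = 's = %r\nprint(s %% s)'
    print(s % s)

Hand that self-description to an interpreter instead of print, and the program runs a copy of itself; the paper's universe-level self-simulation is that construction writ large.)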

He also uses Rice’s theorem (old) to show that there is no uniform measure over the set of "possible universes."

It's all very interesting, but it's more a review article than a "new mathematical framework." The notion of a mathematical/simulated universe is as old as Pythagoras (~550 BC), and Rice, Church-Turing, and Kleene are all approaching the 100-year mark.

CuriouslyC · yesterday at 12:34 PM

The simulation hypothesis takes something reasonable, that reality is "virtual," and runs it into absurdity.

If the universe isn't "real" in the materialist sense, that does not imply that there's a "real" universe outside of the one we perceive, nor does it imply that we're being "simulated" by other intelligences.

The path of minimal assumptions from reality not being "real" is idealism. We're not simulated, we're manifesting.

flufluflufluffy · yesterday at 10:00 PM

The whole “simulation hypothesis” thing has always irked me. To me, the question of whether our universe was [“intentionally” “created” by some other “being(s)”] vs [“naturally” happened] is meaningless. Whatever it was on the other side is way too insanely unfathomable to be classified into those 2 human-created ideas. Ugh the whole thing is so self-centered.

daoboy · yesterday at 1:10 PM

I always feel like these frameworks rely on a semantic sleight of hand that sounds plausible on the surface, but when you drill down a bit they render words like 'simulation', 'reality', or 'truth' as either unintelligible or trite, depending on how you define them.

GistNoesis · yesterday at 2:42 PM

The problem of computers is the problem of time: how to obtain a consistent causal chain!

The classical, naive way of obtaining a consistent causal chain is to put the links down one after another, following the order defined by simulation time.

The more interesting question is: can it be done another way? With the advance of generative AI and things like diffusion models, it's provably possible in theory (universal distribution approximation). It's not so much simulating a timeline as sampling the whole timeline while enforcing its physics-law self-consistency from both directions of the causal graph.

In toy models like the Game of Life, we can even have recursive simulation: https://news.ycombinator.com/item?id=33978978 (unlike section 7.3 of this paper, where the computers of the lower simulations are started in ordered time).

In other toy models, you can use a diffusion model to learn and map the chaotic distribution of all possible three-body-problem trajectories.

Although sampling can be simulated, doing it efficiently requires exploring all the possible universes simultaneously, as in QM (which we can do by exploring only a finite number of them, bounding the neighboring-universe region according to the question we are trying to answer via the Lipschitz continuity property).

Sampling lets you bound maximum computational usage and be sure of reaching your end-time target, but at the risk of not being perfectly physically consistent. Simulating, on the other hand, carries the risk of the lower simulations siphoning off computational resources and preventing simulation time from reaching its end-time target, but whatever you do manage to compute is guaranteed consistent.

Sampled bottled universes are ideal for answering questions like how many years a universe needs before life can emerge, while simulated bottled universes are like a box of chocolates: you never know what you're going to get.

The question being: can you tell which bottle you are currently in, and which bottle would you rather get?
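
To make the two regimes concrete, here's a toy sketch in Python, with Rule 110 standing in for physics and rejection sampling standing in for a learned diffusion model; all names and sizes are my own assumptions:

    import random

    def step(row):
        # One synchronous update of elementary cellular automaton Rule 110
        # on a ring: the "physics law" of this toy universe.
        n = len(row)
        return tuple(
            (110 >> (4 * row[(i - 1) % n] + 2 * row[i] + row[(i + 1) % n])) & 1
            for i in range(n)
        )

    def simulate(row, t):
        # Ordered-time simulation: lay down the causal chain link by link.
        timeline = [row]
        for _ in range(t):
            timeline.append(step(timeline[-1]))
        return timeline

    def sample(n, t, tries=200000):
        # Whole-timeline sampling: draw complete histories at random and
        # keep only the self-consistent ones (rejection sampling stands in
        # for a learned generative model). Bounded cost, no guarantee.
        for _ in range(tries):
            tl = [tuple(random.randint(0, 1) for _ in range(n))
                  for _ in range(t + 1)]
            if all(step(a) == b for a, b in zip(tl, tl[1:])):
                return tl
        return None

simulate((0, 0, 0, 1), 10) always reaches its end time and is consistent by construction; sample(4, 2) has a hard cost bound but may come back empty: exactly the trade-off above.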

Beijinger · yesterday at 2:29 PM

Konrad Zuse was a German pioneer in computing, best known for building the Z3 in 1941—the world's first functional programmable digital computer. Later in his career, he explored profound philosophical and theoretical ideas about the nature of the universe.

Rechnender Raum (literally "Computing Space" or "Calculating Space") is the title of his groundbreaking 1969 book (published in the series Schriften zur Datenverarbeitung). In it, Zuse proposed that the entire universe operates as a vast discrete computational process, akin to a giant cellular automaton. He argued that physical laws and reality itself emerge from digital, step-by-step computations on a grid of discrete "cells" in space, rather than from continuous analog processes as traditionally assumed in physics.

This idea challenged the prevailing view of continuous physical laws and laid the foundation for what we now call digital physics, pancomputationalism, or the simulation hypothesis (the notion that reality might be a computation, possibly running on some underlying "computer"). Zuse's work is widely regarded as the first formal proposal of digital physics, predating similar ideas by others like Edward Fredkin or Stephen Wolfram.

EdgeCaseExist · yesterday at 12:48 PM

The author of the article on the site is the author of the paper!

kazinator · yesterday at 11:50 PM

> The simulation hypothesis — the idea that our universe might be an artificial construct running on some advanced alien computer — has long captured the public imagination.

Right; that's the feeble public imagination. What captures my imagination is the idea that the existence of the rules alone is enough to obtain the universe; no simulator is required.

We can make an analogy to a constant like pi. No division of a circumference by a diameter has to take place in order to prop up the existence of pi.

The requirement for a simulator just punts the rock down the road: in what universe is that simulator, and what simulates that? It's an infinite regress. If there is no simulator, that goes away.

If certain equations dictate that you exist and have experiences, then you exist and have experiences in the same way that pi exists.

tediousgraffit1 · today at 4:55 AM

I Don't Know, Timmy, Being God Is a Big Responsibility https://qntm.org/responsibilit

therobots927 · yesterday at 4:30 PM

“Wolpert shows that this isn’t required by the mathematics: simulations do not have to degrade, and infinite chains of simulated universes remain fully consistent within the theory.”

How is this consistent with the second law of thermodynamics? If there is one universe containing an infinite number of simulations (some of which simulate the base universe), wouldn’t there be a limit to how much computation could be contained? By its very nature, a chain of simulations would grow exponentially with time, rapidly accelerating heat death. That may not require the simulations to degrade, but it puts a hard limit on how many could be created.
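
Back-of-the-envelope version of that limit (my numbers, purely illustrative; 1e120 is roughly Lloyd's estimate of the total operations the observable universe can perform):

    # How deep can a chain of simulations go on a fixed budget?
    # Both constants are illustrative assumptions, not measurements.
    B = 1e120  # total operations available to the base universe
    f = 0.1    # fraction of compute each level hands to its child

    depth, ops = 0, B
    while ops >= 1.0:
        depth += 1
        ops *= f
    print(depth)  # ~120 levels before the budget runs out

The math of an infinite chain stays consistent, but a finite budget still caps the usable depth.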

empiricus · yesterday at 2:33 PM

Trying to read the paper... I guess if you ignore the difference between finite- and infinite-tape Turing machines, and if all physical constraints are outside the scope of the paper, then it is easy to prove the universe can simulate itself.

qingcharles · today at 2:26 AM

Simulating a/the universe and simulating the universe at or above realtime are also two separate things.

A non-realtime simulation would allow you certain solutions (such as perfectly recreating a past state of the current universe), but might not allow you to practically see a future state.
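
The gap is just a ratio, but the numbers get brutal fast (trivial sketch, figures made up):

    # Wall-clock cost of reaching simulated time t_sim at rate r
    # (simulated seconds per real second); both inputs illustrative.
    def wall_clock(t_sim, r):
        return t_sim / r

    print(wall_clock(3.6e3, 0.5))   # replaying a past hour at half speed: 2 real hours
    print(wall_clock(3.2e16, 0.5))  # a billion simulated years: ~2 billion real years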

quantum_state · yesterday at 12:52 PM

Hope folks involved in this type of exploration have it clear in mind that what they are reasoning about is strictly a model of the real world. It’s far from obvious that nature follows anything remotely computational.

kpga · yesterday at 9:07 PM

"Example 1. ... After this you physically isolate isolate your laptop, from the rest of the Universe, and start running it..."

However, there is no way "you can physically isolate isolate your laptop, from the rest of the Universe", so doesn't that refute this example (at least)?

shtzvhdx · yesterday at 2:17 PM

This all assumes there's no computation beyond a Turing machine, right? Therefore, this assumes reality is a simulation on a finite set of rationals?

So, as long as one believes in the continuum, this is just toying around?

nrhrjrjrjtntbt · yesterday at 12:30 PM

Like running Kubernetes in a Docker container.

mgaunard · yesterday at 12:49 PM

It starts with the assumption that the simulation would reproduce the universe perfectly; this eliminates a lot of possibilities.

Many would expect the parent universe to be more sophisticated, potentially with more dimensions, which we can only glimpse through artifacts of the simulation.

skeledrew · yesterday at 4:43 PM

A universe is a function. It only makes sense that a function can call other functions, including itself, ad infinitum. And a function may be called in the same or a different thread.
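
A literal-minded sketch of that picture (a toy of my own, nothing more):

    import threading

    def universe(depth, max_depth=3):
        # A universe-as-function that calls other universes, including
        # copies of itself; max_depth is an arbitrary cutoff so the toy halts.
        if depth >= max_depth:
            return
        universe(depth + 1)  # recursive call in the same thread
        child = threading.Thread(target=universe, args=(depth + 1,))
        child.start()        # ...or in a different thread
        child.join()

    universe(0)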

le-mark · yesterday at 2:04 PM

I wonder if there’s a concept akin to Shannon entropy that dictates the level of detail a simulation can provide given a ratio of bits to something. Although presumably any level of detail could be simulated given more time.

boomskats · yesterday at 12:15 PM

Zero cost abstractions! I'd almost be interested in Bostrom's inevitable physics-based counter (if he wasn't such a racist bellend).

thegrim000 · yesterday at 5:35 PM

Once again, discussion around the simulation hypothesis that for some reason assumes the simulating universe has the exact same laws of physics/reality as the simulated universe, and that the simulated universe can use its mathematics to describe/constrain the simulator universe. It makes no sense to me.

moi2388 · yesterday at 4:06 PM

Yeah, right. With infinite Turing machines, maybe. If it’s finite, it’s impossible to simulate something larger with the same fidelity.

croes · yesterday at 1:40 PM

Related?

> Consequences of Undecidability in Physics on the Theory of Everything

https://news.ycombinator.com/item?id=45770754

jonathanstrange · yesterday at 1:25 PM

Here is one thing I don't understand about these kinds of approaches. Doesn't a computational simulation imply that time is discrete? If so, doesn't this have consequences for our current best physical theories? I understand that the discreteness of time would be far below what can be measured right now, but AFAIK it would still make a difference for physical theories whether time is discrete or not. Or am I mistaken about that? There are similar concerns about space.

By the way, on a related note, I once stumbled across a paper that argued that if real numbers were physically realizable in some finite space, then that would violate the laws of thermodynamics. It sounded convincing, but I also lacked the physics knowledge to evaluate the thesis.

bobbyschmidd · yesterday at 2:07 PM

Someone did another 'Kleene-Turing' on the whole issue with "the origin"?

bad bad not good.

morpheos137 · yesterday at 1:32 PM

These models get things backwards. The universe is a wave function in logic space. It appears discrete and quantized because integers composed of primes are logically stable, information-entropy-minimal nodes. In other words, the universe is the way it is because it depends on math. Math does not depend on the universe. Logic is its own "simulation."

Math does not illuminate physics; rather, physics illuminates math. This can be shown by the construction of a filter that cleanly sorts prime numbers from composites not by trial division but by analysis of the entropic harmonics of integers. In other words, what we consider integers are not fundamental but rather emergent properties of the minimal subjunctive of superposition of zero (non-existence) and infinity (anything that is possible).

By ringing an integer like a bell, according to the template provided by the zeta function, we can find primes and factor from spectral analysis without division. Just as integers emerge from the wave as stable nodes, so do quanta in the physical isomorphism. In other words, both integers and quanta are emergent from the underlying wave that is information in tension between the polarity of non-existence and existence. So what appears discrete or simulated is actually an emergent phenomenon of the subjunctive potential of information constrained by the two poles of possibility.

raverbashing · yesterday at 1:05 PM

We can't even run Docker inside Docker without making things slower; the simulation hypothesis is frankly ridiculous.

mw67 · yesterday at 12:45 PM

Funny that people still call that the "simulation hypothesis". At some point they should try some past-life regressions or out-of-body experiences (astral projection). Then they'll know for sure what this reality is about.
