You've stumbled upon Chaos Theory (https://en.m.wikipedia.org/wiki/Chaos_theory), which studies chaotic systems — systems characterised by extreme sensitivity to initial conditions (see weather prediction, the double pendulum, etc.).
Some problems are so sensitive to initial conditions that they don't admit the kind of predictable solutions regular physics problems do — meaning that a difference at the 20th decimal place of an initial value can produce massively different outputs. Lorenz's discovery of this is interesting because he was working on weather modelling, and it's a clear example of the trouble with chaotic systems. He was running simulations of weather systems with several fixed initial variables (temperature, wind speed, etc.) and observing how the system evolved over a few hours. He realised that after what amounted to a typo on a very distant decimal of a single parameter, the system modelled the complete opposite of what he had seen in the previous run (I think it forecast a storm where it had previously said sunny day), even though the two values would have been "equal" within the precision of the measuring equipment. And that's to say nothing of getting data clean and precise enough for such models, which is practically impossible (see the observer effect, among other causes). Garbage in, garbage out.
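You can see this effect in miniature with the logistic map, a textbook one-line chaotic system (this is just an illustrative sketch — the values of x0, the perturbation size, and the step count are arbitrary choices, not anything from Lorenz's actual model):

```python
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map x -> r*x*(1-x), chaotic at r = 4."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-12)  # a "typo" far down the decimals

# The tiny initial difference grows roughly exponentially until the
# two trajectories are completely decorrelated.
for n in (0, 10, 30, 50):
    print(f"step {n:2d}: |difference| = {abs(a[n] - b[n]):.3e}")
```

The difference starts at 10^-12 and roughly doubles each step, so within a few dozen iterations the two runs have nothing to do with each other — the same thing that happened to Lorenz's forecast.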
All this to say that problems in this sphere quickly become intractable, and modelling precisely how they evolve over time is impossible.
I can recommend James Gleick's Chaos: Making a New Science for an overview for the layperson.