Okay.
Go write an operating system and suite of apps with global memory and no protections. Why are we wasting so much time on abstractions like processes and objects? Just let everyone read and write from the giant Turing machine.
Embedded systems that EEs code for are like this. I have to explicitly opt into processes and objects in Keil RTX. I also get to control memory layout.
Abstraction layers are terrible when you need to understand 100% of the code at all times. Doesn't mean they're not useful.
Heck, a language for directly implementing mathematical rules about system behaviour in code already exists. It's called MATLAB Simulink.
DOS, early Windows, and early MacOS worked more or less exactly that way. Somehow, we all survived.
Easy; endlessly big little numbers. "Endless" until the machine runs out of real memory addresses anyway.
You all really think engineers at Samsung, Nvidia, etc., whose job it is to generalize software into mathematical models, have not considered this?
We need a layer of abstraction, not Ruby, Python, Elixir, Rails, Perl, Linux, Windows, etc., ad nauseam, ad infinitum... each with unique and computationally expensive (energy-wasting) parsing, serializing, and deserializing rules.
Mitigation of climate change is a general concern for the species. Specific concerns of software developers who will die someday anyway get to take a back seat for a change.
Yes AI uses a lot of electricity but so does traditional SaaS.
Traditional SaaS will eventually be replaced with more efficient automated systems. We're in a transition period.
It's computationally efficient to just use geometry[1], which, given enough memory, can be shaped to avoid the collisions you are concerned with.
Your only real concern is obvious self selection driven social conservatism. "Don't disrupt me ...of all people... bro!"
[1] https://iopscience.iop.org/article/10.1088/1742-6596/2987/1/...
Engineers value different things. It's why I'm loath to maintain engineer-written code.
Let the downvotes commence!
> "Why are we wasting so much time on abstractions like .. objects?"
Aside: earlier this year Casey Muratori gave a 2.5-hour conference talk on this topic: why are we using objects the way they are implemented in C++ et al., with class hierarchies and inheritance and objects representing individual entities? "The Big OOPs: anatomy of a 35 year mistake"[1].
He traces programming history back to Stroustrup learning from Simula and Kristen Nygaard, back to C.A.R. Hoare's paper on records, back to the Algol 68 design committee, back to Douglas T. Ross's work in the 1950s. From Ross at MIT in 1960 to Ivan Sutherland working on Sketchpad at MIT in 1963, and both chains influencing Alan Kay and Smalltalk. Where the different ideas in OOP came from, how they came together through which programming languages, who was working on what, and when, and why. It's interesting.
[1] https://www.youtube.com/watch?v=wo84LFzx5nI