> Converting instructions to code is essential complexity
I don't agree with that. If I want to add two numbers I'd like to write `a = b + c`. I do not want to write the machine code that effects the same result on whatever computer architecture I'm targeting. Precisely _how_ one adds two numbers is accidental complexity. Whether they need to be added, and which numbers should be added, are the essential complexity.
Fortran removed that accidental complexity and left the essential stuff in place. There were no fuzzy lines.
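To make that concrete, here's a minimal sketch (the variable names and values are arbitrary, and the machine-level details mentioned in the comments are illustrative, not any particular compiler's output):

```fortran
program add_two
  implicit none
  integer :: a, b, c

  b = 2
  c = 3

  ! The essential part: these two numbers should be added.
  a = b + c

  ! The accidental part is everything the compiler handles for us here:
  ! picking registers, selecting an add instruction for the target
  ! architecture, storing the result. None of that appears above.

  print *, a
end program add_two
```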
Without a method for doing the work, you won't be able to do the work. Is that not the definition of essential?
But the way you've stated it, whatever you happen to be pointing your microscope at is "essential", and everything else in the world is "inessential".