I taught myself to program on an 8-bit BBC Micro in the mid-80s by typing in BASIC listings. I understood BASIC quite well and could write my own structured BASIC programs, but machine code was always a bit out of reach. I would try to read books that started by demonstrating how to add, subtract and so on, but I couldn't see how that could build up to the more complicated stuff I could do in BASIC, like polling for input, playing sounds, or drawing characters on the screen. Only once I got an advanced user guide and discovered the operating system calls did it start to click with me: the complicated stuff was just arranging the right data in the right bits of memory or registers, then (essentially) calling a particular OS routine and saying 'here's the data you want'.
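
For example, printing a single character came down to something roughly like this (a sketch from memory, using the BBC's standard OSWRCH entry point at &FFEE; the exact details varied by routine, but the shape was always the same):

    LDA #65        \ put the ASCII code for 'A' in the accumulator
    JSR &FFEE      \ call OSWRCH, the OS routine that prints whatever is in A
    RTS            \ hand control back to the caller

Sound, graphics and input followed the same pattern: set up the right values or a parameter block, then JSR to the right entry point (OSBYTE at &FFF4, OSWORD at &FFF1, and so on).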
Yeah, the issue is that the pedagogy doesn't make it clear how to bridge the “calculator” side with the OS stuff. I had this issue when I was a kid. How does adding eventually make something draw on the screen? Of course, it doesn't; you need some hardware- or OS-specific information.