> necessary to compile
Um, no? Your experience is probably at least two decades after the time period in question. The more advanced versions of, for example, TRS-80 BASIC (part of this "microcomputer BASICs that all share a common set of bugs") did no more than tokenize, so `10 PRINT "Hello"` would be stored as a binary representation of the line number, a single-byte token for PRINT, then the literal characters `"Hello"`, and an end-of-line marker. Actually interpreting the code involved just reading it linearly; `GOTO linenumber` involved scanning the entire program in memory for that line number (and yes, people really did optimize things by putting GOTO and GOSUB targets earlier in the program so the interpreter would find them faster :-)
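A minimal sketch of that storage scheme and the linear GOTO scan, in C. The token values, byte layout, and the `find_line` helper are all hypothetical illustrations, not the actual TRS-80 encoding:

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical token values -- real interpreters used their own
 * assignments; these are illustrative only. */
#define TOK_PRINT 0x80
#define TOK_EOL   0x00

/* One stored line: 16-bit line number, then tokens and literals,
 * then an end-of-line marker. 10 PRINT "Hello" might be stored
 * roughly like this: */
static const uint8_t program[] = {
    10, 0,                 /* line number 10, little-endian 16-bit */
    TOK_PRINT,             /* single-byte token replaces the keyword */
    '"', 'H', 'e', 'l', 'l', 'o', '"',
    TOK_EOL,
    20, 0,                 /* line 20 ... */
    TOK_PRINT,
    '"', 'B', 'y', 'e', '"',
    TOK_EOL,
};

/* GOTO target lookup: scan linearly from the start of program memory
 * until the line number matches -- which is why putting frequently
 * used targets early in the program made the interpreter faster. */
static const uint8_t *find_line(uint16_t target) {
    const uint8_t *p = program;
    while (p < program + sizeof program) {
        uint16_t num = p[0] | (p[1] << 8);
        if (num == target)
            return p;              /* found: interpret from here */
        p += 2;                    /* skip the line number */
        while (*p != TOK_EOL)      /* skip to end of this line */
            p++;
        p++;                       /* skip the EOL marker itself */
    }
    return NULL;                   /* "undefined line" error */
}

int main(void) {
    const uint8_t *line = find_line(20);
    printf("line 20 %s\n", line ? "found" : "not found");
    return 0;
}
```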
Tokenizing it and interpreting the token stream is still a compilation process, even if the interpreter re-tokenized each line every time it executed it.
I was going to post this, but you beat me to it.
It's a VM of a sort, and the p-code the VM executes is tokenized input.
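To make the VM view concrete, here is a toy fetch-decode-execute loop over the same hypothetical token stream as above. The point is that the interpreter's main loop is a dispatch over one-byte tokens, exactly like a bytecode VM's dispatch over opcodes (again, token values and layout are illustrative assumptions, not any real BASIC's):

```c
#include <stdint.h>
#include <stdio.h>

#define TOK_PRINT 0x80   /* hypothetical token values, as above */
#define TOK_EOL   0x00

static void run_line(const uint8_t *p) {
    p += 2;                          /* skip the line number */
    while (*p != TOK_EOL) {
        switch (*p++) {              /* fetch-decode-execute */
        case TOK_PRINT:
            if (*p == '"') {         /* string-literal operand */
                p++;
                while (*p != '"')
                    putchar(*p++);
                p++;                 /* skip closing quote */
                putchar('\n');
            }
            break;
        default:
            /* other statement tokens would be handled here */
            break;
        }
    }
}

int main(void) {
    static const uint8_t line[] = {
        10, 0, TOK_PRINT, '"', 'H', 'e', 'l', 'l', 'o', '"', TOK_EOL,
    };
    run_line(line);                  /* prints: Hello */
    return 0;
}
```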