I've been writing compilers for 45 years now. Tokenizing is a big part of every textbook on compilers. To resolve expressions (which are recursive in nature) it would have had to do more than just tokenize. While that isn't hard at all, it is "parsing", which also qualifies it as a compiler.
I.e., the BASIC program was lexing and parsing. It's a compiler. A very simple one, sure, but a compiler.
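To make the distinction concrete, here's a toy sketch (my own Python, invented names, not the program under discussion): the tokenizer is one flat loop, but getting "1+2*(3+4)" right takes the mutually recursive parse_* functions below it.

```python
import re

TOKEN_RE = re.compile(r"\s*(\d+|[-+*/()])")

def tokenize(src):
    # Flat and non-recursive: just chops the string into tokens.
    tokens, pos = [], 0
    while pos < len(src):
        m = TOKEN_RE.match(src, pos)
        if not m:
            raise SyntaxError(f"bad character at {pos}")
        tokens.append(m.group(1))
        pos = m.end()
    return tokens

def parse_expr(tokens, i):
    # expr := term (('+'|'-') term)*
    val, i = parse_term(tokens, i)
    while i < len(tokens) and tokens[i] in "+-":
        op = tokens[i]
        rhs, i = parse_term(tokens, i + 1)
        val = val + rhs if op == "+" else val - rhs
    return val, i

def parse_term(tokens, i):
    # term := factor (('*'|'/') factor)*
    val, i = parse_factor(tokens, i)
    while i < len(tokens) and tokens[i] in "*/":
        op = tokens[i]
        rhs, i = parse_factor(tokens, i + 1)
        val = val * rhs if op == "*" else val // rhs  # integer division
    return val, i

def parse_factor(tokens, i):
    # factor := number | '(' expr ')'  -- the recursive case.
    if tokens[i] == "(":
        val, i = parse_expr(tokens, i + 1)
        return val, i + 1  # skip the closing ')'
    return int(tokens[i]), i + 1

print(parse_expr(tokenize("1+2*(3+4)"), 0)[0])  # 15, not 21
```

A tokenizer alone can't tell you that the multiplication binds tighter or that the parentheses group; the recursion in parse_factor -> parse_expr is what handles the nesting.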
Compilers generate code in another, usually lower-level, language, and do so after reading all of the code that could be executed. Interpreters (such as the BASIC interpreter we are discussing here) read only the part of the code that actually gets executed, and they typically call functions rather than generate code (never mind JIT). Tokenization prior to interpretation is technically an optional step (it's just an efficiency boost) and is not normally confused with compilation, even if there are some superficial similarities.
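For contrast, the interpreter shape I mean looks roughly like this (again a made-up Python sketch, not any real BASIC): tokenizing is a one-time crunch for speed, and the run loop just dispatches to handler functions. It never emits code, and it never even looks at lines it doesn't reach.

```python
PRINT, GOTO = 0x80, 0x81  # token bytes standing in for keywords

def crunch(line):
    # The optional tokenization pass: swap keyword text for one-byte
    # tokens so the run loop doesn't re-scan keyword strings every time.
    return line.replace("PRINT", chr(PRINT)).replace("GOTO", chr(GOTO))

def run(program):
    # program: {line_number: crunched line}. Only executed lines are read.
    pc = min(program)
    while pc is not None:
        stmt = program[pc]
        if stmt[0] == chr(PRINT):
            print(stmt[1:].strip().strip('"'))
            pc = next((n for n in sorted(program) if n > pc), None)
        elif stmt[0] == chr(GOTO):
            pc = int(stmt[1:])
        else:
            raise SyntaxError(f"?SYNTAX ERROR IN {pc}")

run({10: crunch('PRINT "HELLO"'),
     20: crunch("GOTO 40"),
     30: crunch("THIS LINE IS NEVER EXECUTED"),  # skipped, never inspected
     40: crunch('PRINT "DONE"')})
```

Note that line 30 would be a syntax error, but the interpreter never notices because the GOTO jumps over it. A compiler, which must read everything up front, would have rejected it.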
You of all people should know this, come on.
Yes, but tokenization on its own is not compilation, any more than whiskers are a cat just because a cat has them.
"Nobody" uses it that way, and language is defined by use.