I strongly dislike this choice of using all these symbols that don't exist on normal keyboards. I can't stand it; it seems very attention-seeking. Why not choose normal thingies that can be typed using the main interface we have with computers? This makes me mad, even.
But programs written in K are so beautiful and terse, they are unlike anything else I've seen. It feels like there is something about it we can't really comprehend, like this beauty could not be achieved by accident, like there is some fundamental truth in it. And maybe the same is true of APL.
> Why not choose normal thingies that can be typed using the main interface we have with computers?
Iverson answered this in his Turing Award acceptance lecture, which is literally linked in OP's article: https://www.eecg.utoronto.ca/~jzhu/csc326/readings/iverson.p...
You're free to disagree with him, but you need not wonder why!
APL predates ASCII by a couple of years.
It originally wasn't even intended as a programming language, but rather as a uniform mathematical notation in the style of Curry's combinators, though more practical for describing non-trivial algorithms.
So Iverson was working in an era where the expectation was that if you were typesetting a mathematical monograph, you'd already be doing things like swapping the type balls on your IBM Selectric typewriter for ones carrying math symbols.
It's not a choice you'd make today obviously, but it was entirely reasonable then.
As for why it persists, the simple answer is that APL fans like it that way. It's trivial to translate to some ASCII text representation. I think anyone strongly motivated to do that just switches to J or K, or, if even those are too weird, goes to NumPy or the like.
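For a concrete sense of how direct that translation is, here's the classic arithmetic-mean one-liner in APL next to its ASCII cousins in J and K. This is a sketch from memory, so exact spellings may vary by dialect; the NumPy equivalent is just np.mean(x).

    mean ← +/ ÷ ≢        ⍝ APL: tacit train, "sum divided by tally"
    mean =: +/ % #       NB. J: the same train, spelled in ASCII
    mean: {(+/x)%#x}     / K: explicit form; % is divide, # is count

The J line is essentially the APL train with character-for-character ASCII substitutions, which is roughly why a dedicated ASCII front end for APL itself never caught on as a separate project: J already is one.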