It's clear that the symbols want to have one meaning each for monadic and for dyadic use, but the same meaning can entail quite different execution and result types.
For example, `&` is monadic 'where' and dyadic 'min' (a logical extension of it being AND on bit-booleans), but this means you get different semantics depending on the argument type, even though they all capture the 'where'-ness:
1 3 ~ &0 1 0 1 / when applied to a list, gives the indices of true elements
`b`d ~ &`a`b`c`d!0 1 0 1 / when applied to a dict, gives the keys of the true entries
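For contrast, the dyadic case is plain itemwise min, which on 0/1 values coincides with AND; a minimal sketch in the same style (standard k behaviour, written the same way as the examples above):

2 ~ 2&3 / dyadic & on atoms is min
0 1 0 0 ~ 0 1 0 1&1 1 0 0 / on bit-booleans, min is AND, applied itemwise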
In both of the 'where' cases above, `x@&x` works, since `&x` yields indices appropriate for `x`, but what that indexing actually does has changed. In other languages these would be spelled very differently, and so they do look like an overload, but sometimes it is just a point of view.

As for why it's obvious: it's not, really, but it's no less obvious than the word `where`, and you have already learnt it, as it seems (to me at least) to be punned on the C syntax (same as `*`, which gives `first`).
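To make that concrete, a quick sketch reusing the values above (`x` and `d` are just illustrative names; the dict literal is the k4/q style used earlier):

x:0 1 0 1
1 1 ~ x@&x / &x gives 1 3; indexing keeps the true elements
d:`a`b`c`d!0 1 0 1
1 1 ~ d@&d / &d gives `b`d; indexing by key keeps the true values
1 ~ *1 2 3 / monadic * is first, the same kind of C pun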