I think the gain is very little. Almost every English word is one token, and the same goes for programming language keywords, so you're just replacing one keyword with another. The only gain in the given example is > instead of jsonify(), which is ~4 tokens.
Please check your idea against tiktokenizer
I've checked: you get a decrease from 36 to 30 tokens, but you lose human readability. Sounds like a poor trade