No, it's not the same thing with music latency. For one thing, music is an audio event whereas UI is a visual event, and we know that auditory and visual stimuli are processed differently.
And with music latency, you can hear where the latency happens in relation to the rest of the piece (be it rock, techno, or whatever style). You have a point of reference. That makes latency less of a reaction event and more of a placement event, i.e. you're not just reacting to the latency, you're noticing the offset relative to the rest of the music. That added context changes the perception significantly.
This also ignores the point that musicians have to train themselves to hear this offset. It's like any advanced skill, from a golf swing to writing code: it takes practice to get good at it.
So it's not the same. I can understand why people think it might be, but when you actually investigate this properly, you can see why DJs and musicians appear to have supernatural senses compared to ordinary reaction times: the two tasks simply aren't equivalent.