Hacker News

ahmism today at 12:32 PM

To the first point, I think the KL divergence is indeed symmetric in this case: it comes out to 0.4 * ln(0.4 / 0.6) + 0.6 * ln(0.6 / 0.4) no matter which direction you go.

Still, there's no avoiding the inherent asymmetry in KL divergence. To my mind, the best we can do is to say that from P's perspective, this is how weird the distribution Q looks.
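For what it's worth, the symmetry for this particular pair is easy to check numerically. A minimal sketch in Python (the `kl` helper is my own, not anything from the thread):

```python
from math import log

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats, for discrete distributions."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q))

P = [0.4, 0.6]  # the biased coin under discussion
Q = [0.6, 0.4]  # the same coin with the heads/tails probabilities swapped

# Both directions give 0.4*ln(0.4/0.6) + 0.6*ln(0.6/0.4) ≈ 0.0811 nats.
print(kl(P, Q), kl(Q, P))
```

The symmetry here is a coincidence of this particular pair: Q is just P with the two outcomes relabeled, so the two sums contain exactly the same terms.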


Replies

cubefox today at 1:45 PM

> To the first point, I think that the KL divergence is indeed symmetric in this case, 0.4 * ln(0.4 / 0.6) + 0.6 * ln(0.6 / 0.4) no matter which direction you go.

But my argument also works for any other probability distribution, e.g. P(heads)=0.5 vs Q(heads)=0.99.
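For a pair like that the two directions do come apart. A quick sketch, again with a hypothetical `kl` helper of my own:

```python
from math import log

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats, for discrete distributions."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q))

P = [0.5, 0.5]    # fair coin
Q = [0.99, 0.01]  # heavily biased coin

# D(P||Q): data from the fair coin looks very surprising under model Q,
# because tails is nearly impossible under Q -- about 1.61 nats.
# D(Q||P): data from the biased coin is only mildly surprising under the
# fair-coin model -- about 0.64 nats.
print(kl(P, Q), kl(Q, P))
```

So the symmetry in the 0.4/0.6 example really is special to that pair, and in general the order of arguments matters.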

> Still, there's no avoiding the inherent asymmetry in KL divergence.

I wasn't suggesting otherwise; I was talking about his interpretation.