To the first point, I think that the KL divergence is indeed symmetric in this case: you get 0.4 * ln(0.4 / 0.6) + 0.6 * ln(0.6 / 0.4) no matter which direction you go, since 0.6 = 1 − 0.4 and the two terms just swap places.
Still, there's no avoiding the inherent asymmetry in KL divergence. To my mind, the best we can do is to say that from P's perspective, this is how weird the distribution Q looks.
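To make the coincidence concrete, here's a quick sketch (the helper `kl_bernoulli` is my own name, not anything from the thread) checking both directions for the 0.4/0.6 coin:

```python
from math import log

def kl_bernoulli(p, q):
    # D(P || Q) for two coins with P(heads) = p, Q(heads) = q, in nats
    return p * log(p / q) + (1 - p) * log((1 - p) / (1 - q))

# Symmetric here only because q = 1 - p, so swapping p and q
# reproduces the same two terms in the opposite order.
print(kl_bernoulli(0.4, 0.6))  # ≈ 0.0811
print(kl_bernoulli(0.6, 0.4))  # ≈ 0.0811, identical
```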
> To the first point, I think that the KL divergence is indeed symmetric in this case: you get 0.4 * ln(0.4 / 0.6) + 0.6 * ln(0.6 / 0.4) no matter which direction you go, since 0.6 = 1 − 0.4 and the two terms just swap places.
But my argument also works for any other pair of distributions, e.g. P(heads) = 0.5 vs Q(heads) = 0.99, where the two directions do come out different.
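For that pair the asymmetry is easy to verify numerically (again using a throwaway `kl_bernoulli` helper of my own, not anything defined in the thread):

```python
from math import log

def kl_bernoulli(p, q):
    # D(P || Q) for two coins with P(heads) = p, Q(heads) = q, in nats
    return p * log(p / q) + (1 - p) * log((1 - p) / (1 - q))

# A fair coin vs. a heavily biased one: the two directions differ a lot.
print(kl_bernoulli(0.5, 0.99))  # ≈ 1.614
print(kl_bernoulli(0.99, 0.5))  # ≈ 0.637
```

Intuitively, D(P || Q) blows up because the fair coin keeps producing tails, which Q considers a 1-in-100 event, while in the reverse direction Q's samples are merely somewhat surprising to P.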
> Still, there's no avoiding the inherent asymmetry in KL divergence.
I wasn't suggesting otherwise; I was talking about his interpretation.