Probability Divergences with Shiny

In several posts on this blog we have encountered probability divergences such as the relative entropy. Intuitively, a divergence measures the distance between two distributions on the probability simplex \mc P. For the sake of exposition, we assume here that the distributions are supported on a finite set \Xi. We list several popular divergences in the table below.
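(Here and in the table, \mc P denotes the simplex of probability distributions on \Xi, that is, \mc P = \{ \mb P \in \mathbb{R}_+^{\Xi} : \sum_{i\in\Xi} \mb P(i) = 1 \}, and \mb P, \mb P' are two of its elements.)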

| Distance | D(\mb P, \mb P') | Domain |
| --- | --- | --- |
| Kullback-Leibler I | \sum_{i\in \Xi} \mb P'(i) \log \left(\frac{\mb P'(i)}{\mb P(i)}\right) | \mb P \gg \mb P' |
| Kullback-Leibler II | \sum_{i\in \Xi} \mb P(i) \log \left(\frac{\mb P(i)}{\mb P'(i)}\right) | \mb P' \gg \mb P |
| Pearson | \sum_{i\in \Xi} \left(\mb P(i) - \mb P'(i)\right)^2/\mb P'(i) | \mb P' \gg \mb P |
| Hellinger | \sqrt{\sum_{i\in \Xi} \left(\sqrt{\mb P(i)} - \sqrt{\mb P'(i)}\right)^2/2} | none |
| Total Variation | \frac{1}{2}\sum_{i\in \Xi} \abs{\mb P(i) - \mb P'(i)} | none |

Here \mb P \gg \mb P' means that \mb P dominates \mb P', i.e., \mb P(i) = 0 implies \mb P'(i) = 0; the Hellinger and total variation distances require no such domination.
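To make the formulas concrete, here is a minimal NumPy sketch (my own illustration, not code from the post; the helper names are hypothetical) that evaluates each divergence for two distributions on a finite set:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence sum_i p(i) log(p(i)/q(i)); assumes q >> p."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def pearson(p, q):
    """Pearson divergence sum_i (p(i) - q(i))^2 / q(i); assumes q >> p."""
    mask = q > 0
    return float(np.sum((p[mask] - q[mask]) ** 2 / q[mask]))

def hellinger(p, q):
    """Hellinger distance sqrt(sum_i (sqrt(p(i)) - sqrt(q(i)))^2 / 2)."""
    return float(np.sqrt(np.sum((np.sqrt(p) - np.sqrt(q)) ** 2) / 2))

def total_variation(p, q):
    """Total variation distance (1/2) sum_i |p(i) - q(i)|."""
    return 0.5 * float(np.sum(np.abs(p - q)))

# Two example distributions P and P' on a three-point set.
P = np.array([0.2, 0.3, 0.5])
Q = np.array([0.1, 0.4, 0.5])

print(kl(Q, P))               # Kullback-Leibler I:  sum_i P'(i) log(P'(i)/P(i))
print(kl(P, Q))               # Kullback-Leibler II: sum_i P(i) log(P(i)/P'(i))
print(pearson(P, Q))          # Pearson
print(hellinger(P, Q))        # Hellinger
print(total_variation(P, Q))  # Total variation
```

All of these quantities are nonnegative and vanish exactly when the two distributions coincide.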

I always find it instructive to have a geometric understanding of what these divergences look like. Hence, I set myself the task of showing the aforementioned divergences here in an interactive fashion.
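As a rough static stand-in for such an interactive view, the following matplotlib snippet (my own illustrative sketch, not the post's Shiny app; the reference distribution p_ref is an arbitrary choice) draws the level sets of the Kullback-Leibler II divergence around a fixed reference distribution on the simplex over a three-point set:

```python
import numpy as np
import matplotlib.pyplot as plt

# Arbitrary reference distribution P' on a three-point set (strictly positive,
# so the domain condition P' >> P holds for every P on the simplex).
p_ref = np.array([0.4, 0.4, 0.2])

# Grid over (P(1), P(2)); P(3) = 1 - P(1) - P(2) must stay nonnegative.
n = 400
p1, p2 = np.meshgrid(np.linspace(0.0, 1.0, n), np.linspace(0.0, 1.0, n))
p3 = 1.0 - p1 - p2
inside = p3 >= 0.0

def kl_terms(p, q):
    """Elementwise p log(p/q) with the convention 0 log 0 = 0."""
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(p > 0.0, p * np.log(p / q), 0.0)

# Kullback-Leibler II divergence D(P, P') at every grid point on the simplex.
d = np.full(p1.shape, np.nan)
vals = kl_terms(p1, p_ref[0]) + kl_terms(p2, p_ref[1]) + kl_terms(p3, p_ref[2])
d[inside] = vals[inside]

plt.contourf(p1, p2, d, levels=20)
plt.colorbar(label="Kullback-Leibler II")
plt.xlabel("P(1)")
plt.ylabel("P(2)")
plt.title("Level sets of the Kullback-Leibler divergence from a reference")
plt.gca().set_aspect("equal")
plt.show()
```

Swapping the roles of the two arguments, or replacing kl_terms with one of the other divergences from the table, immediately shows how differently their balls are shaped on the simplex.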
