In several posts on this blog we have encountered probability divergence measures such as the relative entropy. Intuitively, divergences measure the distance between distributions on the probability simplex. For the sake of exposition, we assume here that the distributions $p$ and $q$ are supported on a finite set $\{1, \dots, n\}$. We list several popular divergences in the table below.

| Distance | $D(p \,\|\, q)$ | Domain |
| --- | --- | --- |
| Kullback-Leibler I | $\sum_i p_i \log \frac{p_i}{q_i}$ | $p \ll q$ |
| Kullback-Leibler II | $\sum_i q_i \log \frac{q_i}{p_i}$ | $q \ll p$ |
| Pearson | $\sum_i \frac{(p_i - q_i)^2}{q_i}$ | $p \ll q$ |
| Hellinger | $\frac{1}{2} \sum_i \left( \sqrt{p_i} - \sqrt{q_i} \right)^2$ | all $p, q$ |
| Total Variation | $\frac{1}{2} \sum_i \lvert p_i - q_i \rvert$ | all $p, q$ |
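To make the table concrete, here is a small NumPy sketch of these divergences for probability vectors $p$ and $q$ of equal length. The function names are mine, chosen for readability, and I use the squared-Hellinger convention with the factor $\frac{1}{2}$.

```python
import numpy as np

def kl_divergence(p, q):
    # Kullback-Leibler I: sum_i p_i log(p_i / q_i), with the convention 0 log 0 = 0.
    # Finite only when p is absolutely continuous w.r.t. q (q_i > 0 wherever p_i > 0).
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def reverse_kl_divergence(p, q):
    # Kullback-Leibler II: sum_i q_i log(q_i / p_i), i.e. KL with arguments swapped.
    return kl_divergence(q, p)

def pearson_chi2(p, q):
    # Pearson chi-squared divergence: sum_i (p_i - q_i)^2 / q_i.
    return float(np.sum((p - q) ** 2 / q))

def hellinger_sq(p, q):
    # Squared Hellinger distance: (1/2) sum_i (sqrt(p_i) - sqrt(q_i))^2.
    return float(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def total_variation(p, q):
    # Total variation distance: (1/2) sum_i |p_i - q_i|.
    return float(0.5 * np.sum(np.abs(p - q)))
```

All five vanish exactly when $p = q$ and are positive otherwise, which is easy to check on a small example such as $p = (0.5, 0.5)$, $q = (0.25, 0.75)$.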

I always find it instructive to have a geometric understanding of what these metrics look like. Hence, I set myself the task of showing the aforementioned divergences here in an interactive fashion.
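As a rough static sketch of that idea (not the interactive widget itself), one can sample points on the 2-simplex and evaluate a divergence against a fixed reference distribution; the level sets of the resulting values give the geometric picture. The helper names below are hypothetical, not from any particular library.

```python
import numpy as np

def kl_divergence(p, q):
    # D_KL(p || q) = sum_i p_i log(p_i / q_i), with the convention 0 log 0 = 0.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def simplex_grid(steps):
    # All points (p1, p2, p3) with p_i >= 0 and p1 + p2 + p3 = 1,
    # on a regular grid with the given number of subdivisions.
    pts = []
    for i in range(steps + 1):
        for j in range(steps + 1 - i):
            k = steps - i - j
            pts.append((i / steps, j / steps, k / steps))
    return np.array(pts)

# Fixed reference distribution (uniform), strictly positive so KL is finite.
reference = np.array([1 / 3, 1 / 3, 1 / 3])
grid = simplex_grid(21)
values = np.array([kl_divergence(p, reference) for p in grid])
```

Plotting `values` over the triangle (e.g. with a contour plot in barycentric coordinates) then shows the "balls" each divergence induces around the reference point.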
