A note on the unique properties of the Kullback–Leibler divergence for sampling via gradient flows

arXiv:2507.04330v2

Abstract: We consider the problem of sampling from a probability distribution $\pi$ that admits a density w.r.t. a dominating measure. It is well known that this can be cast as an optimisation problem over the space of probability distributions, in which we aim to minimise a divergence from $\pi$. This optimisation problem is typically solved through gradient flows in the space of probability distributions equipped with an appropriate metric. We show that the Kullback–Leibler divergence is the only member of the family of Bregman divergences whose gradient flow w.r.t. many popular metrics does not require knowledge of the normalising constant of $\pi$.
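A concrete illustration of the property the abstract highlights: the Wasserstein gradient flow of $\mathrm{KL}(\cdot \,\|\, \pi)$ is the Langevin dynamics, whose drift depends on $\pi$ only through the score $\nabla \log \pi$, so the normalising constant cancels. The sketch below runs the unadjusted Langevin algorithm (a simple time-discretisation of this flow) on a hypothetical target chosen for illustration, $\pi(x) \propto \exp(-x^2/2)$; the function names and step-size choices are assumptions, not from the paper.

```python
import numpy as np

def grad_log_pi(x):
    # Score of the unnormalised density exp(-x^2 / 2).
    # The normalising constant Z of pi never enters: only grad log pi is needed.
    return -x

def ula(n_steps=20000, step=0.01, seed=0):
    # Unadjusted Langevin algorithm: Euler--Maruyama discretisation of
    # dX_t = grad log pi(X_t) dt + sqrt(2) dW_t,
    # the Wasserstein gradient flow of KL(. || pi).
    rng = np.random.default_rng(seed)
    x = 0.0
    samples = np.empty(n_steps)
    for i in range(n_steps):
        x = x + step * grad_log_pi(x) + np.sqrt(2 * step) * rng.standard_normal()
        samples[i] = x
    return samples

samples = ula()
print(samples.mean(), samples.std())  # should be roughly 0 and 1 for this target
```

Replacing the KL objective with another Bregman divergence would, per the paper's result, reintroduce the normalising constant into the corresponding drift, which is what makes KL special for sampling.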
