Heterogeneous Wasserstein Discrepancy for Incomparable Distributions [article]

Mokhtar Z. Alaya, Gilles Gasso, Maxime Berar, Alain Rakotomamonjy
2021 arXiv   pre-print
Optimal Transport (OT) metrics allow for defining discrepancies between two probability measures. The Wasserstein distance has long been the most celebrated OT distance in the literature; it requires the two probability distributions to be supported on the same metric space. Because of its high computational complexity, several approximate Wasserstein distances have been proposed, based on entropy regularization or on slicing and one-dimensional Wasserstein computation. In this paper, we propose a novel extension of the Wasserstein distance to compare two incomparable distributions, which hinges on the idea of distributional slicing, embeddings, and on computing the closed-form Wasserstein distance between the sliced distributions. We provide a theoretical analysis of this new divergence, called heterogeneous Wasserstein discrepancy (HWD), and we show that it preserves several interesting properties, including rotation invariance. We show that the embeddings involved in HWD can be efficiently learned. Finally, we provide a large set of experiments illustrating the behavior of HWD as a divergence in the context of generative modeling and in a query framework.
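As a rough illustration of the ingredients the abstract mentions, the sketch below compares samples living in spaces of different dimensions by projecting each distribution onto its own random unit direction and averaging closed-form one-dimensional Wasserstein distances over slices. This is only a toy sketch under simplifying assumptions (uniform random slices, equal sample sizes, plain random projections rather than the learned embeddings and distributional slicing of HWD); the names `wasserstein_1d` and `sliced_discrepancy` are illustrative and not taken from the authors' code.

```python
# Toy sketch: slicing-based comparison of samples in different dimensions.
# Assumptions (not from the paper): uniform random slices, equal sample sizes,
# plain random projections instead of HWD's learned embeddings and
# distributional slicing.
import numpy as np


def wasserstein_1d(a, b, p=2):
    """Closed-form p-Wasserstein distance between two 1-D samples of equal
    size with uniform weights: sort both samples and compare quantiles."""
    a_sorted, b_sorted = np.sort(a), np.sort(b)
    return np.mean(np.abs(a_sorted - b_sorted) ** p) ** (1.0 / p)


def sliced_discrepancy(x, y, n_slices=200, p=2, seed=0):
    """Average 1-D p-Wasserstein distance over random slices, where x lives
    in R^dx and y lives in R^dy (dx and dy may differ)."""
    rng = np.random.default_rng(seed)
    dx, dy = x.shape[1], y.shape[1]
    total = 0.0
    for _ in range(n_slices):
        # One random unit direction per space, since the dimensions differ.
        u = rng.normal(size=dx)
        u /= np.linalg.norm(u)
        v = rng.normal(size=dy)
        v /= np.linalg.norm(v)
        total += wasserstein_1d(x @ u, y @ v, p=p) ** p
    return (total / n_slices) ** (1.0 / p)


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = rng.normal(size=(300, 3))          # samples in R^3
    y = 2.0 + rng.normal(size=(300, 5))    # shifted samples in R^5
    print(sliced_discrepancy(x, y))
```

The one-dimensional projections are what make the computation cheap: on the real line the optimal transport plan is given by sorting, so each slice costs O(n log n) instead of solving a full transport problem.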
arXiv:2106.02542v2 fatcat:7znxswqvybdnniewpfssi5ay3m