
Neural Computation

May 2019, Vol. 31, No. 5, Pages 827-848
(doi: 10.1162/neco_a_01178)
© 2019 Massachusetts Institute of Technology
Information Geometry for Regularized Optimal Transport and Barycenters of Patterns
We propose a new divergence on the manifold of probability distributions, building on the entropic regularization of optimal transportation problems. As Cuturi (2013) showed, regularizing the optimal transport problem with an entropic term brings several computational benefits. However, because of that regularization, the resulting approximation of the optimal transport cost does not define a proper distance or divergence between probability distributions. We recently introduced a family of divergences connecting the Wasserstein distance and the Kullback-Leibler divergence from an information-geometric point of view (Amari, Karakida, & Oizumi, 2018). However, that proposal did not retain key intuitive aspects of the Wasserstein geometry, such as translation invariance, which plays a central role in the more general problem of computing optimal transport barycenters. The divergence we propose in this work retains these properties and admits an intuitive interpretation.
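To make the entropic regularization referred to above concrete, the following is a minimal sketch of Sinkhorn iterations for the regularized optimal transport problem, in the spirit of Cuturi (2013). It is illustrative only: the function name, parameter names, and defaults (`eps`, `n_iter`) are our assumptions, not notation from the article, and no convergence check is performed.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iter=500):
    """Entropic-regularized optimal transport via Sinkhorn iterations.

    a, b : source/target histograms (nonnegative, each summing to 1)
    C    : cost matrix of shape (len(a), len(b))
    eps  : entropic regularization strength (illustrative default)

    Returns the transport plan P and the transport cost <P, C>.
    """
    K = np.exp(-C / eps)              # Gibbs kernel of the cost matrix
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)             # scale to match column marginals
        u = a / (K @ v)               # scale to match row marginals
    P = u[:, None] * K * v[None, :]   # transport plan with marginals (a, b)
    return P, np.sum(P * C)
```

A small usage example, moving mass between two two-point uniform histograms:

```python
a = np.array([0.5, 0.5])
b = np.array([0.5, 0.5])
C = np.array([[0.0, 1.0],
              [1.0, 0.0]])
P, cost = sinkhorn(a, b, C)
```

As the abstract notes, the resulting cost is only an approximation of the true optimal transport cost and does not itself define a proper divergence (e.g., it is nonzero between a distribution and itself), which motivates the divergence construction proposed in the article.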