## A Note on Divergences

*Neural Computation*, October 2016, Vol. 28, No. 10, pp. 2045–2062
doi: 10.1162/NECO_a_00878
© 2016 Massachusetts Institute of Technology
### Abstract

In many areas of neural computation, such as learning, optimization, estimation, and inference, suitable divergences play a key role. In this note, we study a conjecture presented by Amari (2009) and give a counterexample showing that the conjecture does not hold in general. Moreover, we investigate two classes of divergence introduced by Zhang (2004), the weighted f-divergence and the weighted α-divergence, and prove that if a divergence is both a weighted f-divergence and a Bregman divergence, then it is a weighted α-divergence. This result reduces in form to the main theorem established by Amari (2009) in the unweighted case.
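To make the two divergence classes in the abstract concrete, here is a minimal Python sketch (not from the paper; function names and the NumPy-based setup are illustrative assumptions). It evaluates the Kullback-Leibler divergence two ways: as the f-divergence generated by f(t) = t log t, and as the Bregman divergence generated by negative entropy. KL is the standard example of a divergence lying in both classes, which is exactly the intersection that Amari's (2009) theorem and the weighted extension above characterize.

```python
import numpy as np

def f_divergence(p, q, f):
    """D_f(p || q) = sum_i q_i * f(p_i / q_i)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(q * f(p / q)))

def bregman_divergence(p, q, phi, grad_phi):
    """B_phi(p || q) = phi(p) - phi(q) - <grad phi(q), p - q>."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(phi(p) - phi(q) - np.dot(grad_phi(q), p - q))

# Generators that both yield KL divergence.
f_kl = lambda t: t * np.log(t)           # f-divergence generator f(t) = t log t
phi = lambda x: np.sum(x * np.log(x))    # negative entropy (Bregman generator)
grad_phi = lambda x: np.log(x) + 1.0     # gradient of negative entropy

# Two probability vectors on the simplex.
p = np.array([0.2, 0.5, 0.3])
q = np.array([0.4, 0.4, 0.2])

d_f = f_divergence(p, q, f_kl)
d_b = bregman_divergence(p, q, phi, grad_phi)
# On the probability simplex both expressions equal KL(p || q).
print(abs(d_f - d_b) < 1e-12)  # → True
```

Note that the agreement relies on p and q summing to one: on the full positive orthant the Bregman form picks up an extra term, sum(q) − sum(p), which is one reason results of this kind are sensitive to whether measures are normalized.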