
Neural Computation

A Finite-Sample, Distribution-Free, Probabilistic Lower Bound on Mutual Information

July 2011, Vol. 23, No. 7, Pages 1862–1898
doi: 10.1162/NECO_a_00144
© 2011 Massachusetts Institute of Technology
Abstract

For any memoryless communication channel with a binary-valued input and a one-dimensional real-valued output, we introduce a probabilistic lower bound on the mutual information, computed from empirical observations of the channel. The bound is built on the Dvoretzky-Kiefer-Wolfowitz inequality and is distribution-free. We describe a quadratic-time algorithm for computing the bound and its corresponding class-conditional distribution functions. We compare our approach to existing techniques and show that our bound is superior to a method inspired by Fano’s inequality, in which the continuous output variable is discretized.
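
The key statistical ingredient named in the abstract is the Dvoretzky-Kiefer-Wolfowitz inequality: for the empirical CDF F_n built from n i.i.d. samples of any real-valued distribution F, sup_x |F_n(x) − F(x)| ≤ sqrt(ln(2/α)/(2n)) holds with probability at least 1 − α. The Python sketch below is illustrative only, not the authors' code: it constructs DKW confidence bands around the empirical CDFs of the two class-conditional output samples, which is the raw material the bound is built from. The function names, the normal test data, and the even split of the failure probability α across the two classes are all assumptions; the paper's quadratic-time minimization of mutual information over distribution pairs inside these bands is not reproduced here.

    import numpy as np

    def dkw_epsilon(n, alpha):
        # DKW half-width: with probability at least 1 - alpha,
        # sup_x |F_n(x) - F(x)| <= sqrt(ln(2/alpha) / (2n)).
        return np.sqrt(np.log(2.0 / alpha) / (2.0 * n))

    def dkw_band(samples, alpha):
        # Empirical CDF of the samples, with pointwise lower/upper
        # envelopes clipped to [0, 1].
        x = np.sort(np.asarray(samples, dtype=float))
        n = x.size
        ecdf = np.arange(1, n + 1) / n
        eps = dkw_epsilon(n, alpha)
        return x, np.clip(ecdf - eps, 0.0, 1.0), np.clip(ecdf + eps, 0.0, 1.0)

    # Hypothetical channel data: Gaussian outputs for input bits 0 and 1.
    rng = np.random.default_rng(0)
    y0 = rng.normal(0.0, 1.0, size=200)   # outputs observed when the input is 0
    y1 = rng.normal(1.0, 1.0, size=200)   # outputs observed when the input is 1

    # Joint confidence 1 - 0.05 via a union bound over the two classes.
    x0, lo0, hi0 = dkw_band(y0, alpha=0.05 / 2)
    x1, lo1, hi1 = dkw_band(y1, alpha=0.05 / 2)

Because the DKW inequality holds uniformly over all distributions, any lower bound on mutual information that is valid for every pair of class-conditional CDFs lying within these bands inherits the stated coverage probability regardless of the true channel, which is what makes the resulting bound distribution-free.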