Neural Computation

October 2015, Vol. 27, No. 10, Pages 2097-2106
(doi: 10.1162/NECO_a_00775)
© 2015 Massachusetts Institute of Technology

A Note on Entropy Estimation
Abstract

We compare an entropy estimator Ĥz recently discussed by Zhang (2012) with two estimators, Ĥ1 and Ĥ2, introduced by Grassberger (2003) and Schürmann (2004). We prove the identity Ĥz = Ĥ1, which has not been taken into account by Zhang (2012). Then we prove that the systematic error (bias) of Ĥ1 is less than or equal to the bias of the ordinary likelihood (or plug-in) estimator of entropy. Finally, by numerical simulation, we verify that for the most interesting regime of small-sample estimation and large event spaces, the estimator Ĥ2 has a significantly smaller statistical error than Ĥz.