
Neural Computation

Fall 1989, Vol. 1, No. 3, Pages 412-423
(doi: 10.1162/neco.1989.1.3.412)
© 1989 Massachusetts Institute of Technology
Finding Minimum Entropy Codes
Abstract

To determine whether a particular sensory event is a reliable predictor of reward or punishment, it is necessary to know the prior probability of that event. If the variables of a sensory representation normally occur independently of each other, then the prior probability of any logical function of the variables can be derived from the prior probabilities of the individual variables alone, without any additional knowledge; such a representation therefore enormously enlarges the scope of definable events that can be searched for reliable predictors. A Minimum Entropy Code is one way of forming such a representation, and this paper explores methods for finding one. The main results are (1) to show how to find such a code when the probabilities of the input states form a geometric progression, as is shown to be nearly true for keyboard characters in normal text; (2) to show how a Minimum Entropy Code can be approximated by repeatedly recoding pairs, triples, etc. of an original 7-bit code for keyboard characters; (3) to prove that in some cases enlarging the capacity of the output channel can lower the entropy.
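A quick way to see the idea behind result (2) empirically: estimate the bits needed per original character when coding single characters versus non-overlapping pairs. This is only an illustrative sketch, not the paper's procedure; the helper function and sample text are my own. On redundant text, coding pairs already captures some of the statistical dependence between adjacent characters, so the empirical entropy per original character drops.

```python
import math
from collections import Counter

def entropy_per_symbol(text, block=1):
    """Empirical entropy in bits per original character when the text
    is coded in non-overlapping blocks of `block` characters."""
    blocks = [text[i:i + block] for i in range(0, len(text) - block + 1, block)]
    counts = Counter(blocks)
    total = sum(counts.values())
    # Shannon entropy of the block distribution, divided by block size
    h = -sum(c / total * math.log2(c / total) for c in counts.values())
    return h / block

# Highly redundant sample text (any natural-language corpus works)
sample = "the quick brown fox jumps over the lazy dog " * 50

h1 = entropy_per_symbol(sample, block=1)  # single characters
h2 = entropy_per_symbol(sample, block=2)  # recoded as pairs

print(f"chars: {h1:.2f} bits/char, pairs: {h2:.2f} bits/char")
```

Recoding triples, quadruples, and so on continues the trend on dependent data, which is the intuition behind approximating a minimum entropy code by repeated pairwise recoding.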