## Neural Computation

It is known that any dichotomy of {−1, 1}^{n} can be learned (separated) by a higher-order neuron (polynomial function) with 2^{n} inputs (monomials). In general, fewer than 2^{n} monomials suffice to solve a given dichotomy. Despite efforts to develop algorithms that find solutions with fewer monomials, relatively few studies have investigated the maximum density Π(*n*), the minimum number of monomials that suffices to separate an arbitrary dichotomy of {−1, 1}^{n}. This article derives a theoretical upper bound for this quantity, superseding previously known bounds. The main theorem states that for any binary classification problem in {−1, 1}^{n} (n > 1), one can always find a polynomial-function solution with 2^{n} − 2^{n}/4 or fewer monomials. In particular, any dichotomy of {−1, 1}^{n} can be learned by a higher-order neuron with a fan-in of 2^{n} − 2^{n}/4 or fewer. With this result, a deterministic ratio bound independent of *n* is established for the first time: Π(*n*)/2^{n} ≤ 0.75. The main theorem is constructive, so it yields a deterministic algorithm that achieves the theoretical bound. The study provides the basic mathematical tools and forms the basis for further analyses that may have implications for neural computation mechanisms employed in the cerebral cortex.
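The baseline fact the abstract starts from, that 2^{n} monomials always suffice, can be illustrated with the Fourier (Walsh–Hadamard) expansion: any f: {−1, 1}^{n} → {−1, 1} is exactly interpolated by a polynomial over the 2^{n} monomials ∏_{i∈S} x_i, so its sign trivially separates the dichotomy. The sketch below is illustrative only (the example labels and function names are hypothetical, not from the article) and does not implement the article's constructive algorithm for the tighter (3/4)·2^{n} bound.

```python
from itertools import product, combinations
from math import prod
import random

def walsh_coeffs(f, n):
    """Exact Fourier (Walsh-Hadamard) expansion of f over {-1,1}^n:
    f(x) = sum over subsets S of c_S * prod(x_i for i in S)."""
    points = list(product((-1, 1), repeat=n))
    coeffs = {}
    for r in range(n + 1):
        for S in combinations(range(n), r):
            # c_S is the average of f(x) * chi_S(x) over all 2^n points
            coeffs[S] = sum(f(x) * prod(x[i] for i in S) for x in points) / len(points)
    return coeffs

def poly_eval(coeffs, x):
    # Evaluate the polynomial (higher-order neuron) at input x
    return sum(c * prod(x[i] for i in S) for S, c in coeffs.items())

# A random dichotomy of {-1,1}^4 (hypothetical example labels)
random.seed(0)
n = 4
points = list(product((-1, 1), repeat=n))
labels = {x: random.choice((-1, 1)) for x in points}
f = lambda x: labels[x]

coeffs = walsh_coeffs(f, n)
# The sign of the polynomial reproduces the dichotomy at every point
assert all((1 if poly_eval(coeffs, x) > 0 else -1) == f(x) for x in points)
used = sum(1 for c in coeffs.values() if abs(c) > 1e-12)
print(used, "monomials used out of", 2 ** n)
```

For most dichotomies, many Fourier coefficients are nonzero, so this construction uses close to 2^{n} monomials; the article's contribution is a constructive guarantee that 2^{n} − 2^{n}/4 always suffice.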