
Neural Computation

December 1, 2003, Vol. 15, No. 12, Pages 2727-2778
(doi: 10.1162/089976603322518731)
© 2003 Massachusetts Institute of Technology
General-Purpose Computation with Neural Networks: A Survey of Complexity Theoretic Results

We survey and summarize the literature on the computational aspects of neural network models by presenting a detailed taxonomy of the various models according to their complexity-theoretic characteristics. The criteria of classification include the architecture of the network (feedforward versus recurrent), time model (discrete versus continuous), state type (binary versus analog), weight constraints (symmetric versus asymmetric), network size (finite nets versus infinite families), and computation type (deterministic versus probabilistic), among others. The underlying results concerning the computational power and complexity of perceptron, radial basis function, winner-take-all, and spiking neural networks are briefly surveyed, with pointers to the relevant literature. In this survey, we focus mainly on digital computation, whose inputs and outputs are binary in nature, although their values are quite often encoded as analog neuron states. We omit the important issue of learning.
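To make the taxonomy concrete, here is a minimal sketch (not taken from the survey) of the simplest class it covers: a discrete-time, binary-state, feedforward network of threshold (perceptron) units, shown computing XOR with two hidden units. The weights and thresholds are illustrative choices.

```python
def threshold_unit(weights, bias, inputs):
    """Classic perceptron unit: fires (1) iff the weighted sum exceeds the bias."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) > bias else 0

def xor_net(x1, x2):
    """A two-layer feedforward threshold network computing XOR."""
    h_or  = threshold_unit([1, 1], 0.5, [x1, x2])    # fires if x1 OR x2
    h_and = threshold_unit([1, 1], 1.5, [x1, x2])    # fires if x1 AND x2
    # Output unit: fires iff OR is active but AND is not, i.e. XOR.
    return threshold_unit([1, -1], 0.5, [h_or, h_and])

print([xor_net(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])
# -> [0, 1, 1, 0]
```

Varying the dimensions of the taxonomy (recurrent connections, continuous time, analog states, probabilistic firing) yields the progressively more powerful model classes the survey compares.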