
Neural Computation

May 15, 1999, Vol. 11, No. 4, Pages 977-993
(doi: 10.1162/089976699300016548)
© 1999 Massachusetts Institute of Technology
Pruning Using Parameter and Neuronal Metrics

In this article, we introduce a measure of optimality for architecture selection algorithms for neural networks: the distance from the original network to the new network in a metric defined by the probability distributions of all possible networks. We derive two pruning algorithms, one based on a metric in parameter space and the other based on a metric in neuron space, which are closely related to well-known architecture selection algorithms such as GOBS. Our framework extends the theoretical range of validity of GOBS and can therefore explain results observed in previous experiments. In addition, we give some computational improvements for these algorithms.
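To illustrate the family of methods the abstract refers to, the following is a minimal sketch of saliency-based pruning in the style of Optimal Brain Surgeon with a diagonal Hessian approximation; it is not the paper's GOBS algorithm or its metric-based variants, and the function names are hypothetical. Each parameter's saliency estimates the increase in error from setting it to zero, and the least salient parameters are pruned first.

```python
# Illustrative sketch only (hypothetical names, not the paper's GOBS):
# saliency s_i = (1/2) * h_ii * w_i^2, using a diagonal Hessian
# approximation, as in Optimal Brain Damage / diagonal OBS.

def obs_saliency(weights, hessian_diag):
    """Estimated error increase from zeroing each weight individually."""
    return [0.5 * h * w * w for w, h in zip(weights, hessian_diag)]

def prune_least_salient(weights, hessian_diag, n_prune):
    """Zero out the n_prune weights with the smallest saliency."""
    saliencies = obs_saliency(weights, hessian_diag)
    order = sorted(range(len(weights)), key=lambda i: saliencies[i])
    pruned = list(weights)
    for i in order[:n_prune]:
        pruned[i] = 0.0
    return pruned

# Example: the small weight 0.1 has the lowest saliency and is pruned.
print(prune_least_salient([2.0, 0.1, -1.0], [1.0, 1.0, 1.0], 1))
```

The full OBS procedure would additionally use the inverse Hessian to adjust the surviving weights after each deletion; this sketch only ranks and removes parameters.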