
Neural Computation

April 1, 1996, Vol. 8, No. 3, Pages 643-674
(doi: 10.1162/neco.1996.8.3.643)
© 1996 Massachusetts Institute of Technology
The Effects of Adding Noise During Backpropagation Training on a Generalization Performance

We study the effects of adding noise to the inputs, outputs, weight connections, and weight changes of multilayer feedforward neural networks during backpropagation training. We rigorously derive and analyze the objective functions that are minimized by the noise-affected training processes. We show that input noise and weight noise encourage the neural-network output to be a smooth function of the input or its weights, respectively. In the weak-noise limit, noise added to the output of the neural networks only changes the objective function by a constant. Hence, it cannot improve generalization. Input noise introduces penalty terms in the objective function that are related to, but distinct from, those found in the regularization approaches. Simulations have been performed on a regression and a classification problem to further substantiate our analysis. Input noise is found to be effective in improving the generalization performance for both problems. However, weight noise is found to be effective in improving the generalization performance only for the classification problem. Other forms of noise have practically no effect on generalization.
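The input-noise scheme the abstract analyzes can be sketched in a few lines: draw fresh zero-mean Gaussian noise for the inputs at every training presentation and backpropagate through the noisy forward pass. The following is a minimal illustration, not the paper's experimental setup; the network size, learning rate, noise level `sigma`, and toy regression target are all assumed for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: y = sin(x) on [-pi, pi] (illustrative, not from the paper)
X = np.linspace(-np.pi, np.pi, 64).reshape(-1, 1)
y = np.sin(X)

# One-hidden-layer feedforward network with tanh units
n_hidden = 16
W1 = rng.normal(0.0, 0.5, (1, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.5, (n_hidden, 1))
b2 = np.zeros(1)

lr = 0.05     # learning rate (assumed)
sigma = 0.1   # input-noise standard deviation (assumed)

def forward(Xb):
    h = np.tanh(Xb @ W1 + b1)
    return h, h @ W2 + b2

for epoch in range(2000):
    # Input noise: a fresh Gaussian perturbation of the inputs each epoch
    Xn = X + rng.normal(0.0, sigma, X.shape)
    h, out = forward(Xn)
    err = out - y

    # Backpropagation of the squared-error loss through the noisy inputs
    gW2 = h.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h**2)   # tanh derivative
    gW1 = Xn.T @ dh / len(X)
    gb1 = dh.mean(axis=0)

    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Evaluate on the clean (noise-free) inputs
_, clean_out = forward(X)
mse = float(np.mean((clean_out - y) ** 2))
print("clean-input MSE:", round(mse, 4))
```

Averaged over the noise, minimizing the squared error on perturbed inputs effectively adds a smoothness penalty on the network's input-output map, which is the mechanism the analysis attributes input noise's generalization benefit to.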