Abstract:
We generalize a recent formalism for the dynamics of
supervised learning in layered neural networks, valid in the regime
where data recycling is inevitable, to the case of noisy teachers. Our
theory yields predictions for the evolution in time of the training
and generalization errors, and extends the class of mathematically
solvable learning processes in large neural networks to those
complicated situations where overfitting occurs.
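For concreteness, a minimal sketch of the kind of teacher-student
scenario referred to above (the notation, the Gaussian noise model,
and the quadratic error measure are illustrative assumptions, not
taken from the paper): a student with weights J is trained on a fixed
set of p = alpha N examples, each labelled by a teacher B whose output
is corrupted by noise, so the same data are recycled throughout
training.

\[
  \sigma(\boldsymbol{\xi}) = g\!\left[\mathbf{J}\cdot\boldsymbol{\xi}\right],
  \qquad
  \tau(\boldsymbol{\xi}^{\mu}) = g\!\left[\mathbf{B}\cdot\boldsymbol{\xi}^{\mu} + z^{\mu}\right],
  \qquad
  z^{\mu} \sim \mathcal{N}(0,\Sigma^{2}),
\]
\[
  E_{\mathrm{t}}(t) = \frac{1}{p}\sum_{\mu=1}^{p}
    \tfrac{1}{2}\bigl[\sigma(\boldsymbol{\xi}^{\mu})-\tau(\boldsymbol{\xi}^{\mu})\bigr]^{2},
  \qquad
  E_{\mathrm{g}}(t) = \Bigl\langle
    \tfrac{1}{2}\bigl[\sigma(\boldsymbol{\xi})-g[\mathbf{B}\cdot\boldsymbol{\xi}]\bigr]^{2}
  \Bigr\rangle_{\boldsymbol{\xi}}.
\]

Here the training error is measured on the fixed, noise-corrupted
training set, whereas the generalization error is averaged over the
full input distribution (measured against the clean teacher, one
common convention). Overfitting then appears as a growing gap between
the two: E_t continues to decrease while E_g rises.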