## Neural Computation

This article clarifies the relation between the learning curve and the algebraic-geometrical structure of a nonidentifiable learning machine, such as a multilayer neural network, whose true parameter set is an analytic set with singular points. Using a concept from algebraic analysis, we rigorously prove that the Bayesian stochastic complexity, or free energy, is asymptotically equal to λ_{1} log *n* − (*m*_{1} − 1) log log *n* + constant, where *n* is the number of training samples and λ_{1} and *m*_{1} are a rational number and a natural number determined as birational invariants of the singularities in the parameter space. We also present an algorithm for calculating λ_{1} and *m*_{1} based on resolution of singularities in algebraic geometry. In regular statistical models, 2λ_{1} equals the number of parameters and *m*_{1} = 1, whereas in nonregular models, such as multilayer networks, 2λ_{1} is not larger than the number of parameters and *m*_{1} ≥ 1. Since the increase of the stochastic complexity equals the learning curve, or generalization error, nonidentifiable learning machines are better models than regular ones when Bayesian ensemble learning is applied.
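The asymptotic formula above can be evaluated numerically to see why a smaller λ_{1} yields a smaller stochastic complexity. The sketch below is purely illustrative: the regular case uses the stated identity 2λ_{1} = (number of parameters), *m*_{1} = 1, while the singular values λ_{1} = 3 and *m*_{1} = 2 are hypothetical, not derived from any particular network.

```python
import math

def free_energy_asymptotic(n, lam, m, const=0.0):
    """Leading terms of the Bayesian stochastic complexity:
    F(n) ~ lam * log(n) - (m - 1) * log(log(n)) + const."""
    return lam * math.log(n) - (m - 1) * math.log(math.log(n)) + const

n = 10 ** 4  # number of training samples

# Regular model with d parameters: 2*lam = d and m = 1.
d = 10
f_regular = free_energy_asymptotic(n, lam=d / 2, m=1)

# Hypothetical singular model with the same d: 2*lam <= d, m >= 1
# (the values lam=3, m=2 are illustrative only).
f_singular = free_energy_asymptotic(n, lam=3.0, m=2)

# Smaller lam dominates for large n, so the singular model has the
# smaller stochastic complexity, i.e. the smaller generalization error.
print(f_regular > f_singular)  # True for these illustrative values
```

Because the log *n* term dominates as *n* grows, any model with 2λ_{1} strictly smaller than the parameter count eventually achieves a lower free energy than the corresponding regular model, which is the sense in which the abstract calls nonidentifiable machines "better models" under Bayesian ensemble learning.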