Abstract:
We introduce a probabilistic computational model that generalizes many
noisy neural network models, including the recent work of Maass and
Sontag. We identify weak ergodicity as the mechanism that restricts
the computational power of probabilistic models to definite languages,
independent of the characteristics of the noise: whether it is
discrete or analog, whether or not it depends on the input, and
whether the state variables are discrete or continuous. We
give examples of weakly ergodic models, including noisy
computational systems whose noise depends on the current state and
input, aggregate models, and computational systems that update in
continuous time.
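As an illustrative sketch (not taken from the paper), the weak-ergodicity mechanism can be seen in a toy two-state Markov system: when the transition matrix has strictly positive entries, any two initial distributions contract toward each other, so the system forgets its initial state after enough steps. The matrix and step count below are arbitrary choices for illustration.

```python
# Illustrative sketch, with assumed values: weak ergodicity in a noisy
# two-state system. A transition matrix with strictly positive entries
# contracts any two initial distributions toward one another, so the
# system asymptotically forgets its initial state -- the mechanism that
# limits such probabilistic models to definite languages.

P = [[0.7, 0.3],
     [0.4, 0.6]]  # stochastic matrix: P[i][j] = Pr(next = j | current = i)

def step(dist, P):
    """One update of a distribution over states under P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

p = [1.0, 0.0]  # started surely in state 0
q = [0.0, 1.0]  # started surely in state 1
for _ in range(50):
    p, q = step(p, P), step(q, P)

# Total variation distance between the two evolved distributions
tv = 0.5 * sum(abs(a - b) for a, b in zip(p, q))
print(tv)  # essentially zero: the initial conditions have been forgotten
```

Each step shrinks the total variation distance by at least the matrix's contraction coefficient (here 0.3), so the two trajectories merge geometrically fast regardless of where they started.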