Abstract:
Recently, researchers have derived formal complexity analyses
of analog computation in the setting of discrete-time dynamical
systems. Although these results are theoretically instructive, they
do not identify issues relevant to implementation in actual
systems or to psychological models of sequence processing. As an
empirical contrast, training recurrent neural networks (RNNs)
produces self-organized systems that can give us complementary and
constructive evidence for realizations of analog mechanisms.
Previous work showed that an RNN can learn to process a simple
context-free language (CFL). Herein, we extend that work to show
that an RNN can learn a harder CFL by organizing its resources into
a symbol-sensitive counting solution, and we provide a dynamical
systems analysis that demonstrates how the network can not only
count, but also copy and store counting information.