
Neural Computation

May 1992, Vol. 4, No. 3, Pages 393-405
(doi: 10.1162/neco.1992.4.3.393)
© 1992 Massachusetts Institute of Technology
Learning and Extracting Finite State Automata with Second-Order Recurrent Neural Networks

We show that a recurrent, second-order neural network using a real-time, forward training algorithm readily learns to infer small regular grammars from positive and negative string training samples. We present simulations that show the effect of initial conditions, training set size and order, and neural network architecture. All simulations were performed with random initial weight strengths and usually converged after approximately one hundred epochs of training. We discuss a quantization algorithm for dynamically extracting finite state automata during and after training. For a well-trained neural net, the extracted automata constitute an equivalence class of state machines that are reducible to the minimal machine of the inferred grammar. We then show through simulations that many of the neural net state machines are dynamically stable, that is, they correctly classify many long unseen strings. In addition, some of these extracted automata actually outperform the trained neural network in classifying unseen strings.
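To make the two core ideas concrete, the following is a minimal sketch (not the authors' code) of a second-order recurrent forward pass and the quantization step used for automaton extraction. It assumes the standard second-order update S_j(t+1) = g(Σ_{i,k} W_jik S_i(t) I_k(t)) with one-hot input symbols, and equal-width binning of each state unit's activation; the function names and bin count q are illustrative choices, not taken from the paper.

```python
# Hedged sketch of a second-order recurrent network and state quantization.
# Assumptions: sigmoid activation g, one-hot inputs over a binary alphabet,
# a designated start state, and q equal-width bins per unit for extraction.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def run_net(W, string, n_states):
    """Run the second-order net over a symbol sequence.

    W has shape (n_states, n_states, n_symbols): each weight W[j, i, k]
    couples state unit i with input symbol k to drive state unit j.
    """
    S = np.zeros(n_states)
    S[0] = 1.0  # designated start state
    for sym in string:
        I = np.zeros(W.shape[2])
        I[sym] = 1.0
        # second-order update: product of state and input units feeds each unit
        S = sigmoid(np.einsum('jik,i,k->j', W, S, I))
    return S

def quantize(S, q=2):
    """Map a continuous state vector to a discrete state by binning [0,1] into q cells."""
    return tuple(np.minimum((S * q).astype(int), q - 1))
```

Collecting the quantized states visited on training strings, together with the symbol that caused each transition, yields the transition table of a candidate finite state automaton, which can then be minimized by a standard DFA-minimization procedure.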