Neural Computation
Fall 1991, Vol. 3, No. 3, Pages 375-385
(doi: 10.1162/neco.1991.3.3.375)
FIR and IIR Synapses, a New Neural Network Architecture for Time Series Modeling
Abstract
A new neural network architecture is proposed, involving either a local-feedforward global-feedforward and/or a local-recurrent global-feedforward structure. A learning rule minimizing a mean square error criterion is derived. The performance of this algorithm with the local-recurrent global-feedforward architecture is compared with that of a local-feedforward global-feedforward architecture, and it is shown that the local-recurrent global-feedforward model performs better.
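The FIR/IIR distinction in the title can be illustrated with a minimal sketch: an FIR synapse is a tapped delay line over past inputs (the local-feedforward case), while an IIR synapse also feeds back its own past outputs (the local-recurrent case). The class names, coefficient choices, and update details below are illustrative assumptions, not the paper's implementation.

```python
from collections import deque

class FIRSynapse:
    """Finite impulse response synapse: a weighted tapped delay line
    over past inputs (local feedforward). Weights are hypothetical."""
    def __init__(self, weights):
        self.weights = list(weights)                       # w_0 .. w_M
        self.taps = deque([0.0] * len(weights), maxlen=len(weights))

    def step(self, x):
        self.taps.appendleft(x)                            # newest input at tap 0
        return sum(w * t for w, t in zip(self.weights, self.taps))

class IIRSynapse:
    """Infinite impulse response synapse: adds feedback from past
    outputs, giving the local-recurrent structure."""
    def __init__(self, b, a):
        self.b = list(b)                                   # feedforward coefficients
        self.a = list(a)                                   # feedback coefficients
        self.x_taps = deque([0.0] * len(b), maxlen=len(b))
        self.y_taps = deque([0.0] * len(a), maxlen=len(a))

    def step(self, x):
        self.x_taps.appendleft(x)
        y = (sum(b * t for b, t in zip(self.b, self.x_taps))
             - sum(a * t for a, t in zip(self.a, self.y_taps)))
        self.y_taps.appendleft(y)
        return y
```

In a full network, each connection between units would carry such a filter in place of a scalar weight, with the filtered sums passed through the unit's nonlinearity; the mean-square-error learning rule in the paper adapts the filter coefficients.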