Neural Computation

August 2010, Vol. 22, No. 8, Pages 2137-2160
(doi: 10.1162/NECO_a_00004-Zhou)
The author of this article expressly waives copyright and dedicates this article to the public domain
Competitive Layer Model of Discrete-Time Recurrent Neural Networks with LT Neurons
This letter discusses the competitive layer model (CLM) for a class of discrete-time recurrent neural networks with linear threshold (LT) neurons. It first addresses the boundedness, global attractivity, and complete stability of the networks. Two theorems are then presented that guarantee the networks have the CLM property. We also analyze the network dynamics, which exhibit column winner-take-all behavior and grouping selection across layers. Furthermore, we propose a novel synchronous CLM iteration method, which has similar performance and storage allocation but faster convergence than the previous asynchronous CLM iteration method (Wersing, Steil, & Ritter, 2001). Examples and simulation results illustrate the developed theory, the comparison between the two CLM iteration methods, and an application to image segmentation.
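To give a feel for the kind of dynamics the abstract describes, the following is a minimal NumPy sketch of a synchronous sweep over an LT-neuron competitive layer model. The sizes (N, L), the lateral weights F, the competition strength J, the input h, and the damped update form are all illustrative assumptions, not the letter's precise model or the exact scheme of Wersing, Steil, and Ritter (2001).

```python
import numpy as np

def lt(x):
    """Linear threshold (LT) activation: max(0, x)."""
    return np.maximum(0.0, x)

# Hypothetical sizes and parameters, chosen only for illustration.
rng = np.random.default_rng(0)
N, L = 5, 3                      # N feature columns, L competing layers
F = 0.05 * rng.standard_normal((N, N))
F = (F + F.T) / 2                # symmetric lateral (within-layer) weights
np.fill_diagonal(F, 0.0)
J = 1.0                          # vertical (cross-layer) competition strength
h = 1.0                          # external input per column
eta = 0.2                        # damping for the synchronous sweep (assumed)

x = rng.random((L, N))           # x[a, r]: activity of column r in layer a
for _ in range(300):
    col_sum = x.sum(axis=0)      # total activity of each column over layers
    # Columns compete vertically (J term) while layers interact laterally (F).
    drive = h + J * (1.0 - col_sum)[None, :] + J * x + x @ F.T
    x = (1.0 - eta) * x + eta * lt(drive)

winners = x.argmax(axis=0)       # layer that dominates each column
```

After the sweep, each column's activity tends to concentrate in a single layer, which is the column winner-take-all and grouping behavior the abstract refers to; `winners` then gives a layer assignment per feature, the mechanism behind the image-segmentation application.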