
Neural Computation

July 1, 1996, Vol. 8, No. 5, Pages 1041-1060
(doi: 10.1162/neco.1996.8.5.1041)
© 1996 Massachusetts Institute of Technology
A Novel Optimizing Network Architecture with Applications

We present a novel optimizing network architecture with applications in vision, learning, pattern recognition, and combinatorial optimization. This architecture is constructed by combining the following techniques: (1) deterministic annealing, (2) self-amplification, (3) algebraic transformations, (4) clocked objectives, and (5) softassign. Deterministic annealing in conjunction with self-amplification avoids poor local minima and ensures that a vertex of the hypercube is reached. Algebraic transformations and clocked objectives help partition the relaxation into distinct phases. The problems considered have doubly stochastic matrix constraints or minor variations thereof. We introduce a new technique, softassign, which is used to satisfy this constraint. Experimental results on different problems are presented and discussed.
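The softassign step described above enforces the doubly stochastic constraint by exponentiating a benefit matrix at an inverse temperature and then alternately normalizing rows and columns (Sinkhorn-style iteration). A minimal sketch, assuming NumPy; the function name, the parameter `beta` (inverse temperature), and the fixed iteration count are illustrative choices, not the paper's exact formulation:

```python
import numpy as np

def softassign(M, beta=1.0, n_iters=50):
    """Sketch of a softassign step: exponentiate the benefit matrix M
    at inverse temperature beta, then apply Sinkhorn-style alternating
    row/column normalization so the result approaches a doubly
    stochastic matrix (nonnegative, rows and columns each summing to 1)."""
    A = np.exp(beta * (M - M.max()))   # subtract max for numerical stability
    for _ in range(n_iters):
        A /= A.sum(axis=1, keepdims=True)  # normalize rows
        A /= A.sum(axis=0, keepdims=True)  # normalize columns
    return A

# Example: soft assignment from a random 4x4 benefit matrix
rng = np.random.default_rng(0)
A = softassign(rng.normal(size=(4, 4)), beta=1.0)
print(A.sum(axis=0))  # each column sums to (approximately) 1
print(A.sum(axis=1))  # each row sums to (approximately) 1
```

As `beta` is raised during deterministic annealing, the entries of `A` sharpen toward 0/1 values, driving the relaxation toward a vertex of the hypercube (a hard permutation matrix).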