
Neural Computation

September 1995, Vol. 7, No. 5, Pages 1079-1104
(doi: 10.1162/neco.1995.7.5.1079)
© 1995 Massachusetts Institute of Technology
Convex Potentials and their Conjugates in Analog Mean-Field Optimization
Abstract

This paper deals with the problem of mapping hybrid (i.e., both discrete and continuous) constrained optimization problems onto analog networks. The saddle-point paradigm of mean-field methods in statistical physics provides a systematic procedure for finding such a mapping via the notion of effective energy. Specifically, it is shown that within this paradigm, to each closed bounded constraint set is associated a smooth convex potential function. Using the conjugate (or Legendre-Fenchel transform) of the convex potential, the effective energy can be transformed to yield a cost function that is a natural generalization of the analog Hopfield energy. Descent dynamics and deterministic annealing can then be used to find the global minimum of the original minimization problem. When the conjugate is hard to compute explicitly, it is shown that a minimax dynamics, similar to that of Arrow and Hurwicz in Lagrangian optimization, can be used to find the saddle points of the effective energy. As an illustration of its wide applicability, the effective energy framework is used to derive Hopfield-like energy functions and descent dynamics for two classes of networks previously considered in the literature — winner-take-all networks and rotor networks — even when the cost function of the original optimization problem is not quadratic.
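The winner-take-all case admits a compact numerical illustration. For the simplex constraint set, the mean-field construction yields the log-sum-exp potential, whose gradient is the softmax map and whose conjugate is the (negative) entropy; deterministic annealing then sharpens the mean-field state toward the discrete minimizer as the inverse temperature grows. The sketch below is illustrative only — the cost vector, the annealing schedule, and the function names are assumptions, not the paper's notation — and it uses a linear cost so that each mean-field fixed point is available in closed form.

```python
import numpy as np

def softmax(h, beta):
    """Gradient of the log-sum-exp potential phi(h) = (1/beta) * log(sum_i exp(beta*h_i)).

    This is the mean-field update for the winner-take-all (simplex) constraint set.
    """
    z = beta * h
    z = z - z.max()           # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def wta_anneal(c, betas):
    """Deterministic annealing sketch (assumed schedule, not the paper's).

    For a linear cost E(v) = c . v, the mean-field fixed point at inverse
    temperature beta is v = softmax(-beta * c); raising beta drives v from
    the simplex center toward the one-hot indicator of argmin(c).
    """
    v = np.full(len(c), 1.0 / len(c))   # start at the simplex center
    for beta in betas:
        v = softmax(-c, beta)           # closed-form fixed point for linear cost
    return v

c = np.array([2.0, 0.5, 1.0, 3.0])      # illustrative cost vector
v = wta_anneal(c, betas=[0.1, 1.0, 10.0, 100.0])
# v concentrates on index 1, the minimizer of c
```

At low beta the state stays near the uniform distribution (high entropy); at high beta it approaches the vertex of the simplex selecting the minimal-cost unit, which is the annealed winner-take-all behavior the abstract describes.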