Two Timescale Analysis of the Alopex Algorithm for Optimization

Neural Computation, November 1, 2002, Vol. 14, No. 11, Pages 2729-2750
doi: 10.1162/089976602760408044
© 2002 Massachusetts Institute of Technology
Abstract

Alopex is a correlation-based, gradient-free optimization technique useful in many learning problems. However, there are no analytical results on the asymptotic behavior of this algorithm. This article presents a new version of Alopex that can be analyzed using two-timescale stochastic approximation techniques. It is shown that the algorithm asymptotically behaves like a gradient-descent method, even though it neither needs nor estimates any gradient information. It is also shown, through simulations, that the algorithm is quite effective.
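
To illustrate what "correlation-based, gradient-free" means here, the following is a minimal sketch of the basic Alopex update rule (in the spirit of the original algorithm, not the two-timescale variant analyzed in this article). Each parameter takes a fixed-size step whose sign is chosen randomly, biased by the correlation between the previous parameter change and the previous change in cost. The function and parameter names (alopex_minimize, step, temperature) are illustrative assumptions, and the temperature is held fixed rather than annealed.

    import numpy as np

    def alopex_minimize(f, x0, step=0.01, temperature=1.0, iters=2000, seed=0):
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float)
        f_prev = f(x)
        # Start with a random perturbation so the first correlation is defined.
        dx = step * rng.choice([-1.0, 1.0], size=x.shape)
        x = x + dx
        for _ in range(iters):
            f_curr = f(x)
            # Correlation between the last parameter change and the last cost change.
            corr = dx * (f_curr - f_prev)
            # Each parameter moves by +step or -step; a positive correlation
            # (the last change increased the cost) biases the next move toward reversal.
            p_positive = 1.0 / (1.0 + np.exp(corr / temperature))
            signs = np.where(rng.random(x.shape) < p_positive, 1.0, -1.0)
            dx = step * signs
            f_prev = f_curr
            x = x + dx
        return x

    # Example: minimize a simple quadratic without using any gradient information.
    x_opt = alopex_minimize(lambda x: np.sum((x - 3.0) ** 2), x0=np.zeros(2))

Note that no derivative of f is ever computed or estimated; only evaluations of the cost and the sign-correlation of successive changes drive the updates.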