Neural Computation

January 2012, Vol. 24, No. 1, Pages 104-133
(doi: 10.1162/NECO_a_00200)
© 2011 Massachusetts Institute of Technology
Recurrent Kernel Machines: Computing with Infinite Echo State Networks
Echo state networks (ESNs) are large, random recurrent neural networks with a single trained linear readout layer. Despite their untrained recurrent weights, they are capable of performing universal computations on temporal input data, which makes them interesting for both theoretical research and practical applications. The key to their success is that the network computes a broad set of nonlinear, spatiotemporal mappings of the input data, on which linear regression or classification can easily be performed. One can therefore regard the reservoir as a spatiotemporal kernel in which the mapping to a high-dimensional space is computed explicitly. In this letter, we build on this idea and extend the concept of ESNs to infinite-sized recurrent neural networks, which can be regarded as recursive kernels and subsequently used to construct recursive support vector machines. We present the theoretical framework, provide several practical examples of recursive kernels, and apply them to typical temporal tasks.
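To make the setup described in the abstract concrete, the following is a minimal sketch of a finite echo state network: a fixed random reservoir driven by a temporal input, with only a linear readout trained by ridge regression. All sizes, scaling factors, and the toy prediction task are illustrative assumptions, not taken from the letter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed sizes: 1 input dimension, 100 reservoir units.
n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
# Rescale the recurrent weights so the spectral radius is below 1,
# a common sufficient condition for the echo state property.
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the untrained reservoir with input sequence u; collect states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x.copy())
    return np.array(states)

# Toy temporal task: one-step-ahead prediction of a sine wave.
t = np.arange(300)
u = np.sin(0.2 * t)
X = run_reservoir(u[:-1])   # reservoir states = explicit spatiotemporal features
y = u[1:]                   # targets: next input value

# The only trained part: a linear readout, fit by ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
pred = X @ W_out
```

The reservoir states `X` play the role of the explicit high-dimensional feature map; taking the reservoir size to infinity, as the letter does, replaces this explicit map with a recursively defined kernel.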