Neural Computation
The MIT Press
Volume 4, Issue 2
Mar 19, 1992
ISSN: 0899-7667
Table of Contents
First- and Second-Order Methods for Learning: Between Steepest Descent and Newton's Method
Roberto Battiti
Page 141
Efficient Simplex-Like Methods for Equilibria of Nonsymmetric Analog Networks
Douglas A. Miller and Steven W. Zucker
Page 167
A Volatility Measure for Annealing in Feedback Neural Networks
Joshua Alspector, Torsten Zeppenfeld and Stephan Luna
Page 191
What Does the Retina Know about Natural Scenes?
Joseph J. Atick and A. Norman Redlich
Page 196
A Simple Network Showing Burst Synchronization without Frequency Locking
Christof Koch and Heinz Schuster
Page 211
On a Magnitude Preserving Iterative MAXnet Algorithm
Bruce W. Suter and Matthew Kabrisky
Page 224
Learning Complex, Extended Sequences Using the Principle of History Compression
Jürgen Schmidhuber
Page 234
A Fixed Size Storage O(n^3) Time Complexity Learning Algorithm for Fully Recurrent Continually Running Networks
Jürgen Schmidhuber
Page 243
How Tight Are the Vapnik-Chervonenkis Bounds?
David Cohn and Gerald Tesauro
Page 249
Working Memory Networks for Learning Temporal Order with Application to Three-Dimensional Visual Object Recognition
Gary Bradski, Gail A. Carpenter and Stephen Grossberg
Page 270