Abstract:
We study model feed-forward networks as time series
predictors in the stationary limit. The focus is on complex, yet
non-chaotic, behavior. The main question we address is whether the
asymptotic behavior is governed by the architecture, regardless of
the details of the weights. We find hierarchies among classes of
architectures with respect to the attractor dimension of the
long-term sequences they are capable of generating; a larger number
of hidden units can generate higher-dimensional attractors. In the
case of a perceptron, we develop the stationary solution for a
general weight vector and show that the flow is typically
one-dimensional. The relaxation time from an arbitrary initial
condition to the stationary solution is found to scale linearly
with the size of the network. In multilayer networks, the number of
hidden units bounds the number and dimension of the possible
attractors. We conclude that long-term prediction (in the
non-chaotic regime) with such models is governed by attractor
dynamics related to the architecture.