Parallel Distributed Processing, Volume 1

Explorations in the Microstructure of Cognition: Foundations

What makes people smarter than computers? These volumes by a pioneering neurocomputing group suggest that the answer lies in the massively parallel architecture of the human mind. They describe a new theory of cognition called connectionism that is challenging the idea of symbolic computation that has traditionally been at the center of debate in theoretical discussions about the mind.

The authors' theory assumes the mind is composed of a great number of elementary units connected in a neural network. Mental processes are interactions between these units, which excite and inhibit one another in parallel rather than in sequential operations. In this context, knowledge can no longer be thought of as stored in localized structures; instead, it consists of the connections between pairs of units that are distributed throughout the network.
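The processing scheme described above can be illustrated with a minimal sketch. This is not code from the book; the weight values, the synchronous update rule, and the logistic squashing function are illustrative assumptions chosen to show how positive connections excite and negative connections inhibit units updated in parallel.

```python
import math

def step(activations, weights):
    """One synchronous (parallel) update of every unit.

    Net input to unit i is sum_j weights[i][j] * activations[j];
    a logistic squashing function keeps each activation in (0, 1).
    All net inputs are computed from the old activations before any
    unit changes, so the units update in parallel, not sequentially.
    """
    n = len(activations)
    nets = [sum(weights[i][j] * activations[j] for j in range(n))
            for i in range(n)]
    return [1.0 / (1.0 + math.exp(-net)) for net in nets]

# Three units: unit 0 excites unit 1 (+2.0), unit 2 inhibits it (-2.0).
weights = [
    [0.0, 0.0, 0.0],
    [2.0, 0.0, -2.0],
    [0.0, 0.0, 0.0],
]
acts = step([1.0, 0.5, 0.0], weights)
```

With unit 0 active and unit 2 silent, unit 1 receives a net input of +2.0 and its activation rises above the 0.5 resting level, while the "knowledge" that unit 0 supports unit 1 lives entirely in the weight matrix rather than in any stored symbol.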

Volume 1 lays the foundations of this exciting theory of parallel distributed processing, while Volume 2 applies it to a number of specific issues in cognitive science and neuroscience, with chapters describing models of aspects of perception, memory, language, and thought.

Table of Contents

  1. Preface
  2. Acknowledgments
  3. Addresses of the PDP Research Group
  4. I. The PDP Perspective
  5. 1. The Appeal of Parallel Distributed Processing

    J.L. McClelland, D.E. Rumelhart, and G.E. Hinton

  6. 2. A General Framework for Parallel Distributed Processing

    D.E. Rumelhart, G.E. Hinton, and J.L. McClelland

  7. 3. Distributed Representations

    G.E. Hinton, J.L. McClelland, and D.E. Rumelhart

  8. 4. PDP Models and General Issues in Cognitive Science

    D.E. Rumelhart and J.L. McClelland

  9. II. Basic Mechanisms
  10. 5. Feature Discovery by Competitive Learning

    D.E. Rumelhart and D. Zipser

  11. 6. Information Processing in Dynamical Systems: Foundations of Harmony Theory

    P. Smolensky

  12. 7. Learning and Relearning in Boltzmann Machines

    G.E. Hinton and T.J. Sejnowski

  13. 8. Learning Internal Representations by Error Propagation

    D.E. Rumelhart, G.E. Hinton, and R.J. Williams

  14. III. Formal Analyses
  15. 9. An Introduction to Linear Algebra in Parallel Distributed Processing

    M.I. Jordan

  16. 10. The Logic of Activation Functions

    R.J. Williams

  17. 11. An Analysis of the Delta Rule and the Learning of Statistical Associations

    G.O. Stone

  18. 12. Resource Requirements of Standard and Programmable Nets

    J.L. McClelland

  19. 13. P3: A Parallel Network Simulating System

    D. Zipser and D. Rabin

  20. References
  21. Index