Coordinates of Multisensory Spatial Memory

 A. Pouget
  
 

Abstract:
In the primary cortices, the sensory modalities use distinct sets of coordinates to encode location; vision is initially retinotopic, while auditory space is initially head-centered. Yet higher up in the cortical hierarchy, these sensory modalities are merged, and the question arises as to which frame of reference is used for this merging process. The results of a series of pointing experiments in virtual reality suggest that the retinal frame of reference plays a central role. It appears that the locations of visual, auditory, proprioceptive, and imagined targets are all remembered in retinal coordinates. These results are particularly surprising for auditory targets, since a pointing motor command could in principle be computed directly from the head-centered location of the target, without recovering its retinal position. The results demonstrate the pivotal role of the visual system in human spatial function.
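
To make the coordinate relation concrete, the sketch below illustrates a simplified, rotation-only model (an assumption for illustration, not a description of the paper's methods) in which a target's eye-centered (retinal) direction is obtained from its head-centered direction by undoing the current eye-in-head rotation; the function names and the azimuth/elevation parameterization are hypothetical.

import numpy as np

def rotation_yaw_pitch(azimuth_deg, elevation_deg):
    """Rotation matrix for a direction given as azimuth (yaw about z) and elevation (pitch about y)."""
    az, el = np.radians([azimuth_deg, elevation_deg])
    yaw = np.array([[np.cos(az), -np.sin(az), 0.0],
                    [np.sin(az),  np.cos(az), 0.0],
                    [0.0,         0.0,        1.0]])
    pitch = np.array([[ np.cos(el), 0.0, np.sin(el)],
                      [ 0.0,        1.0, 0.0       ],
                      [-np.sin(el), 0.0, np.cos(el)]])
    return yaw @ pitch

def head_to_eye_centered(target_head, gaze_azimuth_deg, gaze_elevation_deg):
    """Re-express a head-centered target vector in eye-centered (retinal) coordinates
    by undoing the eye-in-head rotation (eye/head translation ignored for clarity)."""
    R_gaze = rotation_yaw_pitch(gaze_azimuth_deg, gaze_elevation_deg)
    return R_gaze.T @ target_head  # inverse of the gaze rotation

# Example: target 30 deg off the head midline, eyes rotated 20 deg in the same direction.
target_head = rotation_yaw_pitch(30.0, 0.0) @ np.array([1.0, 0.0, 0.0])
target_eye = head_to_eye_centered(target_head, 20.0, 0.0)
print(np.degrees(np.arctan2(target_eye[1], target_eye[0])))  # ~10 deg retinal eccentricity

In this simplified picture, a head-centered target location already suffices to drive pointing, which is why it is surprising that auditory targets nonetheless appear to be stored in retinal coordinates, a representation that must be updated with every eye movement.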

 
 


© 2010 The MIT Press