The Handbook of Multisensory Processes
Neuropsychological Evidence of Integrated Multisensory Representation of Space in Humans

Introduction

This chapter focuses on multisensory behavioral phenomena that can be observed in humans affected by extinction or neglect, two neuropsychological signs of altered spatial perception. By studying pathologic behavioral phenomena that become manifest following damage to the central nervous system (CNS), neuropsychological research has contributed substantially to our understanding of the normal organization of cognitive brain functions, and it serves as a natural interface among several disciplines. In this chapter we describe a series of neuropsychological investigations of multimodal spatial phenomena that are tightly linked to other fields of neuroscience, such as neurophysiology and cognitive psychology, allowing for a direct comparison with the behavioral characteristics of the cross-modal construction of space in healthy subjects and with the neural bases of cross-modal spatial behavior in both animals and humans.

We review several highly convergent behavioral findings that provide strong evidence in favor of the existence, in humans, of multimodal integration systems representing space through the multisensory coding of tactile-visual and tactile-auditory events, as well as visual-auditory events. In addition, these findings show that multimodal integration systems may differ with respect to some of their functional characteristics, such as the portion of space in which the multisensory integration occurs. For example, some types of integrated processing (e.g., those involving the tactile modality) may take place in a privileged manner within a limited sector of space closely surrounding the body surface, that is, in near peripersonal space. By contrast, the integrated processing of auditory and visual events may occur within a larger sector of space, in far peripersonal space. These findings are entirely consistent with the functional properties of multisensory neuronal structures coding (near and far) peripersonal space in animals, as well as with behavioral, electrophysiological, and neuroimaging evidence for the cross-modal coding of space in normal subjects. This high level of convergence ultimately favors the idea that multisensory space coding is achieved through similar multimodal structures in both humans and nonhuman primates.

Research in neuropsychology, as in many other psychological domains, has historically focused on a single sensory modality at a time. In real-world situations, however, we typically receive a simultaneous flow of information from each of our different senses, so our perception of objects in the world is also the product of integrated, multisensory processing. Having many sources of input that can operate simultaneously, or substitute for one another when necessary, frees animals from many constraints. The integration of multiple sensory cues also provides animals with enormous flexibility, so that their reaction to the presence of one stimulus can be altered by the presence of another. The combination of, for example, visual and tactile cues or visual and auditory cues can enhance the response produced by processing a cue in a single modality and can also resolve the ambiguity that arises when a stimulus from one modality is not fully detected.

Some of the systems responsible for such integrative sensory processing have now been documented physiologically, and their relevance for space coding has been shown. However, despite a long tradition of single-cell studies on multimodal neurons in animals, the existence of these integrated systems has only recently been investigated in humans. Such a relatively small (but rapidly increasing) number of studies is surprising, given that these multisensory systems offer a unique opportunity for recovery from cognitive impairment following brain lesions. For instance, altered performance in a unimodal sensory system can be influenced—enhanced or degraded—by the activation of another modality. Impairments in spatial representation, such as extinction or neglect, may be caused by a loss of neurons representing particular locations in space in one single modality (Pouget & Sejnowski, 1997). Stimuli presented in that portion of space are neglected or extinguished (Làdavas, Berti, & Farnè, 2000), whereas stimuli presented in the intact portion of space are detected. Whenever the perceptual problems due to extinction or neglect are limited to the impairment of unimodal space representations, detection in these patients can be restored by the activation of an integrated system that codes stimuli presented in the affected space in different modalities.

The neurophysiological bases that would be necessary for producing such a perceptual enhancement through multisensory integrative processing have been revealed in nonhuman primates. For example, neurons have been reported in the putamen and in the parietal and frontal cortices that are multimodal: they respond to both tactile and visual stimuli. Other multimodal neurons respond to tactile, visual, and auditory stimuli. Besides showing multisensory responses, these neurons are strongly activated only when visual or auditory stimuli are located in spatial proximity to a particular body part (e.g., to the face and hand) where the tactile receptive field (RF) of a given neuron is located. That is, multimodal responses are evoked most effectively by presenting visual and auditory stimuli within the visual and auditory RFs extending outward from their tactile RF. Therefore, these brain areas are specialized for the coding of visual-auditory space immediately surrounding the body (i.e., near peripersonal space).

Such a multisensory integration system would be very useful for recovery from tactile extinction. In the first part of this chapter, we review evidence for the existence in humans of an integrated visuotactile system and an auditory-tactile system responsible for coding near peripersonal space, as well as for their relevance to the modulation of tactile extinction.

Neurophysiological studies in the cat and monkey have also revealed multimodal neurons in the superior colliculus (SC) that synthesize visual, auditory, and/or somatosensory inputs and are relevant for behavioral responses such as attending and orienting to sensory stimuli presented in far peripersonal space. The activation of this integrated visual-auditory system could be very useful for recovery from visuospatial impairments such as visual neglect. In this respect, the integration of visual and auditory information can potentially enable patients with visual neglect to detect “bimodal” stimuli whose visual unimodal component is below the behavioral threshold. Thus, in the second part of this chapter, we provide evidence for the existence in humans of an integrated visual-auditory system responsible for coding far peripersonal space and for its relevance to the temporary recovery from visual neglect.



© 2010 The MIT Press