MIT CogNet, The MIT Press
The Handbook of Multisensory Processes
Cross-Modal Interactions Evidenced by the Ventriloquism Effect in Humans and Monkeys
Introduction

The external world is filled with objects of interest and importance, many of which can be defined by several different sensory modalities. For example, an object generally has a specific shape and color, makes a distinctive noise under certain conditions, has a particular texture and weight, and in some cases has a characteristic smell and taste. Normally, all of these sensory attributes can be readily discerned and combined by an observer to form a unified percept of a single object. Even when the input from one sensory modality is slightly discordant with that from the others, it may nevertheless be judged as consistent and combined into a coherent whole. However, when multiple sensory attributes convey contradictory information, or when the input from one sensory modality differs markedly from the input from the others, the resulting perception is of two or more distinct objects. The manner in which information from different sensory modalities is combined to form single or multiple objects is intriguing, but there is no clear understanding of the “rules” that govern multisensory integration at the level of perception or at the level of the nervous system.

One way to investigate this issue is to manipulate the sensory stimuli enough that the incoming information is inconsistent with a single object. The result is the percept of a “normal” single object, an “unusual” single object, or two (or more) separate objects. Multiple studies have been performed in which discordant auditory, visual, and/or somatosensory information is presented to an observer and the resulting perception is measured (Hay, Pick, & Ikeda, 1965; Pick, Warren, & Hay, 1969; Thomas, 1940; see also Welch & Warren, 1980). In experiments such as these, the influence of one stimulus modality on another can be elucidated, and the degree of congruence among inputs necessary for the perception of one versus multiple objects can be inferred. These studies have led to a better understanding of how different sensory modalities are integrated, but they have not provided data regarding the underlying neuronal mechanisms of this integration.

In separate experiments, the activity of central nervous system (CNS) structures has been measured during the presentation of multisensory stimuli, and several brain regions have been identified that respond to two or more sensory modalities. However, these experiments do not directly address how the global neuronal activity relates to the perception of multisensory stimuli. This gap is largely due to the lack of an effective animal model that can be used to identify not only the areas of the brain that are responsive to multisensory stimuli, but also the fundamental characteristics of the neurons themselves. Such a model would allow activity at the level of the single neuron to be correlated directly with the various stimuli. Ideally, this information could be incorporated into a paradigm in which the animal signifies its perception of a single object or multiple objects (or other perceptual details) while multisensory stimulus parameters are actively manipulated during neuronal recording. Such data would be eminently useful in determining how the human brain creates coherent perceptions from the innumerable stimuli it encounters daily.

This chapter focuses on studies of auditory and visual cross-modal interactions at the perceptual level in humans and monkeys, with the aim of conceptualizing simple neuronal models of the sensory integration necessary for the perception of multisensory objects. Issues central to the interaction and integration of auditory and visual stimuli in human subjects are presented and reviewed, along with evidence that an appropriate animal model can be developed and then used to analyze directly the relationship between neural activity and perception.

 


© 2010 The MIT Press