Sensory systems function as change detectors in many respects. They quickly adapt to steady-state stimuli and are easily excited by the introduction of novel stimuli. The pattern of changes in an acoustic stimulus conveys information about the nature of the sound source and the message being transmitted by the sender. Therefore, the identification, discrimination, and interpretation of acoustic events depend on the ability of the auditory system to faithfully encode the temporal features of those events. This ability to respond to changes in an acoustic stimulus has been termed temporal resolution. Although most natural acoustic signals are characterized by changes in intensity as well as changes in the acoustic spectrum over time, investigations of temporal resolution have focused on intensity variations in an attempt to separate purely temporal from spectro-temporal resolving capabilities (see auditory scene analysis). Temporal resolution is limited by auditory inertia resulting from mechanical and/or electrophysiological transduction processes. Such a limitation effectively smooths or attenuates the intensive changes of a stimulus, which reduces the salience of those changes. Impaired temporal resolution may be conceptualized as an increase in this smoothing process, and thus a loss of temporal information.
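To make this smoothing notion concrete, the following minimal sketch treats the limitation as a first-order low-pass filter (a "leaky integrator") applied to the stimulus envelope. The sampling rate, gap duration, and time constants below are illustrative assumptions rather than measured values; the point is simply that a longer time constant leaves a shallower, less salient dip at the location of a silent gap.

```python
import numpy as np

fs = 16000                     # sampling rate (Hz), illustrative
t = np.arange(0, 0.5, 1 / fs)  # 500 ms of stimulus envelope

# Envelope of a stimulus containing a 10-ms silent gap starting at 250 ms.
envelope = np.ones_like(t)
envelope[(t >= 0.250) & (t < 0.260)] = 0.0

def smooth(env, tau, fs):
    """First-order 'leaky integrator' smoothing with time constant tau (s)."""
    alpha = 1.0 - np.exp(-1.0 / (tau * fs))   # per-sample update weight
    out = np.empty_like(env)
    acc = env[0]
    for i, x in enumerate(env):
        acc += alpha * (x - acc)
        out[i] = acc
    return out

normal = smooth(envelope, tau=0.003, fs=fs)    # ~3-ms time constant (assumed)
impaired = smooth(envelope, tau=0.010, fs=fs)  # longer constant = heavier smoothing

# Heavier smoothing leaves a shallower dip, i.e., a less salient gap.
print("gap depth after smoothing, shorter time constant:", round(1 - normal.min(), 3))
print("gap depth after smoothing, longer time constant: ", round(1 - impaired.min(), 3))
```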
The influence of hearing impairment on temporal resolution depends on the site of lesion. For example, conductive hearing loss is often modeled as a simple attenuation characteristic and thus should not alter temporal resolution, given sufficient stimulus levels. Damage at the level of the cochlea, however, involves more than attenuation. Reduced outer hair cell function is associated with a reduction in sensitivity, frequency selectivity, and compression at the level of the basilar membrane. Each of these might influence the perception of intensity changes. For example, a loss of basilar membrane compression might provide a more salient representation of intensity changes and thus lead to improved performance on temporal resolution tasks involving such changes. Reduced frequency selectivity is analogous to broadening of a filter characteristic, which is associated with a shorter temporal response. This too might lead to improved temporal resolution. A loss of inner hair cell function, however, would reduce the quality and amount of information transmitted to the central auditory pathway, and might therefore lead to poor coding of temporal features. The altered neural function associated with a retrocochlear lesion may also lead to a less faithful representation of the temporal features of a sound.
Numerous techniques have been used to probe temporal resolution abilities; however, the two most common techniques are temporal gap detection and amplitude modulation detection (Fig. 1). Following the notion of auditory inertia, Plomp (1964) investigated the rate of decay of auditory sensation by measuring the minimum detectable silent interval between two broadband noise pulses as a function of the relative level of the two pulses. When the pulses surrounding the gap were equal in level, the minimum detectable gap was about 3 ms. Gap detection thresholds deteriorate as stimulus level falls below about 30 dB sensation level (e.g., Plomp, 1964; Penner, 1977; Buus and Florentine, 1983; Florentine and Buus, 1984). Thus, reduced audibility associated with hearing loss may result in longer than normal gap detection thresholds. For patients with conductive or sensorineural hearing loss, gap detection thresholds for broadband noise are longer than normal at low stimulus levels. At higher stimulus levels, gap thresholds are within normal limits for conductive loss but remain longer than normal for listeners with sensorineural hearing loss (Irwin, Hinchcliff, and Kemp, 1981).
Figure 1.
Schematic diagram of a two-interval, forced-choice psychophysical paradigm used to estimate gap detection thresholds (top row) and sinusoidal amplitude modulation (SAM) detection thresholds (bottom row). Stimulus waveforms are shown for each of two observation intervals. A broadband noise standard is shown in interval 1 and a noise with a temporal gap (64 ms) or amplitude modulation (6 dB) is shown in interval 2. Correct and incorrect responses are listed.
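The following sketch generates the kinds of stimuli diagrammed in Figure 1: a broadband noise standard for one observation interval, and a comparison noise containing either a 64-ms silent gap or 6 dB of sinusoidal amplitude modulation for the other. The sampling rate, interval duration, modulation rate, and the reading of "6 dB" as a peak-to-trough envelope ratio are illustrative assumptions, not parameters taken from the studies cited here.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 44100                 # sampling rate (Hz), illustrative
dur = 0.400                # 400-ms observation intervals, illustrative
n = int(dur * fs)

def broadband_noise(n):
    """Gaussian broadband noise, scaled to unit RMS."""
    x = rng.standard_normal(n)
    return x / np.sqrt(np.mean(x ** 2))

# Interval 1: unbroken, unmodulated noise standard.
standard = broadband_noise(n)

# Interval 2a: noise with a 64-ms silent gap centered in the stimulus
# (the gap duration shown in Fig. 1).
gap_stim = broadband_noise(n)
gap_len = int(0.064 * fs)
start = (n - gap_len) // 2
gap_stim[start:start + gap_len] = 0.0

# Interval 2b: sinusoidally amplitude-modulated (SAM) noise,
#   s(t) = [1 + m * sin(2*pi*fm*t)] * noise(t).
# Reading the 6 dB in Fig. 1 as a peak-to-trough envelope ratio gives
#   (1 + m) / (1 - m) = 10**(6/20), i.e., m ~= 0.33.
fm = 8.0                                          # modulation rate (Hz), illustrative
m = (10 ** (6 / 20) - 1) / (10 ** (6 / 20) + 1)
t = np.arange(n) / fs
sam_stim = (1 + m * np.sin(2 * np.pi * fm * t)) * broadband_noise(n)
```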
To gauge temporal resolution in different frequency regions, one may measure gap detection thresholds using band-limited noise. Results from listeners with normal hearing reveal that gap thresholds improve with increasing stimulus level up to about 30 dB sensation level (e.g., Buus and Florentine, 1983) and improve with increasing noise bandwidth (e.g., Shailer and Moore, 1983; Eddins, Hall, and Grose, 1992), but vary little with frequency region when noise bandwidth (in Hz) is held constant (e.g., Eddins et al., 1992). With hearing loss of cochlear origin, gap detection is often worse than normal using band-limited noise (e.g., Fitzgibbons and Wightman, 1982; Fitzgibbons and Gordon-Salant, 1987); however, this is not true for all listeners with cochlear hearing loss (e.g., Florentine and Buus, 1984; Glasberg and Moore, 1989; Grose, Eddins, and Hall, 1989). Thus, cochlear hearing loss does not necessarily result in poorer than normal temporal resolution.
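A band-limited noise of the kind used in these studies can be sketched as follows, by zeroing the spectral components of a Gaussian noise outside the desired band. The bandwidth, center frequencies, and brick-wall (FFT-based) filtering are illustrative assumptions; the cited experiments used their own filtering and calibration procedures. The point is that the same bandwidth in Hz can be placed in different frequency regions, as in the constant-bandwidth comparisons described above.

```python
import numpy as np

def band_limited_noise(center_hz, bandwidth_hz, dur, fs, rng):
    """Gaussian noise restricted to center_hz +/- bandwidth_hz/2 by zeroing
    FFT components outside the band (brick-wall filtering); unit RMS."""
    n = int(dur * fs)
    spectrum = np.fft.rfft(rng.standard_normal(n))
    freqs = np.fft.rfftfreq(n, 1 / fs)
    lo, hi = center_hz - bandwidth_hz / 2, center_hz + bandwidth_hz / 2
    spectrum[(freqs < lo) | (freqs > hi)] = 0.0
    x = np.fft.irfft(spectrum, n)
    return x / np.sqrt(np.mean(x ** 2))

rng = np.random.default_rng(1)
fs = 44100  # sampling rate (Hz), illustrative

# The same 800-Hz bandwidth (an arbitrary example) placed in two frequency regions.
low_region = band_limited_noise(center_hz=1000, bandwidth_hz=800, dur=0.4, fs=fs, rng=rng)
high_region = band_limited_noise(center_hz=4000, bandwidth_hz=800, dur=0.4, fs=fs, rng=rng)
```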
Temporal gap detection thresholds measured for sinusoidal stimuli do not vary substantially with stimulus frequency from 400 to 2000 Hz, but increase markedly at and below 200 Hz (e.g., Shailer and Moore, 1987; Moore, Peters, and Glasberg, 1992). Although listeners with hearing impairment may have worse than normal gap detection thresholds for noise stimuli, gap detection thresholds for tonal stimuli are normal when compared at equivalent sound pressure levels and are better than normal at equal sensation levels (Moore and Glasberg, 1988). One theory consistent with these results is that gap detection is limited to some extent by the inherent fluctuations in narrow-band noise, and this effect may be accentuated by the loudness recruitment of some hearing-impaired listeners (e.g., Moore and Glasberg, 1988). Sinusoids, having a smooth temporal envelope, would not be subject to such a limitation. This leads to the possibility that temporal resolution per se may not be adversely affected by cochlear hearing loss (e.g., Moore and Glasberg, 1988). If gap detection is influenced by loudness recruitment, then one would expect a relationship between gap detection and intensity resolution. Indeed, gap detection for sinusoids is correlated with intensity resolution for sinusoids (Glasberg and Moore, 1989) and gap detection for band-limited noise is correlated with intensity resolution for band-limited noise (Eddins and Manegold, 2001). This highlights the potential role of intensity resolution in a task such as gap detection. Poor gap detection thresholds may result from poor intensity resolution, poor temporal resolution, or a combination of the two. Listeners with cochlear implants offer a unique perspective on temporal resolution in that the auditory periphery, save for the auditory nerve, is bypassed. Gap detection in such listeners, using electrical stimulation via the implant, is as good as gap detection for listeners with normal hearing using acoustic stimulation (e.g., Shannon, 1989; Moore and Glasberg, 1988). This is consistent with the notion that gap detection may not be strongly dependent upon cochlear processes.
While gap detection thresholds may be strongly influenced by a listener's intensity resolution, the amplitude modulation detection paradigm provides an opportunity to separate the effects of intensity resolution from those of temporal resolution. A modulation detection threshold is obtained by determining the minimum depth of modulation necessary to discriminate an unmodulated from a sinusoidally amplitude-modulated stimulus. With this technique, temporal resolution can be more completely described as the change in modulation threshold over a range of fluctuation rates (modulation frequencies). With the assumption that intensity resolution does not vary with modulation frequency, a separate index of intensity resolution may be obtained from modulation detection thresholds at very low modulation frequencies. If loudness recruitment associated with cochlear hearing loss has a negative influence on gap detection in narrow-band noise, as suggested above, then one might predict that recruitment would enhance the perception of fluctuations introduced by amplitude modulation. Using broadband noise carriers, this does not seem to be the case. Modulation detection thresholds for listeners with cochlear hearing loss may be normal or worse than normal, but are not better than normal (Bacon and Viemeister, 1985; Bacon and Gleitman, 1992). Similarly, modulation detection using band-limited noise is not worse than normal in listeners with cochlear hearing loss (e.g., Moore, Shailer, and Schooneveldt, 1992; Hall et al., 1998). Modulation detection with tonal carriers, however, tends to be better than normal in listeners with cochlear hearing loss, and the perceived depth of modulation appears to be related to the steepness of loudness growth (Moore, Wojtczak, and Vickers, 1996; Moore and Glasberg, 2001). This is quite different from amplitude-modulated noise stimuli, for which threshold does not seem to be related to loudness growth (Hall et al., 1998). As in the gap detection paradigm, there are marked differences between the results obtained with noise and tonal stimuli. Thus, it is possible that the relation between loudness growth and intensive changes is different for sinusoidal and noise stimuli.
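As a rough sketch of how such a set of modulation detection thresholds (a temporal modulation transfer function) might be assembled, the loop below runs a simple 2-down, 1-up adaptive track at each of several modulation frequencies and records the threshold as 20 log m. The `present_2ifc_trial` function is a hypothetical simulated listener included only so the sketch runs end to end; its internal thresholds, along with the step size, starting depth, and modulation frequencies, are arbitrary assumptions and do not represent data from the studies cited here.

```python
import numpy as np

rng = np.random.default_rng(2)

def present_2ifc_trial(mod_freq_hz, depth_m):
    """Hypothetical simulated listener (assumption, not data): percent correct
    rises from chance (0.5) toward 1.0 as the modulation depth exceeds an
    assumed internal threshold that worsens with modulation rate."""
    assumed_threshold_m = 0.05 * (1 + mod_freq_hz / 50.0)
    p_correct = 0.5 + 0.5 * (1 - np.exp(-depth_m / assumed_threshold_m))
    return rng.random() < p_correct

def track_threshold(mod_freq_hz, start_db=-6.0, step_db=2.0, n_turnarounds=8):
    """2-down, 1-up adaptive track on modulation depth in dB (20*log10 m);
    converges near the 70.7%-correct point of the psychometric function."""
    depth_db, run, last_dir, turns = start_db, 0, 0, []
    while len(turns) < n_turnarounds:
        correct = present_2ifc_trial(mod_freq_hz, 10 ** (depth_db / 20))
        if correct:
            run += 1
            if run < 2:
                continue                    # need two correct before a step down
            run, new_dir = 0, -1            # two correct in a row -> harder
        else:
            run, new_dir = 0, +1            # one error -> easier
        if last_dir != 0 and last_dir == -new_dir:
            turns.append(depth_db)          # direction reversal = turnaround
        depth_db = min(depth_db + new_dir * step_db, 0.0)   # m cannot exceed 1
        last_dir = new_dir
    return np.mean(turns[-6:])              # threshold: mean of last turnarounds

# Temporal modulation transfer function: threshold (20 log m) vs. modulation rate.
tmtf = {fm: round(track_threshold(fm), 1) for fm in (2, 4, 8, 16, 32, 64, 128)}
print(tmtf)
```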
Some listeners with cochlear pathology have worse than normal modulation detection using noise carriers, as do listeners with Ménière's disease (Formby, 1987), eighth nerve tumors (Formby, 1986), and auditory neuropathy (Zeng et al., 1999). Interestingly, listeners with cochlear implants perform as well as normal-hearing subjects on amplitude-modulation tasks (Shannon, 1992).
In summary, listeners with abnormal cochlear function often exhibit reduced performance on gap and modulation detection tasks with noise but not sinusoidal stimuli. Studies of temporal resolution using other experimental techniques have yielded results that are generally consistent with those discussed here. These results are consistent with an interpretation that cochlear pathology may not lead to reduced temporal resolution per se, but may lead to difficulty perceiving stimuli with pronounced, random intensity fluctuations (Grose et al., 1989; Hall and Grose, 1997; Hall et al., 1998). Although many hearing-impaired listeners perform as well as normal-hearing listeners on tasks involving temporal resolution, especially when stimuli are presented at optimal levels and have relatively smooth temporal envelopes, such listeners are likely to have difficulty in natural listening environments with fluctuating backgrounds. Thus, measures of gap and amplitude modulation detection using noise stimuli might have promise as predictors of communication difficulty in realistic environments.