Abstract: In two ERP experiments we investigated how and when
listeners relate an unfolding spoken sentence to a semantic
representation of the wider discourse. In the main experiment,
subjects listened to spoken stories, some of which contained a
critical word that, although coherent within its local carrier
sentence, did not fit the wider discourse (e.g., "Jane told the
brother that he was exceptionally SLOW today" in a discourse where
he had in fact been very quick). Relative to a discourse-coherent
control word (e.g., QUICK), these discourse-anomalous words
elicited a standard N400 effect, emerging some 150-200 ms after
acoustic word onset. In an isolated-sentences control experiment in
which we removed the wider discourse, the earlier N400 effect
disappeared completely, confirming that it indeed depended on the
discourse. These spoken-language results are virtually identical to
earlier ERP results obtained with written language (van Berkum,
Hagoort, & Brown, 1999), showing that language users extremely
rapidly match every incoming word against their model of the
discourse. Furthermore, because even spoken words of at least 550
ms long elicited a discourse-dependent N400 effect at about 150-200
ms, the current findings reveal that lexically still very
incomplete acoustic signals are nevertheless immediately evaluated
with respect to the global discourse. van Berkum, J.J.A., Hagoort,
P., & Brown, C.M. (1999). Semantic integration in sentences and
discourse: Evidence from the N400. Journal of Cognitive
Neuroscience, 11(6), 657-671.