
Journal of Cognitive Neuroscience

May 2003, Vol. 15, No. 5, Pages 718-730
(doi: 10.1162/jocn.2003.15.5.718)
© 2003 Massachusetts Institute of Technology
Language Lateralization in a Bimanual Language

Unlike spoken languages, sign languages of the deaf make use of two primary articulators, the right and left hands, to produce signs. This arrangement has no obvious parallel in speech, which is articulated by symmetrical, unitary midline vocal structures. It therefore affords a unique opportunity to examine the robustness of the linguistic systems underlying language production in the face of contrasting articulatory demands, and to chart the differential effects of handedness on highly skilled movements. Positron emission tomography (PET) was used to examine brain activation in 16 deaf users of American Sign Language (ASL) while they generated verb signs independently with their dominant right and nondominant left hands (compared to the repetition of noun signs). Nearly identical patterns of left inferior frontal and right cerebellar activity were observed for both hands. This pattern of activation during signing is consistent with patterns reported for spoken languages, including evidence for specializations of inferior frontal regions related to lexical–semantic processing, search and retrieval, and phonological encoding. These results indicate that lexical–semantic processing in production relies upon left-hemisphere regions regardless of the modality in which a language is realized, and that this left-hemisphere activation is stable even in the face of conflicting articulatory demands. In addition, these data provide evidence for the role of the right posterolateral cerebellum in linguistic–cognitive processing and for a left ventral fusiform contribution to sign language processing.