MIT CogNet, The Brain Sciences Connection (From the MIT Press)

Neural Systems Underlying Spatial Language in American Sign Language

 Karen Emmorey, Hanna Damasio, Stephen McCullough, Thomas Grabowski, Laurie Ponto, Richard Hichwa and Ursula Bellugi
Abstract:
Positron emission tomography (PET) was used to investigate the neural regions engaged in processing constructions unique to signed languages: classifier predicates, in which the position of the hands in signing space schematically represents spatial relations among objects. Ten deaf native signers viewed line drawings depicting a spatial relation between two objects (e.g., a cup on a table) and were asked either to produce a classifier construction or an ASL preposition describing the spatial relation, or to name the figure object (colored red). Compared to naming objects, describing spatial relationships with classifier constructions engaged the supramarginal gyrus (SMG) bilaterally and the superior parietal lobule on the right. Naming spatial relations with ASL prepositions engaged only the right SMG. Previous research indicates that retrieval of English prepositions also engages the SMG, but more inferiorly and primarily on the left. Compared to ASL prepositions, naming spatial relations with classifier constructions engaged left inferior temporal (IT) cortex, a region known to be activated when naming concrete objects in either ASL or English. Left IT may be engaged because the handshapes in classifier constructions encode information about object type (e.g., flat surface). Overall, the results suggest greater right-hemisphere involvement when expressing spatial relations in ASL, perhaps because signing space is used to encode the spatial relationship between objects.

 
 


© 2010 The MIT Press