
Cheng, Q. (2019). Effects of early language deprivation on brain connectivity: Language pathways in deaf native and late first-language learners of American Sign Language. Frontiers in Human Neuroscience, 13, 320.

This study investigated the role of early language experience in the development of neural language networks, with respect to both brain structure and language acquisition. The aim was to discover whether the critical period and the modality of language (visual or auditory) affect the connectivity of the language network. Three groups with different early language experience were compared: 1) deaf native signers of ASL; 2) hearing native English speakers who learned ASL as a second language; and 3) deaf individuals who experienced extreme language deprivation and learned ASL as their first language in adolescence or their early twenties. Using diffusion tensor imaging (DTI), the study examined the white matter fibers that connect different parts of the brain. Results showed that the language-deprived deaf individuals had altered microstructure in some language-related pathways compared with deaf native signers, whereas deaf native signers showed connectivity within language-related pathways similar to that of hearing native speakers of English. Connectivity deficits in dorsal pathways correlated with poorer syntactic performance, but ventral language pathways showed greater variation, which may allow some plasticity for meaning-making after the critical period of development. The study provides further evidence that early language experience is required for the full development of the brain's language pathways.

Mayberry, R. I., Chen, J.-K., Witcher, P., & Klein, D. (2011). Age of acquisition effects on the functional organization of language in the adult brain. Brain & Language, 119, 16–29.

Using functional magnetic resonance imaging (fMRI), the researchers neuroimaged deaf adults as they performed two linguistic tasks with sentences in American Sign Language: grammatical judgment and phonemic hand judgment. Participants' age of onset of sign language acquisition ranged from birth to 14 years; length of sign language experience was substantial and did not vary in relation to age of acquisition. For both tasks, a left-lateralized pattern of activation was observed, with activity for grammatical judgment more anterior than that for phonemic hand judgment, which was more posterior by comparison. Age of acquisition was linearly and negatively related to activation levels in anterior language regions, and positively related to activation levels in posterior visual regions, for both tasks.

Poizner, H., Klima, E., & Bellugi, U. (1987). What the hands reveal about the brain. Cambridge, Massachusetts: MIT Press. 

This book provided ground-breaking research demonstrating that languages are processed in the left hemisphere of the brain, regardless of language modality. The researchers also describe parallels between the different forms of aphasic impairment in sign and spoken language users. They provide examples in spoken and sign languages of the two main kinds of aphasias, one located in the anterior areas of the left hemisphere and referred to as non-fluent aphasia, and the other found in the posterior regions, referred to as fluent aphasia.

Shield, A., Cooley, F., & Meier, R. P. (2017). Sign language echolalia in deaf children with autism spectrum disorder. Journal of Speech, Language, and Hearing Research, 60(6), 1622–1634.

This is the first study of sign language echolalia in native-signing deaf children with autism spectrum disorder (ASD). Deaf children with ASD sometimes echo signs, just as hearing children with ASD sometimes echo words, and both groups do so at similar stages of linguistic development, when comprehension is relatively low. The study examined the types of signs and the patterns seen in deaf children's echoes. Results revealed modality differences in deaf children's signed echoes, specifically in directionality, reduplication, and timing. Agreement verbs in American Sign Language use movement to indicate pronouns (I-SHOW-YOU, YOU-COPY-ME), and a number of the children's echoes contained errors in the pronoun subject or object because they were pure echoes in which directionality was not modified. Furthermore, the deaf children in this study reduplicated signs, whereas hearing children with ASD do not tend to reduplicate syllables of the words they echo. In addition, the deaf children echoed signs simultaneously, overlapping with their interlocutors' production, whereas hearing children with ASD do not echo words while overlapping with their interlocutors' speech. The study contributes insights into the possible motivations for echolalia and, more generally, into the nature of language and communication in children with ASD.