April 27, 2015
Brain Mapping of Language Impairments
At a Glance
- Researchers mapped different language impairments to specific brain regions to reveal the basic organization of our language system.
- The findings shed light on language processing and may lead to improved diagnosis and treatment of language impairments.
Through language—which includes sounds, gestures, and signs—we communicate our knowledge and beliefs. Between 6 million and 8 million people in the United States have some form of language impairment. In aphasia, portions of the brain that are responsible for expressing and understanding language are damaged. Aphasia usually occurs suddenly, often as the result of a stroke or head injury. It may also develop slowly, as in the case of a brain tumor, an infection, or dementia.
A team led by Dr. Daniel Mirman at Drexel University and Dr. Myrna F. Schwartz at the Moss Rehabilitation Research Institute set out to better understand the basis of language by studying people with aphasia using both neuroimaging and behavioral assessment. The research was funded by NIH’s National Institute on Deafness and Other Communication Disorders (NIDCD). Results appeared online on April 16, 2015, in Nature Communications.
The group studied 99 volunteers who had aphasia resulting from a stroke to the left side (hemisphere) of the brain. The participants averaged 58 years of age and used English as their native language.
The researchers asked the participants to complete a series of 17 language and cognitive measures that examined a wide range of language functions. These included the ability to perceive speech and process words both verbally and nonverbally (pictures), as well as short-term memory. For example, participants were asked whether 2 spoken words rhymed, to name items shown in pictures, to indicate whether spoken words were real English words or not, and to repeat lists of one-syllable words.
The scientists identified 2 major divisions in the way the language system is organized: the meaning of words versus their form, and speech recognition versus speech production. Crossing these divisions yielded 4 underlying factors. The researchers next examined high-resolution MRI or CT brain scans of the participants to map the locations of their lesions and relate them to their symptoms.
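The idea of collapsing a battery of test scores into a few underlying factors can be sketched in code. The snippet below uses principal component analysis (via SVD) as a simple stand-in for the factor analysis the researchers used, on entirely synthetic scores; the variable names and numbers are illustrative, not the study's data.

```python
# Sketch: reduce many test scores to a few latent dimensions.
# PCA via SVD stands in for the study's factor analysis; data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_participants, n_measures, n_factors = 99, 17, 4

# Simulate scores in which each measure reflects a few latent factors plus noise.
latent = rng.normal(size=(n_participants, n_factors))
loadings = rng.normal(size=(n_factors, n_measures))
scores = latent @ loadings + 0.5 * rng.normal(size=(n_participants, n_measures))

# Center the data and take the top 4 components.
centered = scores - scores.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
components = centered @ vt[:n_factors].T  # each participant's score on each component
var_explained = (s[:n_factors] ** 2).sum() / (s ** 2).sum()

print(components.shape)  # (99, 4)
print(round(var_explained, 2))
```

Because the simulated scores are driven by 4 latent factors, the top 4 components capture most of the variance, mirroring how 17 measures can be summarized by a handful of factors.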
The researchers found that the 4 factors were associated with various lesion areas. For example, speech production and speech recognition were associated with damage to adjacent regions. Whereas some factors were linked to distinct brain regions, others converged, suggesting that certain areas might have broader functional significance.
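Relating lesion locations to symptoms is often done with voxel-based lesion-symptom mapping: at each brain location, participants whose lesion covers that spot are compared with those whose lesion does not. The toy version below uses synthetic data and a hand-rolled Welch's t statistic; real analyses add registration, thresholds, and multiple-comparison correction, none of which is shown here.

```python
# Sketch of voxel-based lesion-symptom mapping on synthetic data:
# at each voxel, compare a behavioral score between participants with
# and without a lesion at that voxel.
import numpy as np

rng = np.random.default_rng(1)
n_participants, n_voxels = 99, 200

lesion = rng.random((n_participants, n_voxels)) < 0.3  # True = voxel is lesioned
score = rng.normal(size=n_participants)
score = score - 1.5 * lesion[:, 10]  # assume damage at voxel 10 lowers the score

t_stats = np.zeros(n_voxels)
for v in range(n_voxels):
    a, b = score[lesion[:, v]], score[~lesion[:, v]]
    # Welch's t statistic: lesioned group mean vs. spared group mean
    t_stats[v] = (a.mean() - b.mean()) / np.sqrt(
        a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b)
    )

print(int(np.argmin(t_stats)))  # voxel most strongly tied to the deficit
```

The most negative t statistic flags the voxel where damage most reliably predicts a lower score, which is the basic logic behind mapping symptoms onto lesion sites.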
“By studying language in people with aphasia, we can try to accomplish 2 goals at once: we can improve our clinical understanding of aphasia and get new insights into how language is organized in the mind and brain,” Mirman says.
“A major challenge facing speech-language pathologists is the wide diversity of symptoms that one sees in stroke aphasia,” Schwartz adds. “With this study, we took a major step towards explaining the symptom diversity in relation to a few primary underlying processes and their mosaic-like representation in the brain. These can serve as targets for new diagnostic assessments and treatment interventions.”
—by Carol Torgan, Ph.D.
Related Links
- How the Brain Sorts Out Speech Sounds
- Understanding How We Speak
- Words and Gestures Are Translated by Same Brain Regions
- Voice, Speech, and Language
Reference: Neural organization of spoken language revealed by lesion-symptom mapping. Mirman D, Chen Q, Zhang Y, Wang Z, Faseyitan OK, Coslett HB, Schwartz MF. Nat Commun. 2015 Apr 16;6:6762. doi: 10.1038/ncomms7762. PMID: 25879574.
Funding: NIH’s National Institute on Deafness and Other Communication Disorders (NIDCD).