Wednesday, March 3, 2010

Language Areas Of Brain Develop Without Hearing; Study of Deaf Yields Surprises

In most human beings, the capacity to understand, speak and read one's native language resides in specific regions of the left side of the brain. But what about deaf people who have grown up speaking sign language?

Researchers have wondered for years whether an infant's ability to hear sounds might play a role in determining which brain regions will assume responsibility for language functions. Are the classic "language areas" of the brain's left temporal, parietal and frontal lobes biologically hard-wired for the job? Or does sensory input from the ears program them for language tasks?

The findings of a study presented here last week at the annual meeting of the Society for Neuroscience clearly show that the ability to hear sounds isn't necessary for the development of the brain's language areas. In a deaf person whose native language is American Sign Language (ASL), watching signed sentences activates nerve cells in the same left-sided brain regions that a hearing person uses while engaging in conversation.

But in addition to those left-brain regions, researchers found, to their surprise, that watching ASL also activated areas of the right side of the brain. The scientists speculated that this may occur because someone observing sign language must constantly use visual and spatial skills controlled by the right half of the brain to keep track of the position and direction of gestures. And the researchers found that when reading English, these deaf people used a different part of the brain than hearing people did.

"The processing needed to perceive a sign, with its movements of arms and hands, requires additional resources to those necessary in processing sound or the shapes of a letter," said David Corina, an assistant professor of psychology at the University of Washington and one of the scientists on the study team. Researchers from the University of Oregon, Georgetown University, the National Institutes of Health and London's Institute of Neurology also participated.

The study compared brain activation patterns in three groups: eight hearing people, 16 people born deaf who had learned ASL as their native language (with English acquired later) and 13 hearing people born to deaf parents. The third group was bilingual: They had grown up using both ASL and English from early childhood. (ASL is a formal, grammatical language that is distinct from both English and other sign languages.)

The deaf subjects were students at Gallaudet College in Washington. Corina said most of the hearing children of deaf parents were people working as sign language interpreters in the Washington area. The other hearing subjects were chosen from a pool of research volunteers.

All of the subjects underwent brain imaging at the National Institutes of Health using functional magnetic resonance imaging (fMRI). The technique detects differences in blood flow and oxygen supply to various regions of the cerebral cortex, the brain's outer layer. Thus it can show which regions are using the most energy during a mental task.

fMRI scans were performed while each subject watched a video of a person alternately signing coherent ASL sentences and strings of nonsense signs.
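To make the logic of that comparison concrete, the sketch below (Python, using numpy and scipy) mimics a simple block-design contrast on synthetic data. None of the numbers come from the study itself; the voxel counts, block lengths and statistical threshold are assumptions chosen purely for illustration.

# A minimal, illustrative sketch only. The article does not describe the
# study's actual analysis pipeline, so everything quantitative here is
# invented. The idea is the same, though: compare the imaging signal
# recorded during coherent-ASL blocks with the signal recorded during
# nonsense-sign blocks, voxel by voxel.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)

n_voxels = 5000       # hypothetical number of cortical voxels in one scan
n_timepoints = 120    # hypothetical number of volumes acquired per subject

# Alternating block design: True while coherent ASL sentences are on screen,
# False during nonsense-sign blocks (10 volumes per block).
asl_block = np.tile(np.repeat([True, False], 10), n_timepoints // 20)

# Synthetic signal: noise everywhere, plus an extra response in a small set
# of "language" voxels whenever coherent sentences are shown.
signal = rng.normal(0.0, 1.0, size=(n_voxels, n_timepoints))
language_voxels = rng.choice(n_voxels, size=200, replace=False)
signal[np.ix_(language_voxels, np.flatnonzero(asl_block))] += 0.8

# Voxelwise two-sample t-test: is the signal reliably higher during ASL
# sentences than during nonsense signs?
t_vals, p_vals = stats.ttest_ind(signal[:, asl_block], signal[:, ~asl_block], axis=1)
activated = (t_vals > 0) & (p_vals < 0.001)

print(f"{activated.sum()} of {n_voxels} voxels respond more to coherent ASL")

A real analysis would also model the delayed hemodynamic response and correct for testing thousands of voxels at once; the point here is only that "activation" means a reliably stronger signal in one condition than in the other.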

In the hearing subjects who didn't know ASL, there was little activation. But in both the deaf subjects and the hearing children of deaf parents, watching the coherent ASL sentences activated the major language-processing regions of the left hemisphere, as well as portions of the right parietal and temporal lobes.

The researchers also tested the three groups' responses to reading English. Scans were performed while each subject read English sentences interspersed with strings of nonsense words.

In both groups of hearing subjects (for whom English was a "native language"), reading the sentences activated the same left-brain areas normally used in language processing. But in the deaf subjects who had acquired English late, "the pattern of activation is remarkably different," said Georgetown University neurologist Daphne Bavelier. "None of the classical [language] structures on the left" were activated. Instead, regions of the right parietal and temporal lobes lit up on the scans as the subjects read the English sentences.

The researchers aren't sure what that means. Corina said one possibility is that the deaf subjects were using their right-brain ASL skills to interpret written English words.

He said the team hopes to do additional studies comparing the subjects' responses to spoken language, as well as exploring further how the brains of bilingual or multilingual people handle different languages. "One wants to look at how much the established language areas [on the left side of the brain] participate in the acquisition of a new language," he said.
______________________
References:

Okie, Susan. 1996. "Language Areas Of Brain Develop Without Hearing; Study of Deaf Yields Surprises". Washington Post. Posted: November 26, 1996. Available online: http://www.princeton.edu/~browning/news/okie.html
