Sign language spotlights the brain’s ability to adapt

By Kalen Johnson. Mentored and edited by Robert Irion

The human brain is wired to communicate. Those with a full sense of hearing use lips, tongue and ears to navigate their interactions with others. The neural circuits in their brains have adapted to handle this constant barrage of structured sound. But deaf people rely on their arms, hands and faces to communicate via sign language. These lifelong motions dramatically alter the “cognitive architecture” of their brains, research has shown.

Three research projects on how signing shapes the brain have revealed new details about these differences, and have underscored how important it is for language scientists to study this mode of communication more deeply. A panel of linguists discussed their latest research during a Feb. 18 session at the annual meeting of the American Association for the Advancement of Science.

One notable example of distinct neural circuitry in hearing and deaf people involves the cochlear implant: a surgically implanted electronic device that stimulates the cochlear nerve directly. Geo Kartheiser, a postdoctoral researcher in the DeafxLab at the Rochester Institute of Technology, and his colleagues examined the brain's electrical activity in response to sounds and visual stimuli in two groups: six young adults who received a cochlear implant early, at an average age of less than 2 years, and five young adults who received their implants at an average age of nearly 14 years.

The results were striking. Those who received the implant earlier could depend on sounds to communicate. Their auditory cortex, the region of the brain that processes signals from the ear, had developed as it typically does in people born with full hearing. In contrast, those who received the implant later often had grown up with sign language as their primary language, relying on visual signals to communicate. As a result, their brains had developed differently: areas that usually process sound responded instead to visual triggers.

These patterns demonstrate a reconfiguring of neural connections that researchers call “deaf gain,” Kartheiser said. Lack of hearing during a child's development provides “different experiences which cause the brain to adapt differently, but still in a positive way, not in a negative, maladaptive way.”

Such studies can help parents and physicians determine the best way for each child to develop cognitively, Kartheiser said. However, he added, his team must broaden its research to address crucial questions: “Should we use cochlear implants? If so, when? Should we expose children to sign language? If so, when?” It is possible that doing both at early ages could help those born with hearing loss to have more freedom in sign and spoken language as adults, he said.

Just as learning a sign language can change connections in the brain, fluent signers can influence the language's structure through the particular signs they use over time. Sign linguist Lynn Hou of the University of California, Santa Barbara, has examined this process by studying dozens of videos posted online by signers for their peers to watch and comment upon.

“It represents the actual live, deaf community that we have among us,” Hou said.

She found that a recurring sign, “look at,” had two functions. One expresses the physical act of seeing a particular object or person. The other expresses a psychological reaction, or how the signer felt as they responded. Depending on which meaning they intended, signers used different signs immediately before or after “look at,” Hou discovered.

The results suggest that signers store chunks of language to access for later use—and that these language patterns have emerged across diverse signers, Hou said.

Sign languages may appear vastly different from spoken languages, but linguist Gaurav Mathur warned against that mindset. Sign language simply uses a different mode of communication, said Mathur, dean of the graduate school at Gallaudet University: “We have overlooked how modality affects language.”

Language components such as morphology, phonetics, syntax and semantics are all present in signed and spoken languages, Mathur noted. However, each type of language has its own constraints. Signers are limited by their anatomy and the area around their bodies, known as the gestural space—but they adapt, just as the brain does, to communicate efficiently.

For example, signing “1 day” or “9 days” involves a relatively simple compound sign using both hands and arms. However, signing “10 days” or more requires a different cognitive framework, because the sign adds another degree of motion. Simply adding the sign for “10” to the sign for “days” becomes too complex a movement for signers to perform comfortably, as Mathur demonstrated on screen.

Both native and non-native signers devise ways to avoid complex or awkward signing motions. “One person and their grammar is not the same as another person’s grammar,” Mathur said.

The panelists agreed that the modality of sign language changes the cognitive processes in signers—as well as the brain itself. In turn, signers intuitively construct and modify language as they use it.

“I think it’s important that we get back to that basic question,” Mathur emphasized: “What can sign language tell us about what’s happening cognitively?”

Kalen Johnson is completing a Ph.D. in biomedical science at Texas A&M University. She is a regular contributor to the Aggie Voice Graduate Student blog. She also is obsessed with physiology and how the body functions—especially in response to exercise. Email her at kjohns15@tamu.edu or visit the Aggie Voice Blog to read more.

Photo: Using sign language over a lifetime alters the brain's communication circuitry, researchers have shown. Credit: SHVETS production/Pexels

March 4, 2022
