Research from Northeastern University shows that, contrary to popular belief, language is not limited to speech. In a study published November 11, 2016, in the journal PNAS, Northeastern University investigator Iris Berent, PhD, explains that people also apply the rules of spoken language to sign language.

Language is not simply about hearing sounds or moving our mouths. When our brain is interpreting language, it projects abstract structure, and the modality (speech or sign) is secondary.

Iris Berent, PhD

“There is a misconception in the general public that sign language is not really a language,” said Berent. “Part of our mandate, through the support of the NSF, is to reveal the complex structure of sign language, and in so doing, disabuse the public of this notion.” To come to this conclusion, Berent’s lab studied words and signs that shared the same general structure. They found that people reacted to this structure in the same way, irrespective of whether they were presented with speech or signs.

In the study, Berent and the research team examined words and signs with doubling, that is, forms that show full or partial repetition. She found that responses to these forms shift depending on their linguistic context. When a word was presented by itself (or as a name for just one object), people avoided doubling. But when doubling signaled a systematic change in meaning, participants preferred it.

Berent next asked what happens when people see doubling in signs (signs with two identical syllables). The study subjects were English speakers who had no knowledge of a sign language. To Berent’s surprise, these subjects responded to signs in the same way they responded to words. They disliked doubling for singular objects, but they systematically preferred it if (and only if) doubling signaled plurality. Hebrew speakers showed this preference when doubling signaled a diminutive, in line with the structure of their language.

“It’s not about the stimulus, it’s really about the mind, and specifically about the language system,” said Berent. “These results suggest that our knowledge of language is abstract and amodal. Human brains can grasp the structure of language regardless of whether it is presented in speech or in sign.”

Sign Language Is Language

There is an ongoing debate about what role sign language has played in language evolution, and whether the structure of sign language shares similarities with spoken language. Berent's study shows that our brain detects some deep similarities between speech and sign language. This allows English speakers, for example, to extend their knowledge of language to sign language.

“Sign language has a structure, and even if you examine it at the phonological level, where you would expect it to be completely different from spoken language, you can still find similarities,” Berent explained. “What’s even more remarkable is that our brain can extract some of this structure even when we have no knowledge of sign language. We can apply some of the rules of our spoken language phonology to signs.”

Berent says these findings show that our brains are built to handle very different types of linguistic input. The results from this study confirm what some scientists have long thought: language is language, no matter what form it takes. Berent believes this is a significant finding for the Deaf community because sign language is their “legacy.” According to Berent, sign language defines their identity, and we should all recognize its value. It is also significant to our human identity more generally, because language is what defines us as a species.

To further support these findings, Berent and her lab intend to examine how these rules apply to other languages; the recent study focused on English and Hebrew.

Source: Northeastern University, College of Science

Image credits: Northeastern University; Dreamstime