New research from Georgetown University Medical Center links motor skills and perception, specifically what the left and right brain hemispheres “hear.”
The study, recently presented at the annual meeting of the Society for Neuroscience, is the first to match human behavior with left-hemisphere and right-hemisphere auditory processing. Before this research, neuroimaging tests had only hinted at differences in such processing. The findings may eventually point to strategies to help stroke patients recover their language abilities and to improve speech recognition in children with dyslexia.
“Language is processed mainly in the left hemisphere, and some have suggested that this is because the left hemisphere specializes in analyzing very rapidly changing sounds,” says the study’s senior investigator, Peter E. Turkeltaub, MD, PhD, a neurologist in the Center for Brain Plasticity and Recovery, a joint program of Georgetown University and MedStar National Rehabilitation Network.
Turkeltaub and his team hid rapidly and slowly changing sounds in background noise and asked 24 volunteers to simply indicate whether they heard the sounds by pressing a button.
“We asked the subjects to respond to sounds hidden in background noise,” Turkeltaub explained. “Each subject was told to use his or her right hand to respond during the first 20 sounds, then the left hand for the next 20 sounds, then right, then left, and so on.”
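The alternating-hand schedule described above can be sketched in a few lines. This is purely an illustrative reconstruction, not the study's actual code; the function name, 0-indexed trial numbering, and 20-trial block size are assumptions drawn from the protocol description.

```python
# Illustrative sketch (not the study's code): which hand responds on a
# given trial, assuming 0-indexed trials grouped into 20-sound blocks
# that alternate right, left, right, left, ...

def response_hand(trial_index, block_size=20):
    """Right hand for the first block of sounds, left for the next,
    and so on, alternating every block_size trials."""
    return "right" if (trial_index // block_size) % 2 == 0 else "left"

# A 120-sound session splits evenly: 60 right-hand and 60 left-hand trials.
schedule = [response_hand(t) for t in range(120)]
print(schedule[0], schedule[20], schedule[119])  # right left left
```

Because the left hemisphere controls the right hand and vice versa, tallying detection rates separately for right-hand and left-hand blocks is what lets the researchers compare the two hemispheres' sensitivity to rapidly versus slowly changing sounds.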
He says when subjects were using their right hands, they heard the rapidly changing sounds more often than when they used their left hands, and vice versa for the slowly changing sounds.
“Since the left hemisphere controls the right hand and vice versa, these results demonstrate that the two hemispheres specialize in different kinds of sounds: the left hemisphere likes rapidly changing sounds, such as consonants, and the right hemisphere likes slowly changing sounds, such as syllables or intonation,” Turkeltaub explains.
“These results also demonstrate the interaction between motor systems and perception. It’s really pretty amazing. Imagine you’re waving an American flag while listening to one of the presidential candidates. The speech will actually sound slightly different to you depending on whether the flag is in your left hand or your right hand.”
Ultimately, Turkeltaub hopes that understanding the basic organization of auditory systems, and how they interact with motor systems, will help explain why language resides in the left hemisphere of the brain and will lead to new treatments for language disorders such as aphasia and dyslexia.
“If we can understand the basic brain organization for audition, this might ultimately lead to new treatments for people who have speech recognition problems due to stroke or other brain injury. Understanding better the specific roles of the two hemispheres in auditory processing will be a big step in that direction. If we find that people with aphasia, who typically have injuries to the left hemisphere, have difficulty recognizing speech because of problems with low-level auditory perception of rapidly changing sounds, maybe training the specific auditory processing deficits will improve their ability to recognize speech,” Turkeltaub concludes.