Neuroscience researchers from the University of Geneva (UNIGE) in Switzerland have confirmed that speech, whether produced or heard, generates electrical activity in neuronal circuits that can be measured in the form of cortical oscillations, or brain waves. The researchers explain in a new paper that, to understand speech and other cognitive or sensory processes, the brain breaks down the information it receives into cortical oscillations in order to integrate it and give it coherent meaning. Until now, researchers had been unable to confirm whether any of these cortical oscillations play an active role in speech processing. The new results confirm the significance of certain brain waves and show how they must synchronize to decipher spoken language.

Human Brain Waves

According to findings published in two journals, eLife and Frontiers in Human Neuroscience, the new study has elucidated the importance of brain waves, or cortical oscillations, which, when not properly produced, can be associated with language disorders. The discovery was made by Anne-Lise Giraud, PhD, and her Auditory Language Group at the Faculty of Medicine of the University of Geneva, where the team created a computerized model of neuronal microcircuits that highlights the crucial role of neuronal oscillations in decoding spoken language.

To precisely identify the neurobiological processes at work when the human brain hears speech, Dr Giraud's team at UNIGE worked with colleagues at the École Normale Supérieure in Paris to build a computerized model of neuronal microcircuits that replicates brain waves. The researchers modeled the two types of oscillations involved in speech processing: theta and gamma. Their objective was to determine whether the coupled theta and gamma waves observed in the auditory cortex are key to understanding and producing speech, or merely a byproduct of the electrical activity of the neurons recruited at that moment.
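
To make the modeling idea concrete, here is a minimal illustrative sketch, in Python, of how a slow theta rhythm can gate a fast gamma rhythm. It is not the team's published neuronal-microcircuit model; the sampling rate, band-centre frequencies, and coupling rule are all assumptions chosen for illustration.

```python
# Toy sketch of theta-gamma coupling, NOT the UNIGE spiking-network model.
import numpy as np

fs = 1000                        # assumed sampling rate (Hz)
t = np.arange(0, 2.0, 1 / fs)    # two seconds of simulated activity

f_theta, f_gamma = 5.0, 35.0     # assumed theta and gamma centre frequencies (Hz)
theta = np.sin(2 * np.pi * f_theta * t)

# Couple the rhythms: gamma amplitude is gated by theta phase, so the fast
# oscillation is strongest near theta peaks (phase-amplitude coupling).
gamma_envelope = 0.5 * (1.0 + theta)    # 0 at theta troughs, 1 at peaks
gamma = gamma_envelope * np.sin(2 * np.pi * f_gamma * t)

trace = theta + gamma                   # a toy stand-in for a cortical recording
```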

How Brain Waves Synchronize to Process Speech

Using a large corpus of sentences spoken by English speakers with a variety of paces and accents, the researchers observed that the coupled oscillations segmented words intelligently: they adapted to the pace of the speaker and correctly detected both the syllable boundaries and each syllable's identity. Theta waves flexibly followed the syllabic pace and synchronized the activity of gamma waves, which encoded phonemes, the smallest units of spoken language that distinguish one word from another.
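
In code, the syllable-tracking role described for theta waves can be approximated by following the slow (roughly 3-9 Hz) fluctuations of the speech amplitude envelope. The hypothetical `syllable_boundaries` helper below sketches this under stated assumptions; it is a simplification of envelope-based segmentation, not the segmentation algorithm used in the study.

```python
# Hypothetical envelope-based syllable segmentation, for illustration only.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert, find_peaks

def syllable_boundaries(speech, fs, lo=3.0, hi=9.0):
    """Return sample indices of candidate syllable boundaries (envelope troughs)."""
    envelope = np.abs(hilbert(speech))                 # broadband amplitude envelope
    sos = butter(2, [lo, hi], btype="band", fs=fs, output="sos")
    theta_env = sosfiltfilt(sos, envelope)             # keep theta-rate fluctuations
    # Troughs of the theta-rate envelope fall between syllable nuclei, so they
    # serve as rough boundary candidates; `distance` enforces a minimum gap.
    troughs, _ = find_peaks(-theta_env, distance=int(fs / hi))
    return troughs
```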

The researchers found that synchronizing these two oscillations is crucial to correctly understanding speech.

The researchers then tested what happens when the two oscillations are desynchronized, as may be the case in dyslexia or autism. They observed that people with dyslexia show an anomaly in the activity of the gamma waves, which perform phonemic segmentation. In dyslexia, the format of the mental representation does not match the universal phonemic format, so learning written language, which rests on pairing phonemes with letters, becomes difficult. In autism, speech information is not segmented at the right points, which blocks speech deciphering.

As outlined in their article, the researchers examined functional MRI and electroencephalographic recordings from thirteen people with autism and thirteen controls without specific disorders, and found that gamma and theta waves did not engage synergistically in the autism group: theta waves failed to track speech modulations and did not regulate the gamma oscillations that are essential for deciphering the detailed spoken content of words. The researchers say that language disorders, which affect many autistic people, could therefore be explained by an imbalance between slow and fast auditory oscillations, an anomaly that would prevent the interpretation of sensory information and compromise the ability to form coherent conceptual representations.
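
The article does not spell out the exact synchronization metric, but one standard way to quantify theta-gamma coupling in an EEG trace is the mean-vector-length modulation index of Canolty et al. (2006); the sketch below shows that measure as an assumed, illustrative analysis choice, not necessarily the one used in the study.

```python
# One conventional theta-gamma coupling measure (mean vector length);
# an assumed analysis choice, not necessarily the study's actual metric.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def bandpass(x, fs, lo, hi):
    sos = butter(2, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

def theta_gamma_coupling(eeg, fs):
    """Mean vector length of gamma amplitude at each theta phase (0 = no coupling)."""
    theta_phase = np.angle(hilbert(bandpass(eeg, fs, 4.0, 8.0)))   # slow-wave phase
    gamma_amp = np.abs(hilbert(bandpass(eeg, fs, 30.0, 45.0)))     # fast-wave amplitude
    # If gamma power clusters around a preferred theta phase, the complex
    # vectors add coherently and the resultant length is large.
    return np.abs(np.mean(gamma_amp * np.exp(1j * theta_phase)))
```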

The researchers also found that the greater the desynchronization, the more severe the verbal disorder. “Of course, autistic disorders are not summed up by the inability to decipher language,” said Dr Giraud. “But this strong correlation between oscillatory anomalies in the auditory cortex and the severity of autism highlights a malfunction of cortical microcircuits, which is certainly present elsewhere in the brain. The phenomenon is most probably symptomatic of a more general issue of segmenting and coding sensory information.”

The research team is now attempting to alter the rhythm of the abnormal oscillations in order to observe the consequences of this intervention on speech and other cognitive functions.

Source: Newswise; UNIGE

Photo credits: © Andreus; © Artellia | Dreamstime.com