Mutant Zebrafish May Help Discover Cause of Fraser Syndrome Hearing Loss

Reporting in the August issue of the journal Development, researchers at the University of Oregon’s Institute of Neuroscience have identified a potential developmental pathway worthy of more scrutiny in future research into Fraser syndrome, a rare, multifaceted recessive genetic disease. In humans, a mutation in the gene FRAS1, which plays a role in skin epithelial formation during early development, has been linked to Fraser syndrome. A comparable version of the gene, fras1, in zebrafish is required for stable skeletal formation.

In the study, researchers modeled craniofacial symptoms related to hearing loss in Fraser syndrome using mutant zebrafish, focusing on an endodermal pouch (known as p1), which in humans forms the Eustachian tube.

Using tissue labeling and time-lapse microscopy, the research team found “a previously unrecorded, late-forming portion of the first pharyngeal pouch in the zebra fish,” said lead author Jared Coffin Talbot, now a postdoctoral researcher at Ohio State University.

The newly observed component, the researchers wrote, is a fras1-dependent “endodermal outpocket,” referred to in the paper as late-p1. “If this homology can be taken as a guide, then endodermal pouching defects might underlie some ear defects in Fraser patients,” the researchers concluded. They proposed that in normal development, late-p1 holds apart skeletal elements that are found fused in fras1 mutants.

To test that idea, the researchers transplanted healthy epithelial tissue from wild-type zebrafish into fras1-mutant embryos. Doing so allowed for normal facial development in the mutants.

“To my knowledge, the connection between skeletal development and Fraser syndrome deafness has not yet been made in mammals,” Talbot said. “The literature has been largely mute as to why skeletal defects are found in fras1 mutants; this is what made it an interesting topic to study, but it does make a direct zebrafish-human connection more difficult to make. We believe that the middle ear and Eustachian tube are the homologous structures to what we’re studying in zebra fish.”

The scientists also found that zebrafish fras1 acts in tissues homologous to the human ear canal to sculpt tissues homologous to the human middle ear.


NIH: Brains of People Born Deaf Process Touch Differently

New research funded by the National Institutes of Health (NIH) shows that people who are born deaf process the sense of touch differently than people who are born with normal hearing. The finding reveals how the early loss of a sense, such as hearing, can affect brain development.

The study, published in The Journal of Neuroscience, was conducted by Christina M. Karns, PhD, a postdoctoral research associate in the Brain Development Lab at the University of Oregon, Eugene, and her colleagues.

According to the authors, deaf people use the auditory cortex to process touch stimuli and visual stimuli to a much greater degree than occurs in hearing people. The finding suggests that since the developing auditory cortex of profoundly deaf people is not exposed to sound stimuli, it adapts and takes on additional sensory processing tasks.

Previous research has shown that people who are born deaf are better at processing peripheral vision and motion. Deaf people may process vision using many different brain regions, especially auditory areas, including the primary auditory cortex. However, no one had examined whether vision and touch together are processed differently in deaf people.

Karns and her colleagues developed a unique apparatus that could be worn like headphones while subjects were in a magnetic resonance imaging (MRI) scanner. Flexible tubing, connected to a compressor in another room, delivered soundless puffs of air above the right eyebrow and to the cheek below the right eye. Next, visual stimuli in the form of brief pulses of light were delivered through fiber-optic cables mounted directly below the air-puff nozzle. Functional MRI was then used to measure reactions to the stimuli in Heschl’s gyrus, the site of the primary auditory cortex in the human brain’s temporal lobe, as well as other brain areas.

The researchers took advantage of a well-known perceptual illusion in hearing people, the auditory-induced double flash, in which a single flash of light paired with two or more brief auditory events is perceived as multiple flashes of light. They replaced the auditory stimulus with a double puff of air as a tactile stimulus but kept the single flash of light. Subjects were also exposed to tactile stimuli and light stimuli separately, as well as time periods without stimuli, to establish a baseline for brain activity.

Hearing people exposed to two puffs of air and one flash of light reported seeing only a single flash. However, when exposed to the same mix of stimuli, the subjects who were deaf saw two flashes. Looking at the brain scans of those who saw the double flash, the scientists observed much greater activity in Heschl’s gyrus, although not all deaf brains responded to the same degree. The deaf individuals with the highest levels of activity in the primary auditory cortex in response to touch also had the strongest response to the illusion.

“We designed this study because we thought that touch and vision might have stronger interactions in the auditory cortices of deaf people,” said Karns. “As it turns out, the primary auditory cortex in people who are profoundly deaf focuses on touch, even more than vision, in our experiment.”

There are several ways the finding may help deaf people. If touch and vision interact more strongly in the deaf, touch could be used to help deaf students learn math or reading. The finding also has the potential to help clinicians improve the quality of hearing after cochlear implantation, especially among congenitally deaf children implanted after the age of 3 or 4. These children, who have lacked auditory input since birth, may struggle with comprehension and speech because their auditory cortex has taken on the processing of other senses, such as touch and vision. These changes may make it more challenging for the auditory cortex to recover auditory processing function after cochlear implantation. Being able to measure how much of the auditory cortex has been taken over by other sensory processing could inform intervention programs that would help the brain retrain and devote more capacity to auditory processing.


Studies on Deaf Children May Help Decode Dyslexia

Doctors have known for years that people with dyslexia process information differently than others, such as seeing words with transposed letters. Now, researchers at Ohio State University Wexner Medical Center have found evidence that people with dyslexia might hear language differently as well.

“Any sort of language problem could very well have its roots in perception itself,” said Susan Nittrouer, PhD, of the Ohio State University Wexner Medical Center. “To many people, the solution seems to be, ‘Well, we will just train those with dyslexia how to recognize words correctly.’ But the problem is really more fundamental than that.”

Nittrouer said she began to suspect the role hearing might play in dyslexia after a nearly decade-long study involving children who were born deaf or with profound hearing loss. “We began following this group of over 100 children, basically, since they were infants,” Nittrouer said. All the children in the study received cochlear implants, which use microphones mounted just behind the ears to capture sound and deliver it as electrical signals to nerves near the brain.

Through consistent testing, researchers found that the implants made a remarkable difference in a child’s ability to hear, but they also raised some questions.

“Cochlear implants have been able to help children who are deaf basically function as hearing children do,” said Nittrouer. “However, once you begin to scratch the surface, you often find that children who have cochlear implants function similarly to how children who have dyslexia function.”

Nittrouer says that’s important because it points to the role of hearing in dyslexia. “Given that they look so much like children with dyslexia,” said Nittrouer, “we can really connect the dots between their perception, the kind of signal that they’re getting, and the sort of language problem that results.”

Even more encouraging is that researchers say they spotted problems in these children long before they would have been obvious in dyslexic children who have normal hearing.

“We were able to identify these emerging problems in these children at kindergarten,” Nittrouer said. “If they didn’t have cochlear implants and weren’t in this project, in all likelihood, these problems learning to read would not have shown up until they were in third grade.”

Nittrouer says more research needs to be done, but her findings could eventually lead to tests that could be administered earlier to predict who might be at risk for dyslexia.

“If, indeed, it turns out that these children have broader perceptual issues, then we need to begin to put together a broader intervention approach,” said Nittrouer, “one that does not involve just pulling the children out of the classroom for 20 minutes of tutoring a few times a week, but rather a program that involves all educators for the child’s entire day at school.”