
Research Roundup

French Horn Players at Significant Risk of Noise-Induced Hearing Loss

Professional French horn players may need to seriously consider adopting effective strategies to prevent noise-induced hearing loss (NIHL). A new study published online in the Journal of Occupational and Environmental Hygiene found further evidence that French horn players are among the professional orchestral musicians at greatest risk of developing NIHL.

“Using both conservative and lenient criteria for hearing loss and correcting for age, we found that between 11% and 22% of the participants showed some form of hearing loss typical of NIHL,” said study investigator Ian O’Brien, MPhil, MAudSA, CCP, a doctoral degree candidate at the University of Sydney and a professional French horn player. “Looking at those aged 40 years or younger and also correcting for age, the number of horn players with an apparent hearing loss rose to between 17% and 33%.”

The study, conducted by researchers from the University of Queensland and the University of Sydney at the 2010 annual gathering of the International Horn Society in Brisbane, Australia, examined the hearing of 144 French horn players. The investigators performed audiometric assessments to measure hearing thresholds and also measured sound exposure levels.

O’Brien and his colleagues also administered a questionnaire to investigate the horn players’ safety practices and attitudes about hearing conservation. “We were surprised to find that only 18% of participants reported using any form of hearing protection,” said lead investigator Wayne Wilson, PhD, MAudSA, CCP. “Even within that 18%, the use of hearing protection appears to be inadequate, with 81% of these participants reporting their frequency of use as ‘sometimes’ and 50% reporting they use generic, foam, or other inferior forms of protection.”

The NIDCD recommends preventing NIHL by regularly using hearing protectors, such as earplugs or earmuffs. The Hearing Review has published many articles on this topic, including recent articles about music faculties’ views on hearing protection, seminars for collegiate musicians, and two special editions on music and hearing loss.

“Our findings also reinforce the need to educate horn players, their mentors, and audiologists about the need to protect hearing and how best to achieve this while still enabling musicians to play to the highest level,” said O’Brien. “Even mild hearing loss can result in difficulties discriminating pitch, abnormal loudness growth, and tinnitus, all of which can affect a musician’s ability to perform, subsequently jeopardizing his or her livelihood.”

The Bureau of Labor Statistics reports that there have been nearly 125,000 cases of permanent hearing loss in workers since 2004. In addition to hearing loss, exposure to high levels of noise can result in physical and psychological stress, reduced productivity, poor communication, and accidents and injuries caused by a worker’s inability to hear warning signals.

According to Torey Nalbone, PhD, CIH, associate professor at the University of Texas at Tyler, and an AIHA noise exposure expert, “Traditionally, we have examined rock and roll artists and their hearing loss, but few think of the hearing loss experienced by symphonic orchestra players. The presence of loss of hearing acuity in the ranges documented in this study demonstrates that orchestral musicians should take a more active role in conserving their hearing…The appropriate use of hearing protection devices can and will reduce the incidence of NIHL. This could be an important attitude and habit to change for these horn players and others in an orchestral setting, especially when they depend on their hearing for a major portion of their success during performances.”

We Like Our Own Voices

It turns out we really do like the sound of our own voice; we just may not realize it. A new study by Albright College (Reading, Pa) associate professor of psychology Susan Hughes, PhD, found that people unknowingly rated their own recorded voices as more attractive than other listeners rated them, which is considered a form of unconscious self-enhancement. The article suggests that participants also may have preferred their own voices due to a mere-exposure effect, that is, the tendency to like the familiar. According to the study, this effect may have been a factor even when participants were not overtly aware they were hearing their own voice. The paper, “I Like My Voice Better: Self-Enhancement Bias in Perceptions of Voice Attractiveness,” appears in the October issue of Perception.

Small Group of Neurons Provides Basis for Higher Localization Processing in Brain

As Baby Boomers age, many experience difficulty hearing and understanding conversations in noisy environments such as restaurants. People who are hearing-impaired and wear hearing aids or cochlear implants are even more severely affected. Researchers know that the ability to locate the source of a sound with ease is vital to hearing well in these situations, but a deeper understanding of how hearing works is needed to design devices that perform better in noisy environments.

Researchers from the Eaton-Peabody Laboratories of the Massachusetts Eye and Ear, Harvard Medical School, and the Research Laboratory of Electronics, Massachusetts Institute of Technology, have gained new insight into how the brain localizes sound. Their research paper, “Decoding Sound Source Location and Separation Using Neural Population Activity Patterns,” was published in the October 2, 2013, issue of the Journal of Neuroscience.

“Most people are able to locate the source of a sound with ease, for example, a snapping twig on the left, or a honking horn on the right. However, this is actually a difficult problem for the brain to solve,” said Mitchell L. Day, PhD, an investigator in the Eaton-Peabody Laboratories at Massachusetts Eye and Ear and an instructor of Otology and Laryngology at Harvard Medical School. “The higher levels of the brain that decide the direction a sound is coming from do not have access to the actual sound, but only the representation of that sound in the electrical activity of neurons at lower levels in the brain. How higher levels of the brain use information contained in the electrical activity of these lower-level neurons to create the perception of sound location is not known.”

In the experiment, researchers recorded the electrical activity of individual neurons in an essential lower-level auditory brain area called the inferior colliculus (IC) while an animal listened to sounds coming from different directions. They found that the location of a sound source could be accurately predicted from the pattern of activation across a population of fewer than 100 IC neurons; in other words, a particular pattern of IC activation indicated a particular location in space. The researchers further found that the pattern of IC activation could correctly distinguish whether a single sound source was present or two sources were coming from different directions; that is, the pattern of IC activation could segregate concurrent sources.
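The general idea of pattern decoding can be sketched in a few lines of code. The Python toy below is purely illustrative and is not the authors’ actual analysis: the Gaussian tuning curves, neuron count, and nearest-template decoding rule are all invented assumptions, but they show how a firing-rate pattern across a small neural population can pinpoint a source’s direction.

```python
# Illustrative sketch only: a toy population-pattern decoder, not the
# published analysis. A pattern of firing rates across ~100 model "IC
# neurons" is matched against templates to predict a source's azimuth.
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 100                      # size of the recorded population
azimuths = np.arange(-90, 91, 15)    # candidate source directions (degrees)

# Hypothetical tuning: each neuron's mean firing rate varies smoothly
# with azimuth (Gaussian tuning curve, random preferred direction).
preferred = rng.uniform(-90, 90, n_neurons)
gain = rng.uniform(20, 60, n_neurons)

def mean_rates(az):
    """Mean firing rate of each neuron for a source at azimuth `az`."""
    return gain * np.exp(-0.5 * ((az - preferred) / 30.0) ** 2)

# Templates: the expected activation pattern for each candidate azimuth.
templates = np.stack([mean_rates(az) for az in azimuths])

def decode(pattern):
    """Return the azimuth whose template best matches the observed pattern."""
    return azimuths[np.argmin(np.linalg.norm(templates - pattern, axis=1))]

# Simulate one noisy trial from a source at +30 degrees and decode it.
true_az = 30
trial = rng.poisson(mean_rates(true_az))   # spike counts vary trial to trial
print("true:", true_az, "decoded:", decode(trial))
```

Even with Poisson trial-to-trial variability, the decoded direction usually matches the true one, which is the intuition behind reading location off a modest population of neurons.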

“Our results show that higher levels of the brain may be able to accurately segregate and localize sound sources based on the detection of patterns in a relatively small population of IC neurons,” said Dr Day. “We hope to learn more so that someday we can design devices that work better in noisy environments.”

The research was funded by the National Institute on Deafness and Other Communication Disorders (NIDCD) grants R01 DC002258 and P30 DC005209. The paper was coauthored by Mitchell L. Day and Bertrand Delgutte. Source: Mass Eye and Ear Institute. HR thanks Mary Leach of MEEI/Harvard for her assistance.

Study Looks at Human Echolocation Skills

Some blind people can learn to navigate using the echoes of sounds they themselves make. Biologists at the Ludwig-Maximilians Universität in Munich, Germany, led by Lutz Wiegrebe of the Department of Neurobiology, have now shown that sighted people also can learn to echolocate objects in space, as reported in an article published online on August 28 in the Proceedings of the Royal Society B.

Wiegrebe and his team developed a method for training people in echolocation. With the help of a headset consisting of a microphone and a pair of earphones, experimental subjects generate patterns of echoes that simulate acoustic reflections in a virtual space: the participants emit vocal clicks, which are picked up by the microphone and passed to a processor that calculates the echoes of a virtual space within milliseconds. The resulting echoes are then played back through the earphones. The trick is that the transformation applied to the input depends on the subject’s position in virtual space. So the subject can learn to associate the artificial “echoes” with the distribution of sound-reflecting surfaces in the simulated space.
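As a rough illustration of that position-dependent transformation, here is a minimal Python sketch in which every detail is an assumption made for simplicity: a single reflecting wall, a listener who is also the click source, and a bare 1/r amplitude falloff. The real apparatus renders a full virtual room within milliseconds, but the principle is the same: the echo’s delay and level encode the geometry around the listener’s current position.

```python
# Minimal sketch of a position-dependent echo transformation. All numbers
# (room geometry, sample rate, speed of sound) are illustrative assumptions.
import numpy as np

FS = 44100          # sample rate (Hz)
C = 343.0           # speed of sound (m/s)

def echo_impulse_response(listener_x, wall_x=5.0, length=0.1):
    """First-order echo off a single wall at `wall_x` metres, for a
    listener (who is also the click source) at `listener_x` metres."""
    h = np.zeros(int(FS * length))
    distance = 2.0 * (wall_x - listener_x)     # out to the wall and back
    delay = int(FS * distance / C)             # echo delay in samples
    h[delay] = 1.0 / max(distance, 1e-6)       # simple 1/r amplitude falloff
    return h

def render_echo(click, listener_x):
    """What the headset plays back: the recorded click convolved with the
    impulse response of the listener's current position in virtual space."""
    return np.convolve(click, echo_impulse_response(listener_x))

click = np.r_[1.0, -0.5, np.zeros(50)]         # a crude stand-in for a vocal click
near = render_echo(click, listener_x=4.0)      # 1 m from the wall
far = render_echo(click, listener_x=1.0)       # 4 m from the wall
print("echo arrives at sample", near.argmax(), "vs", far.argmax())
```

Moving the listener changes which impulse response is applied, so a trained subject can learn to hear the delay and loudness differences as distance to the reflecting surface.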

“After several weeks of training,” says Wiegrebe, “the participants in the experiment were able to locate the sources of echoes pretty well. This shows that anyone can learn to analyze the echoes of acoustic signals to obtain information about the space around him. Sighted people have this ability too; they simply don’t need to use it in everyday situations. Instead, the auditory system actively suppresses the perception of echoes, allowing us to focus on the primary acoustic signal, independently of how the space alters the signals on its way to the ears.” This makes it easier to distinguish between different sound sources, allowing us to concentrate on what someone is saying to us, for example.

The new study shows, however, that it is possible to functionally invert this suppression of echoes, and learn to use the information they contain for echolocation instead.

In the absence of visual information, we and most other mammals find navigation difficult. So it is not surprising that evolution has endowed many mammalian species with the ability to “read” reflected sound waves. Bats and toothed whales, which orient themselves in space primarily by means of acoustic signals, are the best known.

Wiegrebe and his colleagues are now exploring how the coordination of self-motion and echolocation facilitates sonar-guided orientation and navigation in humans.