Mapping HRTFs: Acoustic Cues for Three-Dimensional Spatial Hearing Across Hearing Aid Styles | Hearing Review, September 2014

By Sridhar Kalluri, PhD, and Simon Carlile, PhD

Head-related transfer functions, or HRTFs, represent the location-dependent acoustic information about sound sources that the brain uses to localize them. This article describes extensive sets of head-related transfer functions, recently made available to researchers, developers, and interested clinicians, for different styles of hearing aids. It is hoped that the new data set will be valuable to designers of hearing aids and signal processing algorithms.

While visual localization of objects in space is based in large part on which part of the retina is stimulated, there is no such direct correspondence between the location of a sound source and the activation of the sensory organ. Instead, the localization of a sound source is based on the encoding and interpretation by the brain of complex location-dependent acoustic variations created by the physical interaction of sound waves with the body and head.1 These directionally dependent physical acoustic cues enable spatial hearing—that is, a perceptual sense of space in the listening experience.


Spatial hearing is a critically important element of functioning effectively in daily life. It enables one to locate sound sources, which, for example, facilitates navigating a busy world where safety can depend on correctly locating a moving bus or a traffic signal. Moreover, and perhaps more importantly, spatial hearing is a critical element of effective communication in noisy environments, where the ability to locate sound sources can help the brain focus attention on a desired sound source while inhibiting undesired sound sources.

Spatial hearing confers a sense of awareness and presence, aspects of hearing that enable flexibility and fluidity in adjusting listening goals in complex, busy environments. Additional benefits come from enrichment of the auditory experience of scenarios that are important to most listeners, such as live music and movies.

Hearing Aids, the Body, and Spatial Hearing

Hearing aids can affect the acoustic cues underlying spatial hearing. A quantitative understanding of how spatial hearing cues are modified is critical for ensuring that hearing aid users get the full range of benefits made possible by spatial hearing.

While the signal processing performed by a hearing aid is an important determinant of how spatial cues are modified by a hearing aid, another major factor is the geometry of the hearing aid case and concomitant placement of the hearing aid microphone. Location-dependent acoustic variations emerge from the physical interaction—scattering, interference, and diffraction—of sound waves with the body, head, and ear. The hearing-aid case modifies these wave-interaction effects, and the placement of the microphone affects how these wave-interaction effects are sampled and conveyed to the listener’s auditory system.


Head and pinnae effects. The wave-interaction effects occur when the physical obstacle has dimensions comparable to or larger than the wavelength of the sound. Accordingly, the head and body create between-ear differences in level and time that vary strongly with the direction of the sound source in the horizontal plane. These relatively large elements of the body do not create cues for distinguishing the direction of sound sources in the vertical plane (front-back, up-down) because the head is generally symmetric about the axis formed by a line between the ears.

In contrast, the asymmetric geometry of the much smaller pinnae creates cues for distinguishing sound sources in the vertical plane. Because the geometric features of the pinnae are small, these cues occur primarily at high frequencies.
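
To make the between-ear cues concrete, the minimal sketch below estimates the interaural time and level differences from the impulse responses measured at the two ears for one source location. The file names, sampling rate, and storage format are illustrative assumptions, not part of the published data set.

```python
import numpy as np

fs = 48000                                   # assumed sampling rate (Hz)
hrir_left = np.load("hrir_left.npy")         # assumed: impulse response at the left ear
hrir_right = np.load("hrir_right.npy")       # assumed: impulse response at the right ear

# Interaural time difference (ITD): lag of the peak of the cross-correlation
# between the two impulse responses, converted to microseconds.
xcorr = np.correlate(hrir_left, hrir_right, mode="full")
lag = np.argmax(np.abs(xcorr)) - (len(hrir_right) - 1)
itd_us = 1e6 * lag / fs

# Interaural level difference (ILD): broadband energy ratio in decibels.
ild_db = 10 * np.log10(np.sum(hrir_left ** 2) / np.sum(hrir_right ** 2))

print(f"ITD ~ {itd_us:.0f} microseconds, ILD ~ {ild_db:.1f} dB")
```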

Toward an Understanding of Wave Interactions and HRTFs

The wave interactions created by the pinnae are quite complex, particularly at high frequencies where pinna geometries are intricate. Physical models that accurately predict acoustic cues from geometric features have not matured, both because they are computationally expensive and because empirical measurement techniques offer a practical alternative for characterizing the cues.


Figure 1. The head-related transfer function (HRTF) is the frequency response of the ear for sound presented from a loudspeaker; the example shown here is the HRTF measured at the entrance to the ear canal for a loudspeaker directly in front of the listener.

For example, a microphone placed at the entrance of the ear canal can measure the filtering function of the ear (ie, the head-related transfer function, or HRTF, shown in Figure 1). Because the ear acts as a linear system, any sound can be filtered with the HRTF to estimate the spectrum that actually reaches the listener. An HRTF set—the aggregation of HRTFs from every direction around the head—summarizes all the location-dependent variation of an acoustic signal. Systematic location-dependent variations of the HRTFs in a set identify the acoustic cues available to human listeners for localizing sounds (Figure 2).
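
As a concrete illustration of this linear-filtering idea, the sketch below convolves a test signal with left- and right-ear head-related impulse responses (the time-domain counterpart of the HRTF) to estimate the signals arriving at the two ears. Played over headphones, the two-channel result carries the spatial cues of the measured location. The file names and sampling rate are assumptions about how such measurements might be stored, not the data set's actual format.

```python
import numpy as np
from scipy.signal import fftconvolve

fs = 48000                                   # assumed sampling rate (Hz)
source = np.random.randn(fs)                 # 1 second of white noise as a test signal

# Assumed file names: left- and right-ear impulse responses for one location.
hrir_left = np.load("hrir_front_left.npy")
hrir_right = np.load("hrir_front_right.npy")

# Convolving the source with each ear's impulse response applies the
# location-dependent filtering described by the HRTF.
left = fftconvolve(source, hrir_left, mode="full")
right = fftconvolve(source, hrir_right, mode="full")

# Two-channel signal suitable for headphone playback.
binaural = np.stack([left, right], axis=1)
```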

HRTF Sets for Five Hearing Aid Styles and Reference


Figure 2. A set of HRTFs captures the location-dependent variations of the ear’s frequency response. Shown here is a portion of a set of HRTFs measured at the entrance to the ear canal, for loudspeaker locations on the midline separating the left and right sides of the body (labels identify some of the loudspeaker positions). The locations are arrayed in an arc running from in front of and below the head, to straight ahead, to above the head, to directly behind the head. Systematic changes in the frequency response, for example the frequency of a notch in the 4-10 kHz range, are evident as the location of the loudspeaker changes.
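
One simple way to quantify the cue highlighted in the caption is to read off the frequency of the deepest dip in the 4-10 kHz band of each measured response. The sketch below shows such a calculation; the assumption that each HRTF is available as a magnitude spectrum on a known frequency axis, and the synthetic example response, are purely illustrative.

```python
import numpy as np

def notch_frequency_hz(freqs_hz, magnitude_db, lo=4000.0, hi=10000.0):
    """Return the frequency of the deepest dip in the band [lo, hi]."""
    band = (freqs_hz >= lo) & (freqs_hz <= hi)
    return freqs_hz[band][np.argmin(magnitude_db[band])]

# Illustrative use with a synthetic response: flat except for a notch near 7 kHz.
freqs = np.linspace(0.0, 24000.0, 1024)
mag_db = -20.0 * np.exp(-((freqs - 7000.0) / 400.0) ** 2)
print(notch_frequency_hz(freqs, mag_db))     # approximately 7000 Hz
```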

Recently, the authors and their colleagues created and made available for general use the HRTF sets for five different hearing aid styles:

  • Behind-the-ear (BTE);
  • In-the-ear (ITE);
  • In-the-canal (ITC);
  • Completely-in-the-canal (CIC); and
  • Invisible-in-the-canal (IIC).

The data set also includes, for reference, an HRTF set for the blocked-meatus configuration that is standard in non-hearing-aid applications.2

The HRTF sets were measured in a collaborative research project between Starkey Hearing Technologies, VAST Audio, and the University of Sydney. A peer-reviewed article3 in the Journal of the Acoustical Society of America describes the measurement procedures and the technical details of the data set.

Given the sometimes large variations in HRTFs for small changes in sound-source location, especially at high frequencies, a dense sampling of locations is valuable in HRTF sets. The HRTF sets reported here were recorded for 393 loudspeaker locations distributed around the head at a fine spacing of approximately 10°; only locations more than 45° below the horizontal plane were not sampled. The measurements were made with a head-and-torso simulator (HATS) mannequin, combined with an impression model of the original HATS pinna and an impression model of the ear canal of a human subject.
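
A common way to use such a densely sampled set is to select, for any desired source direction, the measured location closest to it on the sphere. The sketch below illustrates that lookup with a synthetic grid of directions standing in for the measured positions; the coordinate convention and grid spacing are assumptions for illustration, and the actual measurement positions are documented in the JASA article.3

```python
import numpy as np

def to_unit_vector(azimuth_deg, elevation_deg):
    """Convert an azimuth/elevation pair in degrees to a unit direction vector."""
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    return np.array([np.cos(el) * np.cos(az), np.cos(el) * np.sin(az), np.sin(el)])

def nearest_measured_index(measured_az_el, target_az_el):
    """Index of the measured direction with the smallest great-circle angle to the target."""
    target = to_unit_vector(*target_az_el)
    vectors = np.array([to_unit_vector(az, el) for az, el in measured_az_el])
    angles = np.arccos(np.clip(vectors @ target, -1.0, 1.0))
    return int(np.argmin(angles))

# Synthetic ~10-degree grid above -45 degrees elevation, mimicking the coverage
# described in the text (not the data set's actual measurement positions).
grid = [(az, el) for el in range(-40, 91, 10) for az in range(0, 360, 10)]
idx = nearest_measured_index(grid, (33.0, 12.0))
print(grid[idx])     # grid point closest to azimuth 33 degrees, elevation 12 degrees
```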

We are making this data set available at http://www.starkeyevidence.com in the Research Resources section, where we expect it to be valuable in many contexts. Examples of the type of problem for which this data set will be useful are the optimization of microphone placement and the design of directional-filtering algorithms for the different hearing aid styles, as sketched below.
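
As one illustration of the second problem class, the sketch below implements a first-order delay-and-subtract (differential) directional filter of the kind used with closely spaced hearing aid microphones. The microphone spacing, sample rate, and test signals are assumptions; a real design would be evaluated against the HRTF sets for the chosen style and microphone placement rather than built from this sketch.

```python
import numpy as np
from scipy.signal import fftconvolve

fs = 16000                    # assumed sample rate (Hz)
d = 0.012                     # assumed front-to-rear microphone spacing (m)
c = 343.0                     # speed of sound (m/s)
delay_samples = fs * d / c    # internal delay giving a rear-facing null (cardioid-like pattern)

def fractional_delay_filter(delay, taps=31):
    """Windowed-sinc FIR approximating a fractional-sample delay."""
    n = np.arange(taps) - (taps - 1) / 2
    h = np.sinc(n - delay) * np.hamming(taps)
    return h / np.sum(h)

def differential_output(front_mic, rear_mic):
    """Delay the rear microphone signal and subtract it from the front microphone signal."""
    delayed_rear = fftconvolve(rear_mic, fractional_delay_filter(delay_samples), mode="same")
    return front_mic - delayed_rear

# Illustrative use with noise standing in for the two microphone signals.
front = np.random.randn(fs)
rear = np.random.randn(fs)
out = differential_output(front, rear)
```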

References

1. Blauert J. Spatial Hearing. Cambridge, Mass: MIT Press; 1997.

2. Møller H. Fundamentals of binaural technology. Applied Acoustics. 1992;36(3):171-218.

3. Durin V, Carlile S, Guillon P, Best V, Kalluri S. Acoustic analysis of the directional information captured by five different hearing aid styles. J Acoust Soc Am. 2014;136:818-828.

Original citation for this article: Kalluri S, Carlile S. Mapping HRTFs: Acoustic cues for three-dimensional spatial hearing across hearing aid styles. Hearing Review. 2014;21(9):34-37.