Screenshot of Apple AirPods Pro. Source: Apple

During its annual Worldwide Developers Conference (WWDC) on June 7, Apple announced “Conversation Boost,” a new speech-in-noise enhancement feature in its upcoming iOS 15 that uses computational audio and the beamforming microphones of its AirPods Pro earbuds. iOS 15 is scheduled for release in September 2021 to coincide with the new iPhone models.

“Conversation Boost is designed to help people with mild to moderate hearing challenges stay more connected in conversations,” says Gagan Gupta, Apple senior engineering program manager, in the WWDC21 YouTube video (at approx 34:49). “Through computational audio and beamforming microphones, Conversation Boost focuses your AirPods Pro on the person talking in front of you, making it easier to hear and follow along in a face-to-face conversation. And to help you hear the conversations even better, you can also reduce the amount of ambient noise [via the app].”

Screenshot of “Ambient Noise Reduction” control in AirPods Pro app. Source: Apple
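
Apple has not published the details of how Conversation Boost works, but the general principle behind beamforming microphones can be illustrated with a minimal delay-and-sum sketch: each microphone’s signal is delayed so that sound arriving from the target direction lines up in phase across channels, and the channels are then summed, reinforcing on-axis speech while off-axis noise combines incoherently. Everything below (function name, mic spacing, parameters) is illustrative, not Apple’s implementation.

```swift
import Foundation

/// Minimal delay-and-sum beamformer (illustrative sketch only; not Apple's
/// implementation). Each microphone channel is delayed so that sound from
/// the steering direction arrives in phase across channels, then the
/// channels are averaged: on-axis speech adds up, off-axis noise does not.
func delayAndSum(channels: [[Float]],
                 micSpacing: Double = 0.02,    // assumed 2 cm mic spacing
                 sampleRate: Double = 48_000,
                 steeringAngle: Double = 0.0   // radians; 0 = straight ahead
) -> [Float] {
    let speedOfSound = 343.0 // m/s in air
    let n = channels.first?.count ?? 0
    var output = [Float](repeating: 0, count: n)

    for (m, channel) in channels.enumerated() {
        // Extra distance sound from the steering direction travels to
        // mic m relative to mic 0, converted to a whole-sample delay.
        let delaySeconds = Double(m) * micSpacing * sin(steeringAngle) / speedOfSound
        let delaySamples = Int((delaySeconds * sampleRate).rounded())
        for i in 0..<n {
            let j = i - delaySamples
            if j >= 0 && j < channel.count {
                output[i] += channel[j]   // aligned channels reinforce
            }
        }
    }
    let scale = 1 / Float(channels.count)
    return output.map { $0 * scale }      // average the summed channels
}
```

A production system would use fractional delays, adaptive weights, and frequency-domain processing, but the time-alignment idea is the same.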

AirPods Pro currently (iOS 14) have three noise-control modes: Active Noise Cancellation, Transparency mode, and Off. Active Noise Cancellation uses outward-facing microphones to detect external sounds and cancel them through phase cancellation, while Transparency mode lets outside sounds in so you can hear what’s going on around you. Both features work best when the earbuds have a snug (closed) fit. Apple’s Live Listen feature, introduced in 2014 and available for both hearing aid and cochlear implant users, uses the iPhone’s microphone to relay sound to the AirPods or hearing devices, albeit with potential latency and greater power consumption due to streaming.
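
The phase-cancellation idea itself is simple to demonstrate: the earbud generates an “anti-noise” signal that is the external sound inverted in phase, so the two superpose to near silence at the ear. The toy sketch below shows only that superposition step; a real ANC system must capture the noise with the outward-facing microphones, model the acoustic path to the eardrum, and adapt continuously, none of which is shown here.

```swift
import Foundation

// Toy phase-cancellation demo: a 200 Hz "hum" and its phase-inverted
// twin sum to (near) silence. Real ANC must measure noise with the
// outward-facing mics, model the acoustic path, and adapt in real time.
let sampleRate = 48_000.0
let noise: [Float] = (0..<480).map { i in
    Float(sin(2.0 * Double.pi * 200.0 * Double(i) / sampleRate))
}

// Anti-noise: the same waveform, 180 degrees out of phase.
let antiNoise = noise.map { -$0 }

// Superposition at the eardrum: the two waves cancel sample by sample.
let residual = zip(noise, antiNoise).map { $0 + $1 }
print(residual.allSatisfy { abs($0) < 1e-6 }) // true: near-total cancellation
```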

The WWDC is Apple’s annual summary of updates for software engineers who are creating new functions for iOS-compatible apps and devices. At last year’s WWDC20, Apple introduced iOS 14, which included what many in the industry saw as an important move toward the AirPods Pro offering more hearing aid-type features. As reported by Hearing Review last June, Apple launched a new accessibility feature called Headphone Accommodations “designed to amplify soft sounds and adjust certain frequencies for an individual’s hearing, to help music, movies, phone calls, and podcasts sound more crisp and clear. Headphone Accommodations also supports Transparency mode on AirPods Pro, making quiet voices more audible and tuning the sounds of your environment to your hearing needs.” In 2018, Apple incorporated a dBHL Tone Audiometry feature that replicates the Hughson-Westlake method for determining an Apple device user’s hearing threshold (see the WWDC18 YouTube video at approx 11:00, and the September 2018 HR article by Jerger on automated audiometry).
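
The Hughson-Westlake method is a modified ascending staircase: the tone level drops 10 dB after each response and rises 5 dB after each miss, and the threshold is taken as the lowest level at which the listener responds on repeated ascending presentations (commonly two out of three). Below is a bare-bones sketch of that logic, not Apple’s code; the `heard` closure is a hypothetical stand-in for presenting a tone and recording the listener’s response.

```swift
/// Bare-bones Hughson-Westlake ("down 10, up 5") staircase sketch.
/// Illustrative only; not Apple's implementation. Levels are in dB HL.
func hughsonWestlakeThreshold(startLevel: Int = 40,
                              heard: (Int) -> Bool) -> Int? {
    var level = startLevel
    var ascendingResponses: [Int: Int] = [:] // level -> responses on ascent
    var lastHeard = heard(level)

    for _ in 0..<50 { // safety cap on the number of trials
        if lastHeard {
            level -= 10          // down 10 dB after a response
        } else {
            level += 5           // up 5 dB after a miss
        }
        let response = heard(level)
        if response && !lastHeard {
            // A response on an ascending run counts toward threshold.
            ascendingResponses[level, default: 0] += 1
            if ascendingResponses[level, default: 0] >= 2 {
                return level     // repeated ascending responses: threshold
            }
        }
        lastHeard = response
    }
    return nil // no stable threshold found within the trial cap
}

// Example: a simulated listener whose true threshold is 25 dB HL.
let threshold = hughsonWestlakeThreshold { level in level >= 25 }
print(threshold as Any) // typically Optional(25)
```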

Additionally, Apple announced in May a number of other features “designed for people with mobility, vision, hearing, and cognitive disabilities.”

Considering the vast array of listening enhancement devices on the market, it is notable that the Hearing Review Consumer website had already ranked Apple AirPods Pro as one of the Top-10 DIY hearing options for people with mild or near-normal hearing loss even before the addition of the new Conversation Boost feature.

Karl Strom is editor in chief of The Hearing Review and has been reporting on hearing healthcare issues for over 25 years.