AirPods Pro Become Hearing Aids in iOS 14

Apple tackles hearing loss accessibility issue with "Headphone Accommodations" and AirPods Pro

By Abram Bailey, AuD

The long-awaited feature—Headphone Accommodations in Transparency Mode—is now available (as of the Sept 14 release of AirPods Pro firmware version 3A283). After upgrading, restart your devices to enable the new functionality.

I have now tested Headphone Accommodations in Transparency Mode, and can confirm that it does work, within limits. The amplification is not perfect, and AirPods won’t work for more severe hearing losses, but for those with mild-to-moderate hearing loss, they should provide some benefit.

Dr. Cliff reviews the Apple AirPods Pro in detail after converting them into hearing aids using the Mimi Hearing Test app.

When the news broke

Buried near the bottom of Apple’s recent iOS 14 press release, under “Additional iOS Features”, Apple included a mention of the new Headphone Accommodations accessibility feature, which “amplifies soft sounds and tunes audio to help music, movies, phone calls, and podcasts sound crisper and clearer.”

AirPods Pro

In the iOS 14 feature breakdown, Apple states that “Headphone Accommodations also supports Transparency mode on AirPods Pro, making quiet voices more audible and tuning the sounds of your environment to your hearing needs.” This is the extremely exciting part, as it indicates that AirPods can now provide some of the same functionality that you might expect from a hearing aid; personalized amplification (and noise reduction) to make it easier to hear those around you.

Fixing the audio lag problem

While there are plenty of third-party apps that already offer customized AirPods amplification for everyday sounds, the big issue with hearing-aid-like amplification has been the latency (or lag) introduced by processing audio on the phone and transmitting it to the AirPods for playback. “The challenge of using smartphone processing is that auditory information must be presented within 80 milliseconds,” says Chad Ruffin, MD. “If processing and relaying this information cannot occur during this time, it will make communication harder. This is because the lipreading cues can become out of sync with the amplified audio.”


In my communications with Apple, I have confirmed that the embedded H1 chip, with its 10 audio cores, will be used to process and amplify sound locally, and Apple is promising “incredibly low audio processing latency.” This is huge news for millions of AirPods Pro owners!
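To see why on-earbud processing matters so much, consider a simple latency-budget check against Dr. Ruffin’s 80-millisecond lip-sync limit. The per-stage figures below are hypothetical round numbers for illustration, not Apple specifications:

```python
# Illustrative latency-budget check for hearing-aid-style amplification.
# All per-stage delays below are hypothetical examples, not Apple specs.

LIP_SYNC_BUDGET_MS = 80  # amplified audio must reach the ear within ~80 ms

def total_latency_ms(stages):
    """Sum the per-stage delays (in milliseconds) of an audio pipeline."""
    return sum(stages.values())

# Phone-based path: mic -> Bluetooth uplink -> phone DSP -> Bluetooth downlink
phone_path = {"bt_uplink": 60, "phone_dsp": 20, "bt_downlink": 60}

# On-earbud path: mic -> local chip DSP -> speaker (no Bluetooth round trip)
local_path = {"earbud_dsp": 5}

for name, path in [("phone-based", phone_path), ("on-earbud", local_path)]:
    t = total_latency_ms(path)
    verdict = "OK" if t <= LIP_SYNC_BUDGET_MS else "too slow"
    print(f"{name}: {t} ms -> {verdict}")
```

With numbers anywhere in this ballpark, the phone round trip blows the budget while local processing sails under it, which is why the H1 chip doing the work on the earbuds themselves is the key enabler here.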

Setting up Headphone Accommodations

For those with early access to iOS 14, go to Settings ⇾ Accessibility ⇾ Audio/Visual to turn on Headphone Accommodations. You will see a “Custom Audio Setup” link which—based on the leaked screenshots below—appears to guide the user through a series of listening tests. The results of the listening tests presumably enable clearer speech for phone calls, media, and real-world conversation (AirPods Pro only).

Headphone Accommodations settings screenshots

Detailed walkthrough instructions

For a more detailed walkthrough of setting up Headphone Accommodations in Transparency Mode, check out this new video from Adam Carlan.

Closed captions are available on this video. If you are using a mobile phone, enable captions by tapping the gear icon.

Using an “audiogram” to customize amplification

An option to use “an audiogram from [Apple] Health to customize your audio” is also displayed in the leaked screenshots. This seems to indicate that the AirPods Pro will be capable of providing a very fine-tuned custom amplification experience, based on the audiogram (pitch-by-pitch hearing abilities) unique to the user. With third-party apps like Mimi, you can test your hearing and generate an audiogram, and with iOS 14, it looks like that audiogram can serve as the foundation for personalized amplification.
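To get a feel for how an audiogram might drive personalized amplification, here is a sketch using the classic “half-gain rule” from hearing aid fitting (apply roughly half the hearing loss, in dB, as gain at each frequency). Apple has not published its actual fitting formula, so this is purely an illustration of the general idea:

```python
# Sketch: mapping an audiogram to per-frequency gain with the classic
# "half-gain rule" (gain ≈ half the hearing loss in dB HL at each pitch).
# Apple's real fitting formula is not public; this is only illustrative.

def half_gain(audiogram):
    """Map {frequency_hz: threshold_db_hl} to {frequency_hz: gain_db}."""
    return {f: max(0.0, hl / 2) for f, hl in audiogram.items()}

# Hypothetical mild-to-moderate sloping hearing loss
audiogram = {250: 15, 500: 20, 1000: 30, 2000: 40, 4000: 50, 8000: 55}
print(half_gain(audiogram))
```

The point is that a pitch-by-pitch audiogram lets the earbuds boost only the frequencies where your hearing is weakest—typically the highs—rather than turning everything up uniformly.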

Easier customization on the way

In May 2021, Apple announced a new feature that will allow users to input their clinical audiograms directly into Apple Health. This means it will be possible, for the first time, to use the results of a diagnostic hearing test to customize the amplification settings of your AirPods Pro. It looks like you will be able to simply snap a picture of your audiogram and verify the thresholds that appear on your iPhone before accepting the automatically generated digital audiogram.

Skipping “Custom Audio Setup”

If you don’t feel like going through the listening tests, or loading in an audiogram, you’re in luck. Apple is also providing some general-purpose audio enhancement options that should help those with milder forms of hearing loss. In the “Headphone Audio” settings, you will have the option to “tune audio” based on the following options:

  • Balanced Tone
  • Vocal Range
  • Brightness

Apple will also introduce a slider to increase the “boost” for soft sounds, from “slight” to “strong”. The new audio settings will work with Transparency mode on AirPods Pro, and with phone calls and media on AirPods Pro, AirPods (2nd generation), EarPods, Powerbeats, Powerbeats Pro, and Beats Solo Pro.
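A soft-sound boost like this is essentially a form of level-dependent gain: quiet sounds get lifted while loud sounds pass through unchanged. Apple hasn’t published how its “slight” to “strong” slider maps to actual gain, so the threshold and numbers below are hypothetical:

```python
# Toy model of a "soft sound boost": levels below a threshold receive extra
# gain, scaled by a slider from "slight" (0.0) to "strong" (1.0).
# Thresholds and limits are invented for illustration, not Apple's values.

def boosted_level(input_db, threshold_db=50, slider=0.5, max_boost_db=20):
    """Return the output level (dB) after boosting soft sounds."""
    if input_db >= threshold_db:
        return input_db  # loud sounds pass through unchanged
    deficit = threshold_db - input_db
    boost = min(max_boost_db, deficit) * slider
    return input_db + boost

print(boosted_level(30, slider=0.2))  # a "slight" boost of a quiet sound
print(boosted_level(30, slider=1.0))  # a "strong" boost of the same sound
print(boosted_level(70, slider=1.0))  # loud sounds are left alone
```

This is the same basic idea behind the compression circuits in hearing aids: make soft speech audible without making everything uncomfortably loud.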

Apple’s accessibility history

Apple has a long history of supporting the Deaf and Hard of Hearing communities. Apple was the first to enable Teletype (TTY) and real-time text (RTT) calling directly on device. For the uninitiated, TTY and RTT provide a text-based chat alternative to voice for those who are not able to hear clearly on the phone. With TTY, the user must hit the send button before the chat is transmitted, and with RTT, chats are sent in real-time, as text is typed.

In 2014, Apple launched the Made for iPhone hearing aid program and designed a completely new Bluetooth Low Energy protocol for the hearing industry. This enabled—for the first time—seamless audio streaming connectivity between a hearing device and smartphone. Hearing aid wearers could easily stream phone calls, FaceTime calls, music, Siri, etc, with clear sound. Apple licensed this technology for free to hearing aid manufacturers. Today, Apple technology is built into more than 100 hearing aid and cochlear implant models.

In 2018, Apple introduced Live Listen on AirPods, enabling customers to use their iPhone as a remote directional microphone. Remote microphones are widely endorsed by the hearing healthcare industry and provide better hearing in noisy settings, lecture halls, or when sitting far away from the speaker. Live Listen also works with Made for iPhone hearing aids.

That brings us to today, where we are eagerly awaiting Apple’s latest accessibility marvels—Headphone Accommodations and personalized amplification for everyday sounds via AirPods Pro.

Other important accessibility updates from Apple

While Headphone Accommodations and personalized amplification on AirPods Pro may have stolen the show, there are plenty of additional hearing accessibility updates coming this fall. Here’s a quick summary of some of the other announcements:

Sound Recognition

A new setting that will notify a user on their device about sounds or alerts detected by their iPhone, iPad, or iPod Touch. These cover a range of categories, such as sirens, smoke alarms, doorbell chimes, and appliance beeps.


Group FaceTime

Group FaceTime now detects when a participant is using sign language and makes that person prominent in the video call.


RTT Improvements

Apple has made it simpler for RTT users to engage with calls and incoming RTT messages through notifications — even when they’re not in the phone app and don’t have RTT conversation view enabled.

Hearing health

Following the introduction of the Noise app in watchOS 6 that measures ambient sound levels and duration of exposure, watchOS 7 adds further support for hearing health with headphone audio notifications.


Through iOS 14 and watchOS 7, users can now see how loudly they are listening to media through their headphones on their iPhone, iPod touch, or Apple Watch, and when those levels may impact hearing over time. The hearing control panel displays a live readout showing whether the current playback level exceeds recommended limits, and when cumulative exposure reaches the safe weekly listening amount, Apple Watch notifies the wearer.