
Hearing Aids with Artificial Intelligence (AI): Review of Features, Capabilities and Models that Use AI and Machine Learning

Hearing aids are using AI to create smarter and more adaptive multipurpose devices that enhance speech, reduce noise, translate languages, track fitness and health metrics, and more. Learn about them here, plus listen to real-life sound samples of AI-driven hearing aids.

AI-powered hearing devices are set to revolutionize hearing in background noise.

Advancements in artificial intelligence (AI) are having a transformative impact on today's society. AI is now finding its way into virtually all industries, from social media monitoring and marketing chatbots to self-driving cars. And, with increasing sophistication, AI is also being applied to hearing devices.

AI, machine learning, and activity sensors aim to revolutionize the way hearing aid users experience sound, delivering an increasingly personalized listening experience.

In this article, we explore the different types of AI used in hearing aids and provide a review of the specific hearing aid models, their features, and how they enrich the lives of users—as well as what we might expect from them in the future. We also provide real-world sound samples of AI-enabled hearing aids so you can listen for yourself.

What is AI?

AI is the science of developing machines, particularly computer systems, to perform tasks that require human intelligence. “AI is an umbrella term to cover any technology that attempts to mimic human decision-making,” according to Kevin Seitz-Paquette, Director of the Phonak Audiology Research Center (PARC) at Sonova Group in Chicago.

AI algorithms—a process or set of rules used for solving a problem or performing a computation—enable machines to carry out tasks such as recognizing patterns and making decisions. Over time, AI-powered machines can learn from data and adapt their behavior and responses, without the need for additional reprogramming.

How is AI used in hearing aids?

“For years, hearing aids have used machine learning for sound classification, which we still employ today, but in different and additional ways,” explains Achin Bhowmik, Chief Technology Officer and EVP of Engineering at Starkey who was formerly VP and GM of Intel's Perceptual Computing Group that works on machine learning, AI, and advanced sensor initiatives.


Starkey's Achin Bhowmik, PhD, presents the keynote at the 2023 tinyML [Machine Learning] Summit.

AI-powered hearing aids are set to revolutionize hearing in both noise and quiet. Bhowmik explains that digital hearing aids typically have specific modes for different listening situations, such as “Restaurant mode” or “TV mode.” Since these settings are general, they may not work optimally in every listening environment. Instead of relying on a preset program, AI hearing aids listen to the millions of sounds you hear daily and make millions of fine-tuning adjustments in real time.
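
To make the idea concrete, here is a minimal sketch of the kind of acoustic scene classification machine learning can perform; the features, model choice, and labels below are illustrative assumptions, not any manufacturer's actual design.

```python
# Minimal sketch of acoustic scene classification, the kind of machine
# learning hearing aids use to pick a listening program automatically.
# Features, model choice, and labels are illustrative, not a vendor's design.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def frame_features(frame, sample_rate=16000):
    """Compute simple descriptors of a short audio frame."""
    rms = np.sqrt(np.mean(frame ** 2))                  # overall loudness
    zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2  # zero-crossing rate
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1 / sample_rate)
    centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-9)
    return np.array([rms, zcr, centroid])

# Stand-in training data: in reality these would be annotated recordings.
rng = np.random.default_rng(0)
frames = rng.standard_normal((300, 512)) * rng.uniform(0.01, 0.5, (300, 1))
labels = rng.integers(0, 3, 300)  # 0 = quiet, 1 = speech, 2 = speech in noise

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(np.array([frame_features(f) for f in frames]), labels)

new_frame = rng.standard_normal(512) * 0.2
scene = clf.predict([frame_features(new_frame)])[0]
print(["quiet", "speech", "speech in noise"][scene])  # chosen program
```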


Hearing aids that use artificial intelligence and machine learning include the latest models from global brands like (clockwise from bottom left) Oticon, Widex, Starkey, Signia, and Phonak.

AI enhances hearing aids by learning to adjust to and improve your hearing in various listening environments, helping to increase speech recognition and understanding. They can also use information from sensors, like those that detect motion, to make split-second decisions about how to use the devices' directional microphones and amplification parameters. With AI-powered hearing aids, you spend less time trying to sort through the noise around you, meaning you can focus more on the conversation and enjoy the communication experience.

“AI sounds futuristic and a little intimidating, but it’s not a new concept,” says Seitz-Paquette. “The groundwork for AI began in the early 1900s, and it has been embedded in our day-to-day lives for decades. However, recent advancements have catapulted AI from being an unattainable fantasy to a tangible reality for current and future generations, and today, hearing aid users are benefiting from this technology as well.”


Kevin Seitz-Paquette, Director of the Phonak Audiology Research Center (PARC) at the Sonova Group in Chicago.

To understand the advantages of AI in hearing aids, it is first helpful to become familiar with the different technologies employed in AI hearing aids—namely, machine learning and deep neural networks (DNN). Let’s take a closer look.

Machine learning

Machine learning (ML) has been employed in hearing aids for about two decades. Machine learning is a type of AI that uses algorithms to sort through large amounts of data and make decisions or predictions. It focuses on continued learning and ongoing problem-solving capabilities. When applied to hearing aids, machine learning can capture information and learn from the hearing aid user's interactions with the devices (e.g., adjustments) and preferences about how they would like to hear in a particular environment.

A traditional hearing aid might use a specific preset program, such as a “Restaurant setting,” which may not effectively distinguish between important sounds and background noise. An AI hearing aid enables you to customize and personalize a listening environment to your own preferences; the machine learning algorithm will learn to recognize and prioritize sounds that are important to you, such as the voices of loved ones, and work to filter out the background noise.

Deep neural network (DNN) technology

Deep Neural Network (DNN) technology in hearing aids is intended to imitate the function of the human brain, responding to sounds just as your brain would, without the need for any specific programming. When applied to hearing aids, DNN allows the devices to begin to mimic how the user's brain would hear and interpret sound if they did not have hearing loss. In short, DNN in hearing aids is designed to provide higher sound quality, an improved signal-to-noise ratio (SNR), better listening comfort and speech understanding, improved recall, and more.


Deep Neural Network (DNN) technology in hearing aids is designed to emulate the brain by using layers of vast amounts of data—over 12 million real-world sound samples in the case of Oticon's onboard DNN BrainHearing system—to do things like filter out noise and select the best acoustic parameters for listening in specific environments.
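
For a rough sense of how mask-based DNN noise reduction works, here is a toy sketch of a small network that predicts a per-band gain mask and applies it to a noisy spectrum; the architecture and sizes are assumptions for illustration, not Oticon's (or any vendor's) actual model.

```python
# Toy sketch of DNN noise reduction: a small network predicts a 0-to-1 gain
# mask per frequency band and applies it to the noisy spectrum. Sizes and
# architecture are illustrative assumptions, not any vendor's actual model.
import torch
import torch.nn as nn

N_BANDS = 64  # hypothetical number of frequency bands

class MaskEstimator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_BANDS, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, N_BANDS), nn.Sigmoid(),  # per-band mask in [0, 1]
        )

    def forward(self, noisy_magnitudes):
        mask = self.net(noisy_magnitudes)
        return mask * noisy_magnitudes  # attenuate noise-dominated bands

model = MaskEstimator()         # untrained here; real systems are trained
noisy = torch.rand(1, N_BANDS)  # on millions of labeled sound samples
denoised = model(noisy)
print(denoised.shape)  # torch.Size([1, 64])
```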

Some other current uses of AI in hearing aids

In addition to improved hearing in noise, AI technology in hearing aids enriches the user experience in several other ways. Examples of how hearing aids use AI include:

  • Recognizing voices most important to you: When your hearing aids pick up the voices of your loved ones and colleagues, they prioritize them, ensuring you hear them clearly over any background noise.
  • Understanding your listening needs: If you watch a TV show regularly, your hearing aids will determine that this is a priority for you. This means that when you sit down to watch the show, your hearing aids automatically tune into the sound, allowing you to focus on the TV over any ambient noise.
  • Tracking your fitness goals and health: Some AI hearing aids go a step further to provide holistic wellness and fitness functionality. These health-tracking hearing aids, known as Healthables™, have integrated sensors that enable tracking of health metrics, such as the number of steps you take each day, social interactions, and even heart rate. The data collected can be analyzed by AI algorithms to identify patterns and trends, helping you to stay on top of your physical and psychological health (see the step-counting sketch below).
  • Hearing speech through a facemask: The AI in your hearing aid recognizes that you want to speak to someone, even though their speech is muffled. In real-time, it prioritizes and amplifies the voice of the person wearing the facemask, helping you to hear them more clearly.

Hearing aids can use AI to accentuate voice cues when people are speaking through a mask.
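
As a small illustration of the sensor side of health tracking, the sketch below counts steps from an accelerometer signal using simple peak detection; the thresholds and simulated signal are assumptions for illustration, not Starkey's actual method.

```python
# Minimal sketch of step counting from accelerometer data, the kind of
# sensor signal a hearing aid's AI can analyze. Thresholds are illustrative.
import numpy as np

def count_steps(accel_magnitude, sample_rate=50, threshold=1.2):
    """Count peaks (in g) above a threshold, with a refractory gap so a
    single stride isn't counted twice."""
    min_gap = int(0.3 * sample_rate)  # at most roughly 3 steps per second
    steps, last_peak = 0, -min_gap
    for i in range(1, len(accel_magnitude) - 1):
        x = accel_magnitude[i]
        is_peak = x > accel_magnitude[i - 1] and x >= accel_magnitude[i + 1]
        if is_peak and x > threshold and i - last_peak >= min_gap:
            steps += 1
            last_peak = i
    return steps

# Simulate 10 seconds of walking at ~2 steps per second over a 1 g baseline
t = np.linspace(0, 10, 500)
accel = 1.0 + 0.4 * np.maximum(0.0, np.sin(2 * np.pi * 2.0 * t))
print(count_steps(accel))  # roughly 20 steps
```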

Hearing aids that use AI

Hearing aid manufacturers are utilizing AI in many different ways. Here we take a look at how some of the leading global hearing aid manufacturers are enriching the user and listening experience with smarter AI and machine-learning hearing aids.

Starkey Livio, Evolv & Genesis AI hearing aids

All Starkey hearing aids use AI to a certain extent as their automatic sound manager system utilizes machine learning, regardless of the model. “We’re able to train our devices on a variety of scenes they might encounter in their daily life: wind, speech in noise, machine noise, music, even a quiet environment,” explains Bhowmik. “These have become more adaptive over time with our new line of hearing aids, Genesis AI, delivering over 80 million sound analyses and automatic optimizations every hour. That’s nearly two billion adjustments in a day. This amazing level of computational performance is now a reality due to our all-new Neuro Processor, which includes 6 times more transistors and functions up to 4 times faster than our previous processor.”

Starkey Genesis AI: 3.5 stars (11 reviews). Sound samples: Busy Café and Quiet Office, unaided and with device.

You can listen to the unaided and aided sound samples in two different listening situations (busy cafe and quiet office) by clicking on the red play-buttons for the hearing aids included on this page. To really hear the difference, it's best to use quality headphones or earbuds. Sound samples courtesy of the independent audio testing lab, HearAdvisor.

Starkey’s new Neuro Processor includes a built-in DNN hardware accelerator engine that helps analyze sound, enhance speech, and reduce noise just as the healthy human cerebral cortex does. “What AI, and specifically DNN, does is a data-based approach for the device to become smarter and deal with challenges that are thrown at it from an acoustic viewpoint,” continues Bhowmik. “To instantaneously improve speech understanding and decrease listening effort in a challenging listening environment— such as a noisy restaurant—patients can activate “Edge Mode+” with a simple double tap of the device or through the App. This leverages the hearing aids’ built-in DNN accelerator to scan the environment and optimize sound for speech clarity. Our in-app feature Voice AI also harnesses the power of the iPhone’s real-time DNN computing power to act as a personal remote microphone for people with moderately severe or beyond hearing loss.”

Starkey was the first hearing aid manufacturer to integrate 3D inertial measurement unit (IMU) sensors in hearing aids, beginning in 2018 with Livio Edge AI and continuing through Evolv AI to their latest generation of products, Genesis AI. The Starkey Thrive Hearing App is compatible with Evolv AI and Livio hearing aids, whereas Genesis AI is controlled using the MyStarkey app. The accelerometer and gyroscope in their wireless hearing technology can track steps and physical activity, and Starkey's are currently the only hearing aids able to detect and report falls.

HearingTracker Audiologist Matthew Allsop gives you an overview of Starkey Genesis AI hearing aids shortly after the devices were launched in February 2023. Closed captions are available on this video. If you are using a mobile phone, please enable captions by clicking on the gear icon.

Furthermore, their AI hearing technology can respond to tap controls, perform a Self-Check assessment, and, using connectivity to the cloud, transcribe conversations, translate up to 70 languages, set reminders, and act as a smart assistant, all of which require artificial intelligence to function.

Starkey is trying to change the way people see and use hearing aids by transforming them from single-function listening devices into multipurpose health and communication tools. Hearing plays a significant role in our emotional well-being and physical health, and untreated hearing loss has been linked to a greater risk of dementia, a higher risk of depression, and an increased risk of falls.

“Treating hearing loss is the number-one modifiable risk factor for the prevention of dementia, and is associated with delayed diagnosis of Alzheimer’s disease and all-cause dementia, depression, anxiety, and injurious falls,” says Bhowmik. “With the recent results of the groundbreaking Aging and Cognitive Health Evaluation in Elders (ACHIEVE) study drawing a link between the use of hearing aids and slowing cognitive decline, it is more important now than ever to think about a patient's overall health and wellness.”

Phonak Paradise & Lumity hearing aids

Phonak has been using machine learning for over 20 years to classify acoustic environments, which is the backbone of what is known today as the automatic operating system, AutoSense OS, used in all models of Phonak hearing aids.


The AutoSense OS AI-based system in Phonak Lumity hearing aids is able to recognize and react to the location of sounds, meaning users don't have to worry about manually changing programs to hear better in noise.

AutoSense OS has been trained with AI-based machine learning to improve sound recognition by constantly scanning and analyzing the sound environment of everyday listening situations. It then blends feature elements from 200 different settings in real time to provide an optimal listening experience. AutoSense OS automatically adjusts and adapts to your surroundings as they change, and its enhanced identification of different listening situations and automatic blending provide a more natural hearing experience, meaning the user doesn't have to worry about manually changing programs. “Phonak Lumity further expanded AutoSense OS to have the ability to recognize and react to the direction of arrival of the primary speech source,” explains Seitz-Paquette.
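
As a simplified illustration of classifier-driven blending, the sketch below weights a handful of hypothetical presets by scene probabilities; the parameter names, presets, and numbers are invented, and the approach is far simpler than AutoSense OS itself.

```python
# Simplified sketch of blending hearing aid settings by scene probability,
# in the spirit of a classifier-driven automatic system. The presets,
# parameter names, and probabilities below are invented for illustration.
import numpy as np

# Hypothetical per-scene presets: [noise_reduction_dB, directionality_0_to_1]
presets = {
    "quiet":           np.array([2.0, 0.1]),
    "speech_in_noise": np.array([10.0, 0.9]),
    "music":           np.array([0.0, 0.0]),
}

def blend(scene_probs):
    """Weight each preset by the classifier's probability for that scene."""
    return sum(p * presets[scene] for scene, p in scene_probs.items())

# Suppose the classifier reports a moderately noisy café:
active = blend({"quiet": 0.1, "speech_in_noise": 0.8, "music": 0.1})
print(f"noise reduction: {active[0]:.1f} dB, directionality: {active[1]:.2f}")
```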

Phonak Audéo Lumity: 3.5 stars (7 reviews). Sound samples: Busy Café and Quiet Office, unaided and with device.

Phonak has found that AutoSense OS improves speech understanding by 20% compared with manual programs when listening to speech in challenging environments, such as noise, loud noise, and the car.

“Hearing aid users want a seamless and intuitive listening experience,” says Seitz-Paquette. “Although modern hearing aids can provide great benefits to the user across a wide variety of listening situations, the exact needs within each of those environments may be different. AI in hearing aids is one way to allow the user to benefit from the best signal processing for any environment, without requiring the additional effort and hassle of managing multiple dedicated programs.”

This video, presented by HearingTracker Audiologist Matthew Allsop, offers a comprehensive overview of Phonak Lumity hearing aid technology (AI-related features covered from 6:30 to 18:00). Closed captions are available on this video. If you are using a mobile phone, please enable captions by clicking on the gear icon.

“Over time, AutoSense OS has become more and more sophisticated. For example, Phonak Paradise saw the introduction of Motion Sensor Hearing, where AutoSense OS gained the ability to take the patient’s physical activity into consideration when adapting the directional microphone response,” says Seitz-Paquette.

All newer Phonak hearing aids that connect to the myPhonak app have some level of health data tracking, including Audéo L, L-Life, and Slim Lumity on the latest Phonak Lumity platform. In the previous Phonak Paradise product family, most hearing aids have health data tracking via the myPhonak app, including Audéo P, Audéo P-Fit and P-Life, Naida P, and Virto P-312. The app enables wearers to track data including step count, activity levels, distance walked or run, wear time, and, in the case of the Phonak Audéo P-Fit, even heart rate.

Oticon Intent, Real, Own & More hearing aids

According to Virginia Ramachandran, AuD, PhD, Head of Audiology at Oticon USA, the most sophisticated form of AI used in Oticon hearing aids is deep learning, which occurs via the DNN embedded in the company's Sirius™ chip platform that powers Oticon Intent, the Polaris R™ platform in Oticon Real™, and the Polaris™ platform in Oticon Own™, Oticon Play PX™, and Oticon More™ hearing aids.


Virginia Ramachandran, AuD, PhD.

Since a DNN is modeled on the way that the brain naturally learns, this is consistent with Oticon’s BrainHearing philosophy of supporting the brain in how it makes sense of sound.

Oticon More™ was the first hearing aid with an onboard DNN, and each successive generation of product families has built upon this, giving wearers access to all relevant sounds, in balance, with exceptional detail and clarity.

Oticon Intent: 5 stars (1 review).

To help enable wearers to follow and engage in conversations in complicated listening environments such as groups, crowds, airports, and restaurants, the DNN in the Oticon Polaris and Polaris R platforms was trained to filter out noise from the sound scene. The training was accomplished using 12 million real-world sound samples. Moreover, Oticon Intent expanded on this massive number of sound samples to include even more diverse listening situations. The DNN comprehensively scans the sound scene and then organizes and delivers the sound, making sure the output is clean and balanced to the person's type of hearing loss.

This system highlights the sounds that are likely of interest to you, creating a contrast with the sounds that are of less interest to you, such as background noise. The idea behind it is that instead of using a sound processor built upon theory, your hearing is enhanced in a more natural way.

“The result is that relevant sounds are accessible, clear, comfortable, and audible,” explains Ramachandran. “Our research using state-of-the-art neuroscience, cognition outcomes, as well as more classic audiology research methods have demonstrated that this benefits the user by providing improved sound quality, greater access to speech, a clearer representation of the full sound scene in the brain, better speech understanding in noise, and greater long-term memory recall of speech.”

The DNN is engaged in a feature called MoreSound Intelligence™. This feature's settings can be adjusted by the hearing provider in their Genie2™ software so you can experience an optimized fitting and a full sound scene with clear contrast and balance.

Oticon Real also includes new technologies that balance sudden disruptive sounds and clean up wind and handling noise, further enhancing the listening experience. Oticon Intent goes considerably further: with its 4D Sensor technology, it is reportedly the world's first hearing aid that can adapt its sound settings based on conversational activity, head and body movement, and environmental noise—in other words, a hearing aid that more accurately surmises your listening intent.

HearingTracker Audiologist Matthew Allsop provides an overview of the Oticon Intent hearing aid. Closed captions are available on this video. If you are using a mobile phone, please enable captions by clicking on the gear icon.

Widex EVOKE & MOMENT, and Signia AX & IX hearing aids from WSA

Hearing aid manufacturer WS Audiology (WSA), which develops, manufactures, sells, and distributes hearing aids under the brands Widex and Signia, among others, has been using AI in its products for the past 10 years.

“At WS Audiology, we use advanced AI applications to enable adjustment of hearing-aid settings based on the individual wearer's preferences and the specific sound environments they encounter, to increase the personalization of the hearing aids and thereby the wearer satisfaction. Importantly, our AI applications continuously learn from the real-life data they gather, allowing their performance to be improved over time,” according to Jens Brehm Nielsen, Head of AI Accelerator at WS Audiology.


Jens Brehm Nielsen, PhD.

For example, Widex MOMENT hearing aids utilize AI and machine learning to create hearing programs based on a wearer's typical environments, such as their workplace, home, or favorite restaurant. Embedded in Widex's MOMENT and EVOKE apps, MySound 2.0 (previously SoundSense Learn) uses machine learning to guide the user to a more personalized listening experience in real-life situations. This technology helps users control how they hear by involving them in the process of determining the hearing aid settings best tailored to their listening needs.

Widex MOMENT: 3 stars (13 reviews). Sound samples: Busy Café and Quiet Office, unaided and with device.

But how does it work? The user is presented with two alternative settings and chooses which sounds best in a specific environment. Over time, the machine learning algorithm learns to automatically adjust to the user's preferences in similar environments. This means the hearing aid user has more direct control over what they hear.
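
The toy sketch below illustrates the general idea of learning a preferred setting from repeated A/B choices; it is a generic comparison-driven search, not Widex's actual algorithm, and the two-parameter "setting" and simulated user are invented stand-ins.

```python
# Toy sketch of learning a preferred setting from repeated A/B comparisons,
# in the spirit of preference-based tuning (not Widex's actual algorithm).
import numpy as np

rng = np.random.default_rng(1)
true_preference = np.array([0.7, -0.3])  # the user's hidden ideal setting

def user_prefers_a(a, b):
    """Simulated user: picks whichever setting is closer to their ideal."""
    return np.linalg.norm(a - true_preference) < np.linalg.norm(b - true_preference)

estimate = np.zeros(2)  # current best guess of the preferred setting
step = 0.5
for _ in range(30):
    # Propose two candidate settings around the current estimate...
    direction = rng.standard_normal(2)
    a, b = estimate + step * direction, estimate - step * direction
    # ...ask which sounds better, then move toward the winner.
    estimate = a if user_prefers_a(a, b) else b
    step *= 0.9  # take smaller steps as the estimate settles

print(np.round(estimate, 2))  # ends up near the hidden preference
```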

“Hearing aids are incredibly individual and wearer preferences can change over time,” explains Nielsen. “Having an AI companion that learns from you and can adapt with you is a very helpful tool. AI’s main benefit to the wearer is providing personalized optimizations in real-time. This allows for less travel to the clinic, less time without your hearing aids as you have help with simple troubleshooting, increased retention of hearing aids due to a more optimized sound, and more time to talk to the hearing provider about your challenges and expectations.”

The most recent Signia hearing aids with AI-driven features are the Signia IX (Integrated Xperience) family, launched in October 2023. Signia claims this is the first hearing aid platform capable of pinpointing multiple speakers in real time, enabling unprecedented sound clarity in group conversations.

Signia hearing aids are also known for their health-tracking features, originally launched in 2021. Examples include the Signia Insio AX in-the-ear (ITE) model, Signia Pure Charge&Go IX, and Styletto IX.

Signia Styletto IX: 0 reviews.

Several features can be accessed by the wearer through My WellBeing in the Signia app. My WellBeing helps you stay on top of your physical and hearing performance, using sensors in the Signia AX and IX hearing aids for a range of health measurements, including step counting, physical activity, wear time, and how much the hearing aid user socially interacts with others.

In addition to My WellBeing, the Signia app has another helpful feature called Signia Assistant. “Signia Assistant is a great example of how machine learning can be applied to hearing aids to improve the lives of those who wear them,” explains Signia's Senior Director of Audiology Brian Taylor, AuD. “Signia Assistant uses anonymized data from thousands of other wearers to help individuals fine-tune their devices using Signia Assistant. As more data is fed into the system, it becomes better at making precise fine-tuning adjustments for the individual. We have data suggesting that Signia Assistant contributes to greater wearer acceptance, mainly because wearers are empowered to make their own real-time adjustments through the app.”


Brian Taylor, AuD.

Do I need a hearing aid with AI?

It's clear that hearing aid technology is continually developing and enhancing users' listening experiences. However, not everyone may feel they need additional AI-powered features. For example, people who live alone or rarely socialize might not benefit from features that improve listening in background noise, whereas someone who enjoys socializing, or whose work or hobbies involve communication in background noise, may find that AI-powered features allow for a much improved hearing experience. Likewise, some people may feel they would benefit from health-tracking and fall-detection features, whereas others may not see these as essential.

Which hearing aids are most suitable for your needs is individual to you, and this is best discussed with your audiologist or hearing care provider. Together you can determine exactly what you want from your hearing aids and choose the ones that are best for you.

Future predictions about AI in hearing aids

So, what does the future have in store for AI in hearing aids? Anyone who has used ChatGPT or similar AI-enabled websites and apps knows that AI can be implemented in powerful ways that almost seem like magic. For example, the Google Pixel 8 Pro can now manipulate images and audio using AI tools previously reserved for expensive studio software (e.g., see 13:55 of this video review). It seems almost certain that AI-assisted voice recognition and noise separation will continue to improve audio quality and speech understanding in noise.

It's also likely hearing aids will gain more AI-driven health- and fitness-tracking sensors, providing a wider range of on-demand, real-time data about how you're feeling and revealing more about bodily functions. Smart glasses with amplification, and captioning glasses you can wear all day, may also eventually use AI to combine visual cues, sensor data, and sound to further empower people with hearing impairment. HearingTracker recently published a product review on this exciting emerging technology.

In a MedWatch article that discussed the competition between future products in the hearing aid market, former GN Hearing CEO Gitte Aabo commented, “I believe that the biggest impact in the next 10-15 years will come from machine learning, which to an even greater extent than today can detect the sound environment you are in and adjust accordingly.” However, not all hearing industry executives agree that AI will necessarily revolutionize hearing healthcare.

Seitz-Paquette adds, “AI has much more to offer within the field of audiology than what we’ve seen today. AI will provide more tools to be used in hearing aid signal processing that will further improve the listening experience of the end user. AI will also unlock new approaches to hearing aid fitting, making it easier for the hearing care professional to achieve a satisfactory setting for the patient. One thing is for certain: AI will never be able to replace the unique human elements a professional provides such as emotional intelligence and the crucial role they play in caring for and counseling their patients.”

Carly Sygrove, Hearing Health Writer

Carly Sygrove is a hearing loss coach and a hearing health writer who has single-sided deafness. She writes about living with hearing loss at My Hearing Loss Story and manages an online support group for people with hearing loss. She is also the founder of the Sudden Hearing Loss Support website, a source of information and support for people affected by sudden hearing loss.