BYO Headphones: Tuning into the Next-Gen Bluetooth Broadcast at the Sydney Opera House
In the near future, everyone will have the ability to tune into Bluetooth Auracast broadcasts—with personalized audio—in theaters, cinemas, airports, and more.
Why is everyone using captions?
At home, my wife and I watch TV with the captions on almost always. We're both in our 40s and neither of us has hearing loss. I partly blame our TV being about 15 feet from the couch, but I think there's something else going on. Today's ubiquitous flat-screen TVs are notorious for producing unclear sound through tiny speakers—often pointing in the wrong direction—and modern sound design seems to prioritize immersive sound environments (effects, music, explosions) over dialogue clarity. To really hear every word, we'd have to turn the TV up loud enough to wake the kids.
But we're not alone. A 2021 StageTEXT survey found that 80% of 18-to-24-year-olds "use subtitles some or all of the time watching TV on any device," and only 10% of those surveyed were deaf or hard of hearing. Interestingly, this trend seems to be most pronounced in Gen Z. A more recent survey by Preply asked 1,200 Americans whether they use captions "most of the time."
The Preply survey was interesting because it also asked participants why they used subtitles. Nearly 75% reported issues with unclear audio, while 61% said subtitles helped them better understand accents. Additionally, 29% preferred subtitles to avoid disturbing others at home, and 27% used them to stay focused amid distractions.
When captions aren't available...
If you're like me, you've come to rely on captions to catch every word of TV dialogue, which makes it all the more frustrating when captions aren't available. We've all been there: at the movies, where the speech is muddled by background sounds, or at a public speaking event, sitting just a little too far back to make out every word. It's so frustrating to miss what's being said, especially when we've all become so spoiled by the high-quality captions offered by Netflix and the other streamers.
Getting some help from Bluetooth's new broadcasting technology—Auracast™
While it may not give you 100% speech recognition, there's a brand-new technology from the Bluetooth® Special Interest Group that makes captions a lot less necessary for people like me, who just need a little help. Bluetooth's new Auracast technology allows you to tune into local audio broadcasts as easily as you would tune into a radio station on your car stereo. You see a channel name, like "Cinema 7," you tap on it, and you suddenly have access to the audio from the movie, directly through your own wireless earbuds.
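Under the hood, an Auracast source announces itself over Bluetooth LE extended advertising using the Broadcast Audio Announcement Service (assigned UUID 0x1852), whose service data carries a 3-byte Broadcast_ID that receivers use to identify the stream. As a rough illustration only, here is a minimal Python sketch of extracting that ID; the example bytes are made up, and in practice they would come from a BLE scanning library's advertisement callback.

```python
# Sketch: how a receiver might spot an Auracast broadcast during a BLE scan.
# Auracast sources advertise the Broadcast Audio Announcement Service
# (assigned UUID 0x1852) with a 3-byte Broadcast_ID in the service data.
# The bytes below are illustrative, not captured from a real device.

BROADCAST_AUDIO_ANNOUNCEMENT_UUID = 0x1852

def parse_broadcast_id(service_data: bytes) -> int:
    """Extract the 24-bit Broadcast_ID from Broadcast Audio Announcement
    service data (assuming the usual Bluetooth little-endian byte order)."""
    if len(service_data) < 3:
        raise ValueError("service data too short for a Broadcast_ID")
    return int.from_bytes(service_data[:3], "little")

# Hypothetical advertisement carrying Broadcast_ID 0xC0FFEE:
example_service_data = bytes([0xEE, 0xFF, 0xC0])
broadcast_id = parse_broadcast_id(example_service_data)
print(f"Found Auracast broadcast, ID 0x{broadcast_id:06X}")  # → 0xC0FFEE
```

A phone or hearing aid that sees this announcement can then present the broadcast's name to the user and, on selection, synchronize to the audio stream itself.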
I had a chance to tune into my first Auracast channel on Wednesday at the Sydney Opera House, where I was invited to bring my own Auracast-enabled headphones to get a sneak peek at the future of Bluetooth audio.
This was a huge milestone for me. We've been covering the development of Auracast and its promise for people with hearing loss since at least 2022, and it was beginning to feel like the day would never come when I'd be able to tune into a broadcast with my own earbuds in the real world.
At home, when I'm the only one up, I often use my earbuds to hear the TV clearly without waking up the family (via Bluetooth streaming from my Apple TV box). This was exactly the same experience: crisp, clean sound delivered straight to my eardrums—even as everyone around me socialized while waiting for the main event. I could make out every word, even without reading the captions on the screen.
Beyond words: Music sounds better too
Inside the Sydney Opera House's Drama Theatre, we were invited to tune into the main Auracast broadcast. This time, I was asked to tune in using an Auri receiver and a pair of SteelSeries headphones. As part of an accessibility initiative to bring better hearing to those with hearing loss, the Sydney Opera House has installed Auracast transmitters and also provides receivers and headphones for visitors without compatible earbuds.
The sound from the stage was made available via another Auri transmitter, presumably somewhere in the theater (the transmitters offer a range of 100 meters, or 328 feet). The receiver was already tuned into the "Drama Theatre" channel, and I only had to press one button to start the stream. I fiddled with the volume a little to get it just right.
After a couple of good speeches, which were made crystal clear by the broadcast stream, I experienced live music for the first time via Auracast—courtesy of Celeste Strings (pictured below). The sound was full and sharp, and when I occasionally removed the headphones to let my ears breathe, I noticed how much less nuance I could hear in the music.
Over ten years in the making
While I'd been waiting impatiently since 2022 to experience Auracast in person, I learned that—behind the scenes—some people have been waiting a lot longer. In his address to the audience at the Sydney Opera House, Peter Karlstromer, the CEO of GN (the company hosting the event), said, "We actually have some people in the room that have been working on this for 10-plus years."
As a company, GN has been around since the very beginning too. Recognizing the limitations of proprietary audio streaming protocols early on, GN advocated for an industry-standard solution to enhance accessibility and interoperability. Its collaboration with the Bluetooth Special Interest Group (SIG) was instrumental in developing Bluetooth LE Audio—the technology underlying Auracast broadcast audio—primarily to help address the needs of people with hearing loss. (GN is first and foremost a hearing aid company; its flagship brand is ReSound.)
What does the future look like for those with hearing loss?
If you live with hearing loss in 2025, you're likely still using your hearing aids' onboard telecoils to tune into public audio broadcasts through induction hearing loops. Then again, you might be one of the many people without access to local hearing loops (they are expensive to install and maintain), or maybe your hearing aids don't have an onboard telecoil (very few people are informed of the benefits of telecoils when purchasing hearing aids).
In the next few years, or maybe sooner, I see all of this changing. Auracast transmitters are far cheaper and easier to install (the Sydney Opera House said installation took literal "minutes" in their case). Not only that, the sound quality Auracast delivers is far superior to that of hearing loops, offering both deeper bass and higher treble. And unlike loops, there are no "dead spots" to deal with, and no interference from other electromagnetic sources.
I also see the earbuds and hearing aids of the future being almost universally Auracast compatible. Most of today's hearing aids are already Auracast-ready (pending a firmware upgrade), and some, like ReSound's Vivia hearing aids, can already tune into live Auracast streams. Many in the Drama Theatre were in fact using their ReSound hearing aids with Auracast, and these hearing aid users will see their world expand as more and more venues install transmitters like the Auri.
To learn more about the promise of Auracast for hearing loss, check out the video below, and be sure to read Auracast in Hearing Aids and Hearables: Bluetooth LE and the New Revolution in Connectivity.
Auracast is a HUGE deal for headphones and hearing aids because it allows them to directly receive high-quality audio broadcasts from compatible sources, such as TVs or public announcement systems, without background noise interference. This technology enhances the listening experience, offering clearer sound and improved accessibility in various public environments. Closed captions are available on this video. If you are using a mobile phone, please enable captions by clicking on the gear icon.
How are big tech companies like Google and Apple helping to usher in a more hearing-accessible future?
After the main event, I had a chance to catch up with Thomas Girardier, Google's Auracast tech lead. He gave me a preview of some of the Auracast features coming to Android 16 beta.
In the example I showed earlier in the article, where I selected the "Auri TV Stream" for my Creative earbuds, I had to go through the Creative app to select the Auracast stream. Android 16 will introduce its own Auracast broadcast selector, meaning Auracast will be baked more deeply into the operating system. This will introduce more people to the technology and provide a more consistent experience across different brands and models of earbuds and hearing aids.
He also demoed a new feature called "Scan with Pixel", which lets anyone with a Pixel 9 phone scan a QR code to quickly connect to an Auracast stream. In the demo, Thomas' phone connected to the audio stream for passenger announcements at a fictitious train station. I think this will be a great way to introduce more people to Auracast, and I could see this being incredibly useful when the "available broadcasts" lists start looking like the long lists of available WiFi networks we've all become accustomed to.
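Those QR codes presumably encode something along the lines of the Bluetooth SIG's Broadcast Audio URI format: a `BLUETOOTH:` scheme carrying semicolon-separated key:value fields, including a base64-encoded broadcast name (BN) and a hexadecimal Broadcast_ID (BI). The Python sketch below parses a simplified, made-up URI of that shape; the real specification defines many more fields and stricter rules than this handles.

```python
import base64

# Sketch: decoding the kind of QR payload a feature like "Scan with Pixel"
# might read. This is a simplified take on the Bluetooth SIG's Broadcast
# Audio URI format; the example URI is invented for illustration.

def parse_broadcast_uri(uri: str) -> dict:
    """Parse a simplified Broadcast Audio URI into a field dictionary."""
    scheme, _, rest = uri.partition(":")
    if scheme.upper() != "BLUETOOTH":
        raise ValueError("not a Bluetooth broadcast URI")
    fields = {}
    for pair in filter(None, rest.split(";")):
        key, _, value = pair.partition(":")
        fields[key] = value
    if "BN" in fields:  # broadcast name is base64-encoded text
        fields["BN"] = base64.b64decode(fields["BN"]).decode("utf-8")
    if "BI" in fields:  # Broadcast_ID is written in hexadecimal
        fields["BI"] = int(fields["BI"], 16)
    return fields

# Hypothetical QR payload for a station-announcement broadcast:
uri = "BLUETOOTH:UUID:184F;BN:UGxhdGZvcm0gMQ==;BI:C0FFEE;;"
info = parse_broadcast_uri(uri)
print(info["BN"], hex(info["BI"]))  # → Platform 1 0xc0ffee
```

Once the phone has the broadcast name and ID, it can skip the scanning step entirely and synchronize straight to the advertised stream, which is what makes the QR-code flow so quick.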
While we haven't seen any such announcement yet from Apple, Deaf freelance journalist and disability advocate Liam O'Dell reports that Apple's director of accessibility, Sarah Herrlinger, is "super excited" to see how Auracast "can be implemented" in Apple's technology.
When Auracast isn't enough...
For those with severe to profound hearing loss, or auditory processing issues, loud and clear sound may not be enough. In these cases, additional accommodations, like captions or sign language interpretation, can help. In the U.S., the Americans with Disabilities Act (ADA) addresses this need by requiring movie theaters to provide closed captioning devices upon request. While open captions—displayed directly on the screen—aren’t explicitly mandated, many theaters voluntarily offer open-captioned screenings, further enhancing accessibility.
Live performance theaters, including Broadway, community theaters, and concert halls, also fall under ADA guidelines. These venues must provide "effective communication" to patrons who are deaf or hard of hearing. Depending on the event, the venue's resources, and individual needs, accommodations might include handheld captioning devices, real-time open captioning screens, American Sign Language interpreters, or assistive listening devices. Although theaters have some flexibility in determining which auxiliary aids to offer, the ADA generally requires them to meet the needs of their audience unless doing so would impose an undue financial or operational burden.
How Artificial Intelligence is revolutionizing accessibility
Beyond traditional captioning methods, a burgeoning category of AI-driven captioning glasses is poised to revolutionize hearing accessibility. Augmented reality (AR) captioning glasses are increasingly lightweight and aesthetically appealing, making real-time captioning a practical part of everyday life. By projecting live transcriptions directly into the wearer's field of vision, these glasses empower individuals to engage confidently in conversations, meetings, lectures, or even social gatherings without missing key information or context.
Alongside these wearable devices, smartphone apps such as Google's Recorder, Otter.ai, and Ava have made significant advancements in real-time transcription. Google's Recorder, powered by Gemini Nano—a powerful on-device AI—enables instant and fairly accurate (see my notes below) audio transcription, summarization, and easy searching of recorded conversations. While not as convenient as projecting captions into your field of view, apps like Recorder will get you by in a pinch.
I tested out the AI transcription built into my Pixel 8 phone during the Sydney event. It was very fast, but misheard a few words ("YouTube" should be "Bluetooth", for example). The words in the screenshot above were spoken by Ingrid Dahl-Madsen, the Danish Ambassador to Australia. I'd always wondered where the weird name (Bluetooth) came from, and if you're now curious too, I'll go ahead and close this article out with a little more detail on the history.
Where "Bluetooth" got its name
Bluetooth got its name from a Viking king named Harald "Bluetooth" Gormsson, who ruled Denmark and Norway in the 10th century. King Harald was known for uniting various tribes into a single kingdom—similar to how Bluetooth technology unites different devices to communicate seamlessly.
In 1996, engineers at Intel, Ericsson, Nokia, and IBM were working together to create a wireless standard. Jim Kardach, an Intel engineer and history enthusiast, suggested the temporary codename "Bluetooth" after reading about King Harald’s unification of Scandinavia, thinking it fittingly symbolized their goal of uniting devices and industries.
Although initially intended as a temporary internal codename, "Bluetooth" stuck due to legal issues with alternative names. And a fun fact: The Bluetooth logo itself combines the runic symbols for King Harald’s initials (ᚼ [Hagall] and ᛒ [Bjarkan]).
Abram Bailey, AuD
Founder and President
Dr. Bailey is a leading expert on consumer technology in the audiology industry. He is a staunch advocate for patient-centered hearing care and audiological best practices, and welcomes any technological innovation that improves access to quality hearing outcomes. Dr. Bailey holds an Au.D. from Vanderbilt University Medical Center.