Have you ever struggled to communicate in social situations—bars, restaurants, work meetings—due to hearing loss? Wouldn't it be wonderful if real life came with captions, so you could see a transcript of conversations in real-time right before your eyes? Well, this is no longer a futuristic notion, but rather something that is being made possible with augmented reality (AR) glasses.

In today's world, automated captions are everywhere. We see them on YouTube, social media platforms, and video communication services like Google Meet, Zoom, and FaceTime. Closed captioning can make a huge difference for people with hearing loss who want to follow a TV show or understand a video call.

Easier Everyday Communication Through the Power of AR

AR enhances a real-life environment with digital visual elements, sounds, or other sensory stimuli. Live-captioning glasses use AR to project text onto the lenses, inserting a discreet captioning service into your real life, in real time.

AR captioning glasses can assist you in your everyday communications, meaning you no longer need to rely on body language and speechreading to fill in any missing keywords in conversation and can enjoy communication with relative ease. You simply pop on the captioning glasses and can “see” the conversation happening in your field of vision—just like watching television with captions.

Although captioning glasses have been around for a few years now, initial prototypes were aesthetically “chunky” and were not comfortable to wear for long periods. Increasingly, they are becoming more lightweight, and some even boast new innovative features beyond captioning. This means they can be worn for a whole host of different situations, such as conversations, the cinema, theater, lectures, meetings, or when you’re at home watching TV.

Here’s a quick summary table of some of the best and most promising consumer products in the growing field of captioning glasses, followed by more detailed information about each one.

We'll also continue to update this page as new companies and models in the market appear—so bookmark this page!

Feature | XRAI Glass | XanderGlasses | LEION Hey AR | TranscribeGlass | Hearsight | Captify
Weight in oz (grams) | <1.4 oz (40 g)* | 4.6 oz (130 g) | 1.4 oz (40 g)**** | 1.3 oz (36 g) | 1.3 oz (36 g) | 1.5 oz (43 g)
Battery life transcribing (up to...) | 8 hrs | 2-3 hrs | 2 hrs (Hey 2: 8 hrs) | 6-8 hrs | 12 hrs (Engo 2) | 4 hrs
Live speech-to-text captioning | ✓ | ✓ | ✓ | ✓ | ✓ | ✓
Translation | ✓ | ✓ | ✓ | ✓ |  | ✓
Removable shades | ✓** |  |  |  | ✓†† | 
Works w/ prescription lenses | ✓ | ✓ | ✓ | ✓ |  | ✓
AI Assistant / ChatGPT | ✓ |  | ✓ | ✓***** |  | 
Conversation recorder | ✓ |  | ✓ | In future | ✓ | ✓
Wireless phone connection | ✓** | Not required | ✓ | ✓ | ✓ | ✓
Android & iOS compatible | ✓ | ✓*** | ✓ | iOS only****** | ✓ | ✓
Available for ordering | ✓ | ✓ | ✓ | ✓ | ✓ | ✓
Price | $880 + app sub | $5,000 | $800† | $377 + $20/mth | $300 + plan | $499

A quick reference table of the features found in the AR captioning glasses detailed below. *Estimated weight for new AR 2 model; **In AR One and AR 2 models; ***Free app only needed to adjust caption settings (glasses work independently); ****Estimated weight for Leion Hey 2; *****Beta version only; ******Android version expected soon; †Estimated retail price for first-generation model; ††Photochromic option.

A Note on the Hardware Behind Captioning Glasses

When it comes to captioning smart glasses, many companies don’t build the hardware themselves. Instead, they adapt AR glasses from established manufacturers—originally designed for entertainment or productivity—and layer on their own software for real-time speech-to-text. Brands like Vuzix, XREAL, LLVISION, MICROOLED, and MYVU, mentioned in this article, offer lightweight, high-performance platforms that serve as the foundation for these accessibility tools. This approach lets developers focus on their software without needing to reinvent the hardware—a pattern you’ll see throughout this article.

Next, we’ll take a closer look at some of the companies pioneering captioning glasses technology and how they’re shaping the future of accessible communication.

HearingTracker audiologist Matthew Allsop provides a review of XRAI Glass and its app. Closed captions are available on this video. If you are using a mobile phone, please enable captions by clicking on the gear icon.

XRAI Glass

$880 (presale price of $750) + app subscription

XRAI is a US-based company founded by Dan Scarfe, who has recently been joined in running the business by his wife Lara. The XRAI name (pronounced x-ray) combines XR (mixed reality) and AI (artificial intelligence). The idea for the XRAI Glass AR smart glasses began when Scarfe noticed his 96-year-old grandfather struggling to communicate across the dinner table one Christmas Day due to hearing loss. That's when Scarfe started his mission to “Break down communication barriers and promote communication between everyone.”

Smart glasses compatible with XRAI Glass include (clockwise from top left) XRAI AR2, RayNeo X3 Pro (coming soon), QONOQ Mirza, and XReal One.

The XRAI Glass app, which is available for Android and iOS, works both with compatible smart glasses and without. This means you can benefit from live captioning of speech either through captioning glasses that use the XRAI software or simply by using the XRAI app on your phone or tablet, which you can download for free. Subscribing to a paid plan unlocks additional features like speaker ID, extended conversation history, real-time translation, and AI-powered assistance.

The glasses are $880, but are currently available on presale for $750, with a $250 deposit, says Scarfe. The app is currently listed at $15 or $30 per month, depending on whether you opt for the Premium or Ultimate package, which offer 600 minutes (10 hours) or 1,800+ minutes (30 hours) of advanced features like “cloud-enhanced” transcription and translation. There is also a free app option that offers offline subtitling with reduced accuracy and support for 20 languages.

The first version of XRAI software launched in November 2022, supporting the XRAI Glass captioning glasses. The latest update now supports several styles, including the wireless XRAI AR One. These glasses use the LLVISION G35 device, the same hardware found in Leion Hey glasses (detailed below).

Directional microphones on the glasses pick up the speaker’s voice with minimal delay, and live transcripts are projected inside the glasses, visible only to the wearer. The app’s AI distinguishes and labels multiple speakers, even when they talk at the same time.

As this technology continues to advance rapidly, the XRAI AR One glasses are already set to be surpassed by the next-generation XRAI AR 2, based on LLVISION’s new G36 device.

The AR 2 will be available for pre-sale in June 2025 and is expected to ship in August. “Our design goal with this was to make them look like a normal set of glasses and make sure that they're suitable for all-day use,” Scarfe told HearingTracker.

Life. Subtitled. An XRAI Glass-produced video explaining the product.

The AR 2 will offer up to 8 hours of use and takes a hybrid approach: self-contained and wireless, but offloading processing to a paired smartphone via Bluetooth. This strikes a balance between the weight of fully standalone glasses and the limits of tethered designs, keeping them under 40 g, comfortable, and high-performing.

XRAI Glass works with Lensology for prescription lens compatibility; you send your style and prescription to Lensology, which mails you a custom insert. But unlike earlier models where you clip the lens insert in, the new AR 2s will have the prescription lenses built straight into the frames for a much more seamless fit.

While fully wireless glasses are gaining traction, tethered options like XREALs still play an important role in the market. The XREAL Air 2 and Air 2 Pro (72g and 75g) connect via a wired cable for reliable, low-latency performance. Scarfe told HearingTracker, “If people want multipurpose glasses that they can use for lots of things, including subtitles, XREAL is by far the best choice.” Though not solely for captioning, XREAL glasses work well with the XRAI app and are a popular, more affordable option.

XRAI software also offers:

  • Real-time translation for 223 languages.
  • A conversation recorder that saves transcripts for later. (Users control the data, and XRAI does not access it, operating under a strict privacy policy.)
  • Support from AI engines like Microsoft Azure, Amazon Web Services (AWS), Deepgram, VOSK, and OpenAI’s ChatGPT for accurate transcription and smart responses.

XRAI software goes far beyond traditional captioning glasses with the launch of its XRAI Stream platform. It delivers shared audio experiences such as live captions and translations over Bluetooth, without needing special hardware. Similar in spirit to Bluetooth LE Audio’s Auracast, XRAI Stream is designed for broad accessibility, making it ideal for deployment in theaters and public venues. The system captures audio directly at the source (e.g., a mixing desk) and streams real-time subtitles to smart glasses and other devices in up to 223 languages.

To showcase this, XRAI is partnering with the Dutch National Theatre and audio tech company Dante to pioneer live subtitling of theatre performances, enhancing accessibility for deaf, hard-of-hearing, and non-Dutch-speaking audiences. Broader rollouts of XRAI Stream are expected later this year or early next year.

XRAI has also secured funding from the UK’s National Centre for Accessible Transport to bring its tech to buses and trains, allowing passengers to read driver announcements in their own language.

XRAI Glass has received numerous awards, including the Most Innovative Hearing Aid Solutions App at the SME IT Awards (2024), Best Captioning Tech at the Hearing Tech Innovator Awards (2023), and the Technology For Good Award at the Global Business Tech Awards (2023).

XanderGlasses

Price per pair is $5,000 and includes everything needed to use them out of the box, with customization based on individual needs.

Xander is a team of audio experts, engineers, and researchers, led by co-founders Alex Westner and Marilyn Morgan Westner. After earning an MS from the MIT Media Lab in acoustics and audio processing, Alex Westner spent 20 years in the audio industry, working with industry leaders Gibson, Mitsubishi Research Labs, and iZotope. Diagnosed with a vision condition, he became keenly interested in sensory substitution and “the most profound human problem related to audio”: helping people who have trouble hearing.

HearingTracker Audiologist Matthew Allsop stops by to chat and get a demonstration from Xander Cofounder Alex Westner during the 2025 Consumer Electronics Show (CES 2025).

They partnered with Vuzix, basing their product on a semi-customized version of the Vuzix Shield: XanderGlasses, designed as an assistive device for people with hearing loss and communication difficulties.

XanderGlasses are standalone smart glasses, operating independently without the need for a connected smartphone or external device. They have an ergonomic, cushioned design and work right out of the box. Just press a button and wear them. Multiple noise-canceling microphones capture nearby speech and show real-time captions on both lenses.

XanderGlasses offer built-in speech-to-text that works offline, providing reliable captions without the need for Wi-Fi or cloud services, though Wi-Fi can be optionally used to enhance accuracy. They never store conversations and adhere to strict SOC 2 privacy standards.

XanderGlasses powered by Vuzix display real-time captions of what other people are saying.

Testing shows that XanderGlasses achieve 85% to 95% accuracy offline, depending on ambient noise levels. When connected to Wi-Fi, accuracy can improve to as high as 97%. The glasses support 26 built-in languages, and with a Wi-Fi connection, you can access up to 140 languages, with real-time translation between them. HearingTracker editor Karl Strom and video content producer/audiologist Matthew Allsop were both impressed by XanderGlasses at the 2025 Consumer Electronics Show (CES 2025), where they noted the live-captioning worked well even in a very loud and reverberant convention hall.

Weighing 130 grams, the glasses are heavier than other options on the market due to their built-in technology. The company is working to make future versions lighter.

XanderGlasses come with a free app that works on both iPhone and Android. You can use the app to adjust the position, size, and brightness of the captions. Soon, users will be able to change fonts, including one optimized for dyslexia. The app is optional, as the glasses work independently.

The bad news is that the glasses have a limited battery life. In offline mode, they offer about 2.5 to 3 hours of continuous captioning. However, when connected to the cloud, early estimates suggest the battery life could potentially double, though this is still being tested. Additionally, putting the glasses into sleep mode when not in use can extend battery life up to 4 hours. The good news is that the glasses take only about 2 hours to fully charge via a USB-C port.

See how the translation feature works and looks from the user's perspective, as demonstrated by Xander's Alex Westner and Marilyn Morgan Westner.

The glasses come with a range of accessories, including a sunglasses clip-on, a wireless microphone and receiver, charging cables and a power adapter, and prescription lenses, if needed. Xander also offers a support call to anyone who buys the glasses and has questions at any time.

Though the price of XanderGlasses is comparable to an average pair of prescription hearing aids, the company is committed to reducing the cost of their product to make it more accessible. They are currently partnering with schools, employers, state assistive technology centers, and vocational rehab programs, and donating glasses for fundraisers, including the Hearing Loss Association of America’s (HLAA) Walk for Hearing.

After launching in Boston in August 2023, XanderGlasses went on sale in November 2023. They are currently available in the US, and the company plans to fulfill orders in Canada and the UK by late 2025.

The technology is just one part of the company’s broader mission and values. Over the past two years, the team has worked closely with individual Department of Veterans Affairs (VA) Medical Centers and audiologists to test and refine the glasses. Their mission has been to support veterans who are hard of hearing or have difficulty understanding speech, helping them regain confidence in communication and a sense of connection with family, friends, and their communities. As a result, XanderGlasses have been specified through the VA since September 2024 as an approved intervention for a range of communication challenges.

“We're very proud to work with our veterans who are disproportionately affected by hearing loss, and this past October, we were nationally vetted and approved by the VA, so that any VA medical center or clinic in the country can buy glasses for veterans who qualify. So, veterans can get Xander glasses for free in the US, as long as they work with their audiology team,” explains Alex Westner.

XanderGlasses have earned notable recognition, winning the CES 2025 Innovation Award Honoree for Accessibility & Aging, the 2025 TWICE Picks Award for innovation in practical consumer technology, the 2024 Hearing Technology Innovation Award for Assistive Technology & Software, the AccessABILITY Award 2024 from Reviewed (USA Today’s review publication), and the CES 2024 Innovation Award Honoree for Accessibility & Aging.

LEION Hey AR glasses.

LEION Hey AR Glasses

First-generation model available by request through the manufacturer, LLVISION, or via authorized retailers. Pricing typically averages around $800, although it may vary depending on the model and region.

Leion Hey AR Glasses by Beijing LLVISION Technology Co., Ltd. are designed to support and empower individuals with hearing loss by providing real-time transcription, translation, and integrated AI chat functionality. These wireless AR glasses combine artificial intelligence with augmented reality to deliver speech-to-text captions on both lenses.

They offer up to 2 hours of battery life under typical conditions (50% screen brightness, good signal, and moderate usage), support charging via USB-C while in use, and can be powered by an external battery pack.

The glasses use dual microphones with noise cancellation to enhance speech recognition accuracy. Under quiet conditions, the system achieves over 92% accuracy within a one-meter range. In noisier places, like an 80 dB restaurant, accuracy remains above 82%.

Leion Hey glasses support three main modes:

  • Conversation Mode: Live captioning of speech with stored transcripts for later review.
  • Translate Mode: Real-time translation in up to 91 languages.
  • ChatGPT Integration: Built-in AI assistant providing answers directly on the display.

Compatible with Android 10+ and iOS 11+, the glasses connect to your phone through the companion app via Bluetooth and Wi-Fi. Initial setup involves user registration and pairing the device, which only needs to be done once. The app lets you control all functions, switch modes, and customize the display.

The glasses can operate in offline mode, using onboard voice recognition when a network is unavailable. However, online use is recommended for best accuracy.

Leion Hey AR Glasses come with several customizable options, including magnetic sunglasses attachments in four styles, prescription lens support with an internal frame for local optician fitting, multiple temple gasket sizes (S/M/L), and a USB-C charging cable.

Like the XRAI AR 2, the upcoming Leion Hey 2 glasses, set to launch in 2025, are built on LLVISION’s G36 platform. They’re expected to weigh around 40 grams and offer an 8-hour battery life, combining lightweight comfort with all-day functionality.

Promotional video featuring LEION Hey 2 AI glasses produced by the manufacturer.

“We’ve just come back from the Global Disability Summit (GDS) in Berlin last month [April 2025], and it achieved lots of positive feedback,” Shenqiang (Roy) Lou, Chief Operating Officer at LLVISION, told HearingTracker about the new generation of Leion Hey.

In April 2022, Leion Hey was honored with the Top 10 Global Science and Technology Innovation Award at the UNESCO Netexplo Innovation Forum.

The latest version of TranscribeGlass is now integrated into a pair of stylish glasses.

TranscribeGlass

Price per pair is $377 + $20/mth.

Recent Yale computer science graduate Madhav Lavakare from New Delhi, India, got the idea for captioning glasses after a friend dropped out of high school because he couldn’t afford accessibility accommodations. When Lavakare asked his friend why he didn’t use speech recognition on his phone, he replied, “If I’m reading my phone, I can’t read lips, facial expressions, or hand gestures.” That conversation sparked a mission.

Lavakare started TranscribeGlass in his garage in 2017 with a clip-on captioning device. But after years of user feedback and refinement, he realized the limitations: clip-ons were hard to secure on different frames, were not user-friendly, and lacked accuracy and reliability.

The first version of TranscribeGlass, a low-cost assistive technology for people with hearing loss. It used the speech-to-text software of your choice (via your smartphone) and projected the transcript onto a small, snap-on display positioned in front of your glasses (or empty frames if you don't wear glasses).

By 2024, after testing various smart glasses and interviewing users, he identified five key needs for the product: comfort, accuracy, speed, aesthetics, and above all, reliability.

In 2025, after a year of development, TranscribeGlass 2.0 was born. Designed to look like regular glasses, the new, fully integrated version delivers highly accurate captions even in noisy or crowded places like restaurants. “Now the device is really sleek,” Lavakare told HearingTracker. “It's almost indistinguishable from a pair of normal reading glasses.”

TranscribeGlass Founder Madhav Lavakare explains the genesis of the product and how the latest design works.

Lavakare sourced Vuzix’s Ultralite smart glasses and paired them with the TranscribeGlass app, currently available on iPhone, with an Android version expected soon. The glasses connect to the app via Bluetooth, letting users customize the settings.

TranscribeGlass uses your phone’s built-in mic (or an external one if connected) to pick up conversations. Real-time transcriptions appear in green text on the right lens. The software can identify when different people are speaking by assigning each a number. Building on this, you can register a friend’s voice by recording a short sample, so their name appears in the captions. You can even register yourself and choose to hide your own speech from the captions—useful if you don’t want to see your own words while wearing the glasses. While still experimental, these features make captions clearer and more personalized.

Their transcription process is GDPR, SOC 2 Type 2, and HIPAA compliant, which means they handle your data carefully and securely, following important privacy and security laws.

A monthly subscription covers the cost of the transcription service that the glasses rely on. TranscribeGlass works both online and offline, though accuracy decreases without an internet connection. When connected to the cloud, the glasses achieve about 93% speech-recognition accuracy, with higher accuracy in quiet environments.

The device charges via a magnetic port and reaches a full charge in about one hour, providing around 6 to 8 hours of continuous captioning on a single charge, depending on the brightness level. With typical intermittent use, many users can go an entire day without needing to recharge.

The company is exploring AI assistant features, including ChatGPT-style functionality, while carefully considering what benefits their users without causing distractions. Although not yet part of the standard product, a beta version with an AI assistant is available. Users can activate it by double-tapping the glasses, and it responds to questions based on the context of the ongoing conversation. This feature isn’t included by default, but if customers request an AI assistant, TranscribeGlass offers them access to try it out and share their feedback.

When ordering from the TranscribeGlass website, you can add clip-on sunglasses for $20 or prescription lenses starting at $158 for single-vision. By entering your prescription at checkout, you confirm compatibility and get an exact price.

TranscribeGlass supports multiple languages, including English, Spanish, French, Chinese, Italian, German, Portuguese, Korean, Japanese, and Vietnamese. It also offers live translation between most of these languages.

TranscribeGlass secured third place at the 2025 Startup Yale (Rothberg Catalyzer prize), Ben Daniels Venture Challenge, and 2025 Tulane Business Model Competition, as well as the 2021 Tech4Good Accessibility Award (Highly Commended) and ATF Labs Best Assistive Technology Startup for Innovation.

Hearsight 

Price per pair is $299.95 + app membership plan

Based in South Bend, Indiana, HearSight is a tech startup focused on accessibility. Their mission is clear: to empower people who are d/Deaf & Hard of Hearing (HOH) with innovative solutions that contribute to better speech comprehension and improved quality of life.

The idea for Hearsight was born during the COVID-19 pandemic, when co-founder Danny Fritz saw how mask-wearing made communication difficult for his girlfriend, who relied on lip reading. Inspired by movie subtitles, Fritz wondered if real-life conversations could be captioned similarly. Later, through a tech entrepreneurship program at the University of Notre Dame, he teamed up with Riley Ellingsen to bring the idea to life.

The Hearsight app has been carefully tested and improved through user feedback and is now available on both iOS and Android. In June 2024, they publicly launched their captioning glasses solution in partnership with MICROOLED, a leader in OLED micro-display technology. The ENGO 2 smart glasses by ActiveLook pair with the Hearsight app to provide real-time subtitles for everyday conversations, just as co-founder Fritz first imagined.

The ENGO 2 glasses feature built-in microphones to capture speech, offering a lightweight, low-power design that ensures clear, high-contrast captions even in varying lighting conditions. While the Hearsight app works independently, pairing it with the ENGO glasses provides a hands-free captioning experience.

There is no need to activate the glasses using third-party apps. Simply turn Bluetooth “ON” and click the device name to pair. After each session, conversations are temporarily logged, giving users the option to save or discard transcripts.

ENGO 2 glasses provide up to 12 hours of battery life, allowing users to rely on them for extended use without frequent recharging. They are charged with a Micro USB cable.

Prescription lenses are not currently supported, but a photochromic version of the glasses, which automatically darkens in response to sunlight, is available.

Hearsight won the 2022 RISE Award, recognizing startups with innovative solutions that positively impact society. They also gained attention at CES 2025.

Captify glasses.

Captify 

Price per pair is $499 USD, with a free companion app

The new kid on the block, Captify, was co-founded by Tom Pritsky, who lives with hearing loss and previously worked on the first-generation TranscribeGlass, and Jason Gui, a seasoned expert in smart wearables.

The glasses aim to bridge communication gaps and enable more inclusive conversations. Pritsky began developing the idea for the glasses at Stanford in 2016, driven by both a personal mission and a desire to address the challenges faced by the 1.5 billion people worldwide who encounter similar barriers. Unveiled at CES 2025, Captify glasses are purpose-built for accessibility.

Captify-produced video with co-founder Tom Pritsky explaining the new captioning glasses.

Captify partnered with MYVU, a brand of lightweight AR glasses developed by DreamSmart Group. Together, they’ve customized the hardware to support Captify’s unique captioning features, powered by cloud-based speech recognition from Microsoft, Google, and Amazon.

Captify glasses use dual-beamforming microphones to focus on the speaker in front of you while minimizing background noise. The captured audio is sent via Bluetooth to your smartphone, where the free partner app, MYVU AR (which is compatible with both iOS and Android devices), handles transcription. This separation of processing from display keeps the glasses lightweight and power efficient. Once the app transcribes speech, captions are projected in green text directly onto the lenses. Captify stores transcriptions through the companion app, allowing users to review and save key points from conversations.

Users can also adjust font size (Regular, Large, Extra-Large) in the app. In a later update, the Captify app will offer an option to adjust the position of the captions so they can be on the top, bottom, or sides of your view.

For captions to work, your phone needs to stay within Bluetooth range (10 meters or 30 feet). No internet is needed for basic captioning, but translation features require cloud access, so a Wi-Fi or cellular connection is necessary for those. Local transcription via the connected smartphone is free.

The battery provides up to 4 hours of continuous captioning or 48 hours on standby and fully recharges in just 40 minutes. Plus, the glasses can continue captioning while charging.

The glasses are ergonomically designed for all-day comfort across a wide range of head sizes. They support prescription lenses, including single vision, reading, progressive, and non-corrective options, which come pre-installed. Prices start at $99 USD and vary by lens type. Simply upload your prescription at checkout. Polarized sunglasses clip-ons are also available for an additional $50.

The current version of Captify Glasses supports real-time translation in 13 languages: English, Spanish, French, Italian, German, Chinese, Japanese, Russian, Vietnamese, Malaysian, Indonesian, Turkish, and Thai. The upcoming Captify Pro, launching in July 2025, will expand support to over 40 languages.

While Captify Glasses don’t currently include built-in ChatGPT or third-party AI assistant support, they do feature smart capabilities, such as translation, media playback, and call handling. These features are powered by a Wi-Fi connection and will continue to evolve through future updates.

Users can access, update, or delete their information and choose to opt out of data sharing, depending on their location and applicable laws. Full details are in their Privacy Policy and Terms of Service.

Purchases can be made using Flexible Spending Accounts (FSA) or Health Savings Accounts (HSA), and installment payments are supported.

Modular Alternatives

While the options above are purpose-built for real-time captioning, it’s worth mentioning that some general-purpose smart glasses, like the Even Realities G1 paired with the open-source AugmentOS platform, can also provide live captions and transcription through optional apps. These solutions offer broader functionality alongside captioning (e.g., translation, navigation, productivity tools) and may require more setup.

What’s Next from the Big Players?

While the dedicated captioning glasses covered in this article are specialized devices focused on accessibility, tech giants like Google and Apple are making significant strides in real-time transcription eyewear as part of their broader wearable technology and accessibility initiatives. Though not strictly dedicated captioning glasses, these advancements could soon incorporate live captioning features within a wider array of functionality.

Google's Smart Glasses Initiatives

Samsung's Project HAEAN: In his TED2025 talk, "The Next Computer? Your Glasses," Shahram Izadi, Google's head of AR/XR, unveiled AI-powered smart glasses developed with Samsung. Powered by Android XR and Gemini AI, they demonstrated real-time language translation and contextual memory recall. With features like visual memory retrieval and AI-driven scene analysis, the glasses recall Black Mirror-type scenarios. The live demo showed Farsi-to-English translation displayed on the lens, highlighting their potential as next-generation speech-to-text glasses for live captioning.

A YouTube video produced by Google featuring glasses that use Google Translate. Closed captions are available on this video. If you are using a mobile phone, please enable captions by clicking on the gear icon.

Xreal's Project Aura: Announced at Google I/O 2025, this initiative is a partnership between Google and Xreal. Project Aura is an optical see-through (OST) device that runs on Google's Android XR operating system and is powered by Gemini AI. It features a lightweight, tethered design with transparent lenses that overlay digital content onto the real world. The glasses include cameras, microphones, and speakers, enabling functionalities like real-time translation and contextual assistance. 

Both projects are currently in the prototype stage, with no confirmed release dates. 

TechAvid YouTube video about Xreal and Google's Project Aura.

Apple's Vision Pro and Upcoming Smart Glasses

Although not strictly glasses, Apple’s Vision Pro headset, released in 2024, includes a Live Captions feature that transcribes both live conversations and audio from apps in real-time.

Looking ahead, Apple plans to launch its first smart glasses in 2026. These glasses are expected to feature cameras, microphones, speakers, and Siri integration, enabling functionalities like phone calls, music control, live translations, and navigation. These developments suggest that live captioning features could be part of Apple's smart glasses offerings.

The Apple Vision Pro headset offers transcription of live conversations and app audio in real time, but the company is reportedly coming out with its first smart glasses in 2026, which may have similarly useful accessibility features.

Captioning Apps Are Also Extremely Useful!

While this article focuses on glasses that provide real-time captioning, some of the apps mentioned above can also be used independently on your smartphone. In addition, several standalone speech-to-text captioning apps offer fast, reliable captions and can be a handy option for live captioning on the go.


HearingTracker has also published an article about the best apps for people with hearing loss.

Technology for hearing accessibility is changing fast. Also related to eyewear (but not captioning), you might also be interested in checking out the emerging category of hearing glasses, which includes the new Nuance Audio OTC hearing glasses.

Privacy and Data Handling Considerations

Since captioning glasses rely on capturing real-time audio, sometimes processed in the cloud, they raise understandable privacy concerns. These devices may pick up private conversations, and depending on the provider, transcripts could be stored locally or online.

It’s important to know where your data goes, how long it’s kept, and who can access it. Some providers may also share data with third parties—so checking the privacy policy is key.

Tip: Before using any captioning glasses, review the privacy policy, check if your data is encrypted, and see whether you can disable cloud features or opt out of data sharing.

The Future of Captioning Glasses

The world of captioning glasses is moving fast. With new hardware and software updates being rolled out regularly, these devices are becoming increasingly versatile and user-friendly. As Xander’s Alex Westner notes, “I expect captioning glasses will continue to evolve physically to address the different needs, usage goals, and individual sensitivities of users.”