
Aad Goudappel

AirCaps glasses allow a user to simultaneously watch a person speaking and a scrolling transcription of their words. Inventor Madhav Lavakare ’25 declined to give numbers but says the glasses—which cost $649 in early 2026—are selling briskly.

Cathy Shufro
“They have a Prada vibe going,” says James Dover ’29 (right), a first-year from North Carolina who is hard of hearing. Running into friends Yavin Fickel ’29 (left) and Henry Greenwold ’29 (center) while trying out the AirCaps glasses on campus, Dover read the transcribed conversation as he chatted with them.
In February 2025, a Yale Daily News story caught my eye: A senior named Madhav Lavakare ’25 had invented live-captioning eyeglasses for people who are hard of hearing or deaf. The current iteration of the glasses connects to a microphone and instantly converts spoken words into text, projecting glowing green captions that float in front of the user’s eyes. The glasses allow a user to simultaneously watch the scrolling text and the person speaking—and they can translate conversations into over 60 languages.
I checked the AirCaps website: The black-frame glasses look normal but have a projection screen embedded in the lens. Although the screen sits close to the eye, the captions appear farther out.
I was intrigued. I imagined the glasses connecting people with limited hearing to the chattering world, allowing them to join in their friends’ repartee in a noisy coffee shop, follow a lecture from any seat in the auditorium, immerse themselves in a movie.
Could captioning glasses prove transformative? This past fall, I decided to find out. Do they work? (Yes, with a few glitches.) Would people with hearing deficits or deafness want to use them? (Yes! And no.) What might be the downsides? By enlisting glasses testers on campus, talking to a Yale instructor who is Deaf, and interviewing a Yale linguistics professor and a hearing loss researcher, I began to explore the possibilities.
But first, I called Lavakare at his San Francisco office. I asked how he’d gotten the idea for transcribing glasses. Turns out, he’d been working on the glasses since he was 16. This was after a hard-of-hearing classmate dropped out of their high school in Delhi, India. The classmate told Lavakare that with hearing aids alone, he often couldn’t follow everything his teachers said. He tried voice-to-text transcription, but that posed problems: To fully understand, he needed to watch the teacher’s gestures and facial expressions—impossible while monitoring captions on a screen.
Lavakare decided that he would solve that problem. He would find a way to project words onto eyeglasses, so his friend could read a transcription while keeping an eye on the teacher. Within months, Lavakare had assembled his first prototype.
That was in 2017. He kept tweaking the model throughout high school.
When Lavakare was admitted to Yale in 2019, he showed his parents a 15-minute PowerPoint proposing a gap year. One year became two, both spent working on the glasses. Lavakare was chosen for an incubator program—a hub for startups—at the Indian Institute of Technology, Delhi. He kept building glasses, and he began networking at technology conferences.
After arriving in New Haven to major in computer science, Lavakare used breaks and summers to work on the project. He landed several grants, including a total of $70,000 won in pitch competitions at Cornell, Tulane, and Yale.
Eight years elapsed from when Lavakare built the prototype to the day he shipped the first pair of glasses to a customer in late February 2025. I asked him whether, at age 16, he’d had any idea of the effort this project would require.
“Absolutely not,” he replied. “It was a thousand times harder than I could ever have imagined.”
I was eager to see the glasses firsthand, and I’d found two volunteers to review them. Surprisingly, when I asked to borrow a pair in September, Lavakare seemed reluctant.
“We have limited units that aren’t committed to customers,” he wrote to me.
Those customers had discovered the company long before Lavakare had done any advertising—unless you count a video of an early version of the visual captioning technology on TikTok in July 2023. Reposted on other platforms, Lavakare says the video attracted 75 million views. Commenters asked where to buy the product. Articles in Smithsonian and Fast Company followed. As the cofounders showed off the fledgling tech in interviews, “there was a lot of inbound interest,” Lavakare said. One man commented online: “I’d love to be able to talk to my dad again.”
Lavakare said that more than ten thousand people joined a waiting list. But “at that point,” he said, “we didn’t really have anything to sell, and I was at school and wasn’t actively working on this.”
Would-be buyers had to wait. In summer 2024, Lavakare chose the Vuzix Corporation in Rochester, New York, to customize its smart glasses for his company, initially called TranscribeGlass and later rebranded as AirCaps.
As Lavakare graduated from Yale in May 2025, news coverage accelerated. The New Yorker interviewed him for an April article with the headline: “Subtitling Your Life: Advances in transcription are good news for the hard of hearing.” (The story also mentions XanderGlasses, one of several competitors.) A few months later, Wired featured Lavakare’s glasses, reporting that the system “works almost eerily well.”
The glasses—which cost $649 as of January—are selling more briskly than expected, Lavakare told me recently, though he declined to give numbers. People recommend them to their friends: “We’re seeing a lot of organic referrals.”
In November, UPS messaged me that I had a package from San Francisco: the glasses for testing.
I immediately wrote to Yale student James Dover ’29, a gregarious first-year from North Carolina who was eager to try them. Dover has been deaf in one ear since birth because of an inner-ear abnormality. Like more than 90 percent of children with deafness, he was born to hearing parents; both his brothers, including his twin, have normal hearing. His cochlear implant generally works well, but for Dover, listening isn’t second nature: It’s exhausting labor.
We met the next day. Dover found the boxy black glasses reasonably stylish, saying, “They have a Prada vibe going.” We could see the rectangular projection screen sandwiched inside the right lens, barely discernible. Dover used Bluetooth to connect the glasses to my phone, which had the app installed, and we headed for the noisiest place we could think of.
Elm café in the Schwarzman Center was buzzing, with students in every seat, some hunched over laptops, many chatting. It’s the kind of setting where Dover’s cochlear implant fails him; voices and other noises mash together to sound like surf.
Dover quickly spotted three friends and joined their table. He could follow their conversation and distinguish who was speaking, because the app gave each speaker a number. Within minutes, he began testing the translation function. His friends seemed delighted to speak to him in Urdu and Spanish and see the words appear in English. For Dover, the translation function seemed the most engaging.
“That was something special,” he said as we left the café. He wished the glasses came with a microphone he could clip to his shirt; it was tedious to hold the phone to make its microphone available.
Would he consider buying the glasses? Dover said yes. “They’d be good for situations where I generally have trouble”—in a restaurant or dining hall, or at some future professional meeting, perhaps one that’s multilingual.
Even so, Dover will continue to study American Sign Language at Yale so he can maintain his bonds with Deaf people who don’t use English. Being hard of hearing, he said, “is part of who I am.”
For my other tester, ecology and evolutionary biology professor Richard Prum, the immediate appeal of the smart glasses was the chance to fully enjoy a movie.
Because Prum has lost much of his hearing, he often misses crucial movie dialogue. (“And the secret of life is . . .” ???) For unclear reasons, his right ear abruptly failed him nearly four decades ago, when he was a PhD student—a life-altering deprivation for a birdsong specialist. It got worse: Ménière’s disease went on to destroy the hearing in his other ear by 2004. (The cause of Ménière’s is unknown.) Prum’s hearing aids allow him to hear well under ideal conditions: a quiet room, and no mumblers or children with high voices.
Prum was disappointed when he took the glasses to a screening of Paul Thomas Anderson’s One Battle After Another, a comedy-action film with both car chases and conversation. He wore the AirCaps glasses over his prescription glasses; if he owned them, an optician could install his prescription lenses into the frame. Wi-Fi was blocked in the theater, but Prum accessed the app using his phone carrier. Still, he recalled, “I could see the transcription was messing up. So even with hearing deficits, at times I did better than the phone.”
In contrast, the glasses worked astonishingly well for Prum during a walk on campus with a mutual friend. As we passed a construction site near the Peabody Museum, roaring machines nearly drowned us out, but the AirCaps glasses almost instantly kept Prum in the loop. They also provided reliable captions as we walked around inside Kline Tower. “Under the best conditions,” Prum said afterward, “the glasses proved much better than I’d imagined possible.”
Like Dover, Prum was enthralled by the potential of AirCaps glasses to transcribe or translate foreign languages. He speaks four languages besides English, with varying degrees of fluency. Partly because of hearing loss, he interprets sounds in those languages with less agility than in English. Sometimes speech moves too fast for him to decode words he actually knows. The glasses could help him keep pace.
For example, if someone were speaking to him in Portuguese, he’d have a choice: He could use the glasses to supply captions in Portuguese, or he could listen to the spoken Portuguese while reading captions in English, to fill in gaps. Either way, he could follow what was said and pick up new vocabulary. “I think anyone learning a foreign language can benefit from this,” Prum said.
I’d returned the glasses by the time I got to interview Andrew Fisher, one of six American Sign Language (ASL) instructors in the Yale linguistics department. Fisher is “third-generation Deaf” and is fluent in ASL and written English. He seemed only mildly disappointed that he couldn’t try the glasses, because he doesn’t think they’d be very useful for him. That’s largely because Fisher doesn’t generally speak. “I think the glasses would really benefit people who become deaf later in life, because they have speaking skills,” he says.
Not Fisher. “Great, we can receive information [using the glasses]. But communication goes two ways,” he said. Fisher relies on interpreters: the interpreter who accompanied him during our interview made it possible, and Fisher uses a live interpreter when performing stand-up comedy. (See @anandrewfisher on Instagram.)
I asked him how he communicates with hearing people who don’t know ASL—say, when he can’t find something in a supermarket. He types his question into his phone, turns on the microphone, and shows the screen to a clerk; the clerk’s reply appears as text. For more complicated interactions, Fisher uses relay services, such as the free phone app Aira ASL, which connects him via video with a professional ASL interpreter who can translate for him. The app is just a year old and has limited coverage; although a handful of colleges and universities partner with Aira ASL, Yale does not.
Fisher said he might find transcribing glasses handy in an airport or at a movie. But he notes that overestimating the usefulness of an attractive new technology can do harm. “I’m concerned people are going to think, ‘oh we have these glasses,’” and curtail services for Deaf people.
Yale linguistics professor Claire Bowern shared Fisher’s skepticism. Bowern pointed out potential drawbacks of relying on AI-generated transcripts. “Transcription services are only as good as the models that underlie them, and all the issues with accuracy that we see elsewhere are going to apply here, too,” Bowern wrote in an email. “Speech-to-text works way better for some language varieties and subjects than others, and that’s a problem, particularly when people assume that they work ‘well enough.’”
She pointed out that we seem to accept errors in “everyday use cases,” as when AI translates a recipe or transcribes TV news. But in other settings, such as during a meeting between doctor and patient or during a legal trial, seemingly minor mistakes can do great harm.
In addition, captions lack context that colors meaning. Context includes tone: “You ate my birthday cake?” differs from “You ate my birthday cake.” Bowern gave her own example of how a written account of speech can be murky. “I went to see Wicked: For Good with my daughter,” she told me. “It was 137 minutes long.” A transcription would not capture her tone and pacing—“It was 137 minutes. Long.”—which suggested that she found the movie tiresome. (In fact, she enjoyed it.) Context is so crucial that both linguists and philosophers of language study its contribution to meaning in a field called pragmatics.
Bowern asked me about AirCaps’s privacy policy: “Are the glasses doing everything in a closed system or sending information to the cloud?” Would users be able to control what happens to records of conversations in the cloud? (Lavakare said AirCaps transcriptions use the cloud except when offline, but that conversations are not identified with specific users and are not accessible to AirCaps.)
I wondered if transcribing glasses could protect against dementia. Studies I read estimate that midlife hearing loss accounts for 9 percent of dementia cases. People with hearing loss often withdraw from social life, and isolation contributes to dementia risk. I thought of my mother in her last years, when even with hearing aids, she could barely hear. It’s hard to say if her friends in assisted living began to avoid her, or she them, but soon she was eating alone in the dining room.
I described AirCaps glasses to Yale Professor of Surgery Hong-Bo Zhao, who studies the relationship between hearing loss and Alzheimer’s disease. Without long-term research, he could only conjecture about the effects of captioning glasses, but he told me that when people are able to maintain their social lives, they’re less likely to develop dementia. So the glasses might help.
I read that when hearing fades in older adults, the effort required to hear can sorely tax the brain. This effort undermines cognitive control, the ability to orient our thoughts and actions towards a goal. It would be useful to know if captioning glasses lighten cognitive load.
Research has already shown that hearing aids protect against dementia. But there’s a problem with humans and hearing aids: More than half of people 75 and older have disabling hearing loss, but only around one in five Americans who need hearing aids uses them, according to the National Institutes of Health. A long-term study showed that even people who eventually adopt hearing aids wait an average of nine years after diagnosis.
Lavakare recalls his frustration when his late maternal grandmother refused them. “She never wanted to admit that she had hearing loss,” Lavakare recalls. “She would tell the audiologist, ‘I’m totally fine, I’m perfectly fine.’ The audiologist would say, ‘Yes, you’re all okay,’ and then we’d whisper, ‘No, she’s got 90 percent hearing loss!’ Oh my god!”
AirCaps glasses could have helped my mother, and, likely, Lavakare’s grandmother. And then there are people with normal hearing who have trouble interpreting sounds, as in auditory processing disorder. Captioning glasses could make life easier for them, too.
So yes, my research suggests that AirCaps glasses can keep users in the conversation, even in a noisy place, and that they can allow anyone who leaves their phone on a lectern to follow what’s said. They’d be handy for travel abroad.
The system has weak points and drawbacks. But it’s early days. Customers in more than 30 countries are already wearing AirCaps glasses, discovering what transcribing glasses make possible.