I was impressed. This was the strongest example I had seen of AI working in a seamless, practical way that could benefit lots of people. Children of immigrants whose parents prefer to speak their native tongue may have an easier time communicating with them. Travellers visiting foreign countries may better understand cab drivers and hotel and airline staff.
It would also help me in my day-to-day life, when a contractor or pest controller who doesn’t speak English is trying to explain what he found under my house.
And frankly, I was also surprised. Apple’s foray into generative AI, the technology driving chatbots such as OpenAI’s ChatGPT and Google’s Gemini, has been rocky, to say the least. The company never released some of the AI features it promised for last year’s iPhone 16 because the technology didn’t work well. And its AI tools for photo editing and summarising articles have been disappointing compared with similar tools from Google.
The robust translation technology in the AirPods is a sign that Apple is still in the AI race, despite its early stumbles. Digital language translation is not new, but Apple’s execution of the technology with the AirPods should make a profound difference to how often people use it.
For more than a decade, consumers have fumbled with awkward translation apps on their phones, such as Google Translate and Microsoft Translator. These apps required users to hold the phone’s microphone up to the person speaking the other language, then wait for the translation to appear on a screen or play through the phone’s tiny speakers. The translations were often inaccurate.
In contrast, AirPods users need only make a gesture to activate the digital interpreter. About a second after someone speaks, the translation plays in the wearer’s preferred language through the earbuds.
Here’s what you need to know about how to use the translator, how the technology works and why it is likely to be better than past translation apps.
Getting started
Setting up the AirPods Pro was simple. I opened the case next to my iPhone and tapped a button to pair the earphones. To use the translation software, I had to update to the latest operating system, iOS 26, and activate Apple Intelligence, Apple’s AI software.
Then I had to open Apple’s new Translate app and download the languages I wanted to translate. Spanish, French, German, Portuguese and English are available right now, and more are coming soon. I selected the language the other person was speaking (in this case, Spanish) and the language I wanted to hear it in.
There are a few shortcuts for activating the interpreter, but the simplest is to hold down both stems of the AirPods for a few seconds, until a tone plays. From there, both people can start speaking, and a transcription shows up in the Translate app while a voice reads the translated words aloud.
Owners of the AirPods Pro 2 from 2022 and last year’s AirPods 4 with noise cancellation can also get the translation technology through a software update. A recent iPhone, such as the iPhone 15 Pro or a device from the 16 series, is also required to use Apple Intelligence to do the translations.
For a fluid conversation to be translated in both directions, it is best if both people are wearing AirPods. Given how popular Apple’s earbuds already are, with hundreds of millions sold worldwide, that is not a far-fetched scenario.
Yet there are times when this tech will be useful even with only one person wearing AirPods. Plenty of immigrants I interact with, including my nanny and mother-in-law, are comfortable speaking only in their native tongue, but can understand my responses in English, so my being able to understand them, too, would go a long way.
Why translations are getting better
The AirPods’ reliance on large language models, the technology that uses complex statistics to guess which words belong together, should make translations more accurate than past technologies, said Dimitra Vergyri, a director of speech technology at SRI, the research lab that built the initial version of Siri before Apple acquired the assistant.
Some words carry different meanings depending on the context, and large language models can analyse the full scope of a conversation to correctly interpret what people are saying. Older translation technologies translated piecemeal, one sentence at a time, which could lead to big mistakes because they lacked that context, Vergyri said.
Yet the AI technology in the AirPods might still have blind spots that could lead to awkward social situations, she added. Words alone don’t account for other types of context, such as emotions and cultural nuances. In Morocco, for example, it may be impolite to jump into a conversation without a proper greeting, which often involves asking about a person’s family and his or her health.
“The gap still exists for real communication,” Vergyri said. She added, however, that translation technology would become increasingly important as corporate workers became more global and needed to communicate across cultures.
This article originally appeared in The New York Times.
Written by: Brian X. Chen
Photographs by: Sisi Yu
©2025 THE NEW YORK TIMES