According to insiders, the feature in question is a real-time translator for foreign-language speech. The function can reportedly be activated by double-pressing the headphones.
This information comes from Bloomberg's Mark Gurman and confirms previously published reports about Apple's plans to bring a live translation feature to AirPods. Live Translation is already known to work in the Phone, Messages, and FaceTime apps, but the headphones would allow the feature to be used face-to-face between two users.
To activate Live Translation, both participants in a conversation will need to be wearing AirPods. Each user will then hear a translation of the other user's speech in their own language.
According to insiders, Live Translation will be available on AirPods Pro 2 and AirPods 4. Since Apple's existing translation features are built on its AI systems, the function is expected to require Apple Intelligence in order to work. Live Translation is expected to debut alongside the release of iOS 26 in mid-September, or in one of the subsequent updates.
Source: Ferra