Apple’s New Live Translation Feature Is Useful For Texts, Calls, and Video Chats - Here's How

Apple announced a major update at WWDC 2025, unveiling Live Translation in iOS 26 with support across Messages, FaceTime, and Phone calls. The feature, powered by Apple-built on-device models, delivers instant translation without relying on external servers, keeping conversations private.

In Messages, the update automatically translates outgoing texts into the recipient’s language as users type, and incoming replies arrive already translated. Apple’s demo showed seamless multilingual group planning, ideal for arranging travel details abroad. The translation occurs in real time and remains invisible to anyone outside the conversation.

During FaceTime calls, participants see live captions in their language while still hearing the speaker’s voice. Apple emphasizes that this happens locally on-device, avoiding external data transmission.

The new feature also works with Phone calls. A speaker’s words are read aloud in translation for the recipient, and when the other party responds, their speech is translated back in real time, making bilingual phone conversations effortless.

Apple has also extended developer access with a new API, enabling third-party communication apps to support Live Translation. Leslie Ikemoto, Apple’s director of input experience, called the tool “conversation on the fly.”
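Apple hasn’t published details of the new API, so the snippet below is only a sketch of how a third-party messaging app might translate an incoming message on-device today. It leans on Apple’s existing Translation framework (the translationTask modifier and TranslationSession introduced with iOS 18); the view name and the English-to-Spanish language pair are illustrative assumptions, not part of the announcement.

```swift
import SwiftUI
import Translation  // Apple’s on-device Translation framework (iOS 18+)

// Hypothetical chat bubble: shows an incoming message translated into the
// reader’s language, entirely on-device, falling back to the original text.
struct TranslatedMessageView: View {
    let original: String              // incoming message text
    @State private var translated: String?

    var body: some View {
        Text(translated ?? original)
            // The system hands the closure a TranslationSession once the
            // language pair (English -> Spanish here, purely for illustration) is ready.
            .translationTask(
                source: Locale.Language(identifier: "en"),
                target: Locale.Language(identifier: "es")
            ) { session in
                do {
                    let response = try await session.translate(original)
                    translated = response.targetText
                } catch {
                    // Leave `translated` nil so the untranslated text keeps
                    // showing, e.g. if the language model isn’t downloaded yet.
                }
            }
    }
}
```

Because the model runs locally, a sketch like this never sends message text off the device, which is the same privacy guarantee Apple is promising for the new Live Translation API.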

Apple hasn’t yet revealed the full list of supported languages. The press release points to initial support for widely used languages, including English (US/UK), French, German, Italian, Japanese, Korean, Portuguese, Spanish, and Chinese, on compatible devices.

Wired, The Verge, and others place this update under the broader Apple Intelligence initiative, introduced at WWDC alongside the “Liquid Glass” visual overhaul and enhancements to Visual Intelligence. Unlike cloud-dependent AI from competitors, Apple processes translations entirely on-device, protecting user data.

Live Translation joins features like call screening, Hold Assist, and instant text translation, making for a tightly integrated system update. Apple plans to release it publicly this fall with iOS 26; the developer beta is available now, and a public beta lands next month.

What this means for users

  • Text messages are translated automatically, removing language barriers from any conversation.
  • Video calls include subtitles in your language while keeping the speaker’s voice intact.
  • Voice calls automatically translate back and forth, smoothing communication.
  • Developers gain tools to bring translation to more apps.

This falls in line with Apple’s privacy-first strategy: translation happens entirely on-device, with no data sent to external servers. Users should expect broader language support when iOS 26 launches publicly.
