Apple brings Eye Tracking to iPhone and iPad



Key notes

  • Other updates include Vehicle Motion Cues for motion sickness, CarPlay accessibility features, Apple Vision Pro improvements, and updates to VoiceOver, Magnifier, Braille, Hover Typing, Personal Voice, Live Speech, Virtual Trackpad, and Switch Control.

Apple today announced several new accessibility features coming to its products in the coming months. The main highlight of the announcement is Eye Tracking support on iPhone and iPad.

With the upcoming Eye Tracking support, you will be able to control your iPhone and iPad using just your eyes. It works across all iPadOS and iOS apps without any additional hardware or accessories. You can navigate through the UI elements of any app and use Dwell Control to activate each element, and you can access physical buttons, swipes, and other gestures with your eyes alone. Eye Tracking uses the front-facing camera on your iPhone or iPad, and thanks to on-device machine learning, it works entirely on your device.

Music Haptics is another new accessibility feature, which will help users who are deaf or hard of hearing experience music on iPhone. With this feature, the Taptic Engine in iPhone plays taps, textures, and refined vibrations in time with the music's audio. For now, the feature works with the Apple Music app, but Apple will make Music Haptics available as an API so developers can make music more accessible in their own apps.

With the new Vocal Shortcuts, you can assign custom utterances that Siri can understand to launch shortcuts and complete complex tasks. The new Listen for Atypical Speech feature enhances speech recognition for a wider range of speech.

Vehicle Motion Cues is a new feature that can help reduce motion sickness when you are riding in a moving vehicle. When the feature is turned on from Control Center, animated dots on the edges of the screen represent changes in vehicle motion to help reduce sensory conflict without interfering with the main content.

CarPlay gets the following accessibility features:

  • With Voice Control, you can navigate CarPlay and control apps with your voice.
  • With Sound Recognition, drivers or passengers who are deaf or hard of hearing can turn on alerts to be notified of car horns and sirens.
  • For users who are colorblind, Color Filters make the CarPlay interface visually easier to use, with additional visual accessibility features including Bold Text and Large Text.

Apple Vision Pro is getting the following accessibility improvements:

  • Systemwide Live Captions, so users who are deaf or hard of hearing can follow along with spoken dialogue.
  • Capability to move captions using the window bar during Apple Immersive Video, as well as support for additional Made for iPhone hearing devices and cochlear hearing processors.
  • Addition of Reduce Transparency, Smart Invert, and Dim Flashing Lights for users who have low vision, or those who want to avoid bright lights and frequent flashing.

Other accessibility feature updates:

  • For users who are blind or have low vision, VoiceOver will include new voices, a flexible Voice Rotor, custom volume control, and the ability to customize VoiceOver keyboard shortcuts on Mac.
  • Magnifier will offer a new Reader Mode and the option to easily launch Detection Mode with the Action button.
  • Braille users will get a new way to start and stay in Braille Screen Input for faster control and text editing; Japanese language availability for Braille Screen Input; support for multi-line braille with Dot Pad; and the option to choose different input and output tables.
  • For users with low vision, Hover Typing shows larger text when typing in a text field, and in a user’s preferred font and color.
  • For users at risk of losing their ability to speak, Personal Voice will be available in Mandarin Chinese. Users who have difficulty pronouncing or reading full sentences will be able to create a Personal Voice using shortened phrases.
  • For users who are nonspeaking, Live Speech will include categories and simultaneous compatibility with Live Captions.
  • For users with physical disabilities, Virtual Trackpad for AssistiveTouch lets them control their device using a small region of the screen as a resizable trackpad.
  • Switch Control will include the option to use the cameras in iPhone and iPad to recognize finger-tap gestures as switches.
  • Voice Control will offer support for custom vocabularies and complex words.
