WWDC 2024 is just around the corner. Apple took the opportunity to announce several accessibility features coming soon to iPhone, iPad, and other devices. The standout is called Eye Tracking, and it will let you control your phone or tablet using only your eyes. Although this has not yet been confirmed, it is logical to assume it will arrive with the release of iOS 18 and iPadOS 18.
Of course, this new feature was designed for users whose limited mobility prevents them from physically interacting with the iPhone or iPad screen. That said, it will appear in the accessibility settings of both devices, available to anyone who wants to try it.
The most interesting thing about this feature, which we will surely see in iOS and iPadOS 18, is that gaze control will be built in. Previously, Apple allowed you to interact with the iPad using only your vision, but this required an external eye-tracking accessory.
Representatives from Cupertino explain that the new Eye Tracking feature will use the front camera of the iPhone or iPad to detect and track the user's gaze. The feature can be activated and calibrated in seconds, and it relies on on-device machine learning to power its operation.
While eye control will certainly be included in iOS/iPadOS 18, that doesn't necessarily mean it will be exclusive to those versions of the software. It will likely also be extended to older versions of the operating systems, with final compatibility to be announced at WWDC 2024.
Apple will let you control your iPhone or iPad with your eyes starting with iOS and iPadOS 18
Apple shared a video showing how Eye Tracking works. The clip features a young woman in a wheelchair interacting with an iPad using only her eyes. In the demo, you can see her open an app, navigate through its different sections, and perform other actions (in this case, play a podcast).
Besides eye tracking itself, this iOS and iPadOS 18 feature leverages other technologies that already exist in the Apple ecosystem. We are talking about Dwell Control, which has long been present in macOS and allows you to perform mouse actions using your eyes or face.
“With eye tracking, users can navigate elements of the app and use Dwell to activate each element, accessing additional features such as physical buttons, swipes and other gestures using just their eyes.”
Apple
But controlling your iPhone or iPad with your gaze won't be the only accessibility feature coming to iOS 18. Another interesting addition is Music Haptics, which will use the iPhone's vibration motor to create musical experiences for people who are deaf or hard of hearing. According to Apple, this option is already compatible with millions of songs on Apple Music.
Additionally, iOS 18 will include new Vocal Shortcuts, which will allow users to define their own custom utterances for Siri to recognize and perform certain actions. An option will also be added to help reduce motion sickness when using the iPhone in a moving vehicle.
Apple also plans to introduce more accessibility features in Apple Vision Pro (Live Captions, for example) and in CarPlay. The latter will gain options to control apps using your voice, along with new tools for people with color blindness and hearing difficulties.
Source: Hipertextual

I’m Ben Stock, a highly experienced and passionate journalist with a career in the news industry spanning more than 10 years. I specialize in writing content for websites, including researching and interviewing sources to produce engaging articles. My current role is as an author at Gadget Onus, where I mainly cover the mobile section.