A feature of the Apple Vision Pro could reveal passwords, PINs, and all sorts of personal information. Researchers found that eye tracking can be used to decipher what you type on the virtual keyboard and access this information. The attacker does not need access to the device, since the data is extracted while you use your virtual avatar during a video or live call.

According to Wired, a team of researchers from the University of Florida and Texas Tech University has developed a method to extract personal data from the Apple Vision Pro. The attack, called GAZEploit, exploits a vulnerability in gaze-driven text input when using an avatar: it records and analyzes the avatar's eye movements to reconstruct the text typed on the virtual keyboard.

GAZEploit is a unique attack because it does not require taking control of the device and can be carried out in any virtual meeting in which your avatar participates. An attacker could pose as a recruiter and invite you to a Zoom video call to steal your information. “Based on the direction of the victim’s eye movements, the attacker can determine which key the victim is pressing,” said Hanqiu Wang, one of the study’s authors.

How GAZEploit Works on Apple Vision Pro

GAZEploit takes advantage of the Vision Pro's eye tracking, a key navigation feature of the device. The headset uses four infrared cameras that track the user's eye movements and detect where in the environment they are looking.

During their analysis, the experts found that the direction of the gaze tends to be more focused and exhibits a periodic pattern while typing. When you open the virtual keyboard, your eye movement changes as you move between keys and stays fixed on the one you press. GAZEploit relies on an algorithm that detects these keystroke moments with over 85% accuracy.
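The article does not spell out the detection algorithm, but the general idea of flagging moments when the gaze stays fixed can be sketched with a simple dispersion threshold over a sliding window of gaze samples. The window size and threshold below are invented for illustration, not the values used by the GAZEploit authors:

```python
from typing import List, Tuple

def detect_fixations(gaze: List[Tuple[float, float]],
                     window: int = 5,
                     max_dispersion: float = 0.01) -> List[int]:
    """Return start indices of windows where the gaze stays within a
    small bounding box -- a crude fixation (candidate keystroke) detector.

    `gaze` is a sequence of (x, y) gaze samples; the thresholds are
    illustrative placeholders, not the researchers' calibrated values.
    """
    hits = []
    for i in range(len(gaze) - window + 1):
        xs = [p[0] for p in gaze[i:i + window]]
        ys = [p[1] for p in gaze[i:i + window]]
        # Dispersion = spread of the window in x plus spread in y;
        # a saccade between keys makes this value spike.
        dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
        if dispersion < max_dispersion:
            hits.append(i)
    return hits
```

A steady run of samples (a key press) produces a hit, while the jump between two keys does not, which is what gives typing its periodic fixation-saccade signature.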

To extract the information, the researchers built a recurrent neural network, a model suited to recognizing patterns in sequential data, and trained it with a cross-entropy loss. The network was trained on data from 30 participants whose typing patterns and eye movements were recorded.
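The article gives no details of the architecture beyond "recurrent" and "cross entropy", so the following is only an illustrative sketch of those two building blocks: a single vanilla-RNN step over a gaze-feature vector, and the cross-entropy loss against the true key index. All weights and dimensions here are made up:

```python
import math

def rnn_step(x, h, Wxh, Whh, b):
    """One vanilla-RNN step: h' = tanh(Wxh @ x + Whh @ h + b)."""
    return [math.tanh(sum(w * xi for w, xi in zip(Wxh[j], x))
                      + sum(w * hi for w, hi in zip(Whh[j], h))
                      + b[j])
            for j in range(len(h))]

def softmax(logits):
    """Turn output scores into a probability distribution over keys."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(probs, target):
    """Cross-entropy loss for the true key index `target`."""
    return -math.log(probs[target])
```

In training, gaze features for each detected fixation would be fed through recurrent steps like `rnn_step`, and the loss between the predicted key distribution and the key actually pressed would be minimized over the 30 participants' data.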

While it sounds simple, the GAZEploit attack analyzes several signals to produce its result. The researchers compute the stability of the gaze trajectory, along with the fixations and saccades that occur when the gaze jumps from one point to another. Knowing the layout of the virtual keyboard is also important, so the distances between reference keys such as Q, P, Enter, and 123 are measured.
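Once the keyboard's geometry has been estimated, each fixation can be snapped to the nearest key. A minimal sketch of that last step, assuming a hypothetical normalized layout (the key coordinates below are invented for illustration; the real Vision Pro layout and the authors' calibration method are not given in the article):

```python
import math

# Hypothetical normalized screen positions for the four reference keys
# mentioned in the article; placeholder values, not the real layout.
KEY_POS = {"Q": (0.05, 0.2), "P": (0.95, 0.2),
           "ENTER": (0.95, 0.8), "123": (0.05, 0.8)}

def nearest_key(fixation, layout=KEY_POS):
    """Map a gaze fixation point to the closest key by Euclidean distance."""
    return min(layout, key=lambda k: math.dist(fixation, layout[k]))
```

Measuring the distances between far-apart keys like Q, P, Enter, and 123 lets the attacker scale and orient this layout to match the victim's virtual keyboard before classifying each fixation.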

In a series of tests, Hanqiu Wang and his team predicted the text of messages with 92.1% accuracy. For passwords, the attack succeeded 77% of the time within five attempts, and for PINs 73% of the time. GAZEploit could also guess a URL or email address with 86.1% accuracy.


Apple has already fixed the vulnerability in visionOS 1.3.

The researchers describe GAZEploit as the first attack that uses gaze information to remotely infer keystrokes. Although the method has a high accuracy rate, it has never been used in the wild: the tests were conducted in a controlled environment with 30 participants, and there is no evidence that a third party has developed a similar method.

In any case, Apple fixed the bug after the researchers shared the GAZEploit findings. The visionOS 1.3 update references the vulnerability as CVE-2024-40865; Apple notes that the issue was addressed by suspending the avatar when the virtual keyboard is active.

Source: Hiper Textual

I'm Ben Stock, a highly experienced and passionate journalist with a career in the news industry spanning more than 10 years. I specialize in writing content for websites, including researching and interviewing sources to produce engaging articles. My current role is as an author at Gadget Onus, where I mainly cover the mobile section.
