Google I/O 2024 was marked by many new developments related to artificial intelligence, as well as Android 15. But the company also made a very important announcement that went largely unnoticed: the launch of Project Gameface, a technology that lets users control Android applications using only their face. It is an accessibility feature similar to the one Apple has introduced, which will arrive with iOS 18.

Google describes Project Gameface as “a mouse that allows people to control a computer cursor using head movements and facial gestures.” For example, the user can open their mouth to move the cursor, or raise their eyebrows to click and drag elements, the company explains on its blog.

For this, Google uses the front camera built into the device itself, which recognizes the user’s facial gestures and associates each gesture with a specific action, such as smiling to press. Project Gameface also lets you customize the intensity and speed of each gesture. Thus, an action will only be triggered if the smile is pronounced enough, or if the eyebrows are raised faster or slower, and so on.
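To illustrate the idea, here is a minimal, hypothetical sketch of how such a threshold system might work. MediaPipe’s face tracking reports each facial gesture (a “blendshape” such as a smile or a brow raise) as a score between 0.0 and 1.0; an action fires only when the score crosses a user-configured threshold. The binding names and structure below are illustrative assumptions, not Project Gameface’s actual code.

```python
from dataclasses import dataclass

@dataclass
class GestureBinding:
    gesture: str       # blendshape name, e.g. "mouthSmileLeft"
    action: str        # action to trigger, e.g. "click"
    threshold: float   # minimum score (gesture intensity) required

def actions_for_frame(scores: dict[str, float],
                      bindings: list[GestureBinding]) -> list[str]:
    """Return the actions whose gesture score meets its threshold."""
    return [b.action for b in bindings
            if scores.get(b.gesture, 0.0) >= b.threshold]

# Hypothetical user configuration: a pronounced smile clicks,
# raised eyebrows start a drag.
bindings = [
    GestureBinding("mouthSmileLeft", "click", threshold=0.7),
    GestureBinding("browInnerUp", "drag", threshold=0.5),
]

# A faint smile (0.4) stays below its threshold; the brow raise (0.6) fires.
print(actions_for_frame({"mouthSmileLeft": 0.4, "browInnerUp": 0.6}, bindings))
# → ['drag']
```

Raising a gesture’s threshold is exactly the “more pronounced smile” customization described above: the same gesture produces a lower or higher score depending on intensity, and only scores above the threshold become actions.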

Google’s Project Gameface can be used in Android applications

Google, which has been working on this project for some time, implements this system on Android via MediaPipe’s API, which offers artificial intelligence and machine learning solutions for face detection, facial landmarks, and more, and is capable of recognizing 52 facial gestures.

Currently, Project Gameface is available on GitHub so that developers can easily implement it in their applications.

We’re opening up additional source code for Project Gameface to help developers create Android apps and make all Android devices more accessible. Through the device camera, [Project Gameface] seamlessly tracks facial expressions and head movements, translating them into intuitive, personalized control.

Google.


Although it’s not exactly the same technology as Google’s, Apple also did not miss the opportunity to introduce a new iOS feature that lets users control the system, as well as some applications, using only their eyes.

This accessibility feature is coming to iOS 18 and iPadOS 18 and uses the front camera, as confirmed by the company. “With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes, and other gestures solely with their eyes,” they say.

Source: Hiper Textual

I'm Ben Stock, a highly experienced and passionate journalist with a career in the news industry spanning more than 10 years. I specialize in writing content for websites, including researching and interviewing sources to produce engaging articles. My current role is as an author at Gadget Onus, where I mainly cover the mobile section.
