Every year, mobile devices get bigger and bigger. This is simply the market responding to user preferences, which over time have shifted toward large screens. But the trend has a drawback: using a smartphone with one hand has, in many cases, become an almost titanic task.
Now, however, a group of researchers seems to have found a solution… or at least part of one. As reported by Android Police, researchers at Carnegie Mellon University’s Future Interfaces Group have developed an alternative that lets us use our smartphone with just our eyes.
The aforementioned source notes in its report that, unlike many accessibility projects, this study aims to make life easier for everyone. The researchers confirm this in a press release:
“Big tech companies like Google and Apple have come pretty close with gaze prediction, but just looking at something is not enough. The real innovation of this project is the addition of a second modality, such as flicking the phone left or right, combined with gaze prediction. That’s what makes it powerful. It seems so obvious in hindsight, but this clever idea makes EyeMU much more intuitive.”
Chris Harrison, Associate Professor
The new invention is called EyeMU, and its purpose is quite simple: to make handling today’s excessively large phones more comfortable. On the Future Interfaces Group YouTube channel, the team posted a video showing it in action.
A Commitment to Accessibility
While the technology is firmly committed to using vision to control the phone, gaze will not be the only input. In the video uploaded by the researchers, we can already see that the phone can largely be operated with gaze alone, combined with a few gestures. That makes the technology promise to be far more useful than it might seem.
Take the photo app, for example. Looking at an image selects it, bringing the phone closer to your face zooms in, and flicking the phone left or right applies filters.
TechCrunch
Best of all, you don’t need anything beyond what’s already on the device. EyeMU works with the phone’s front camera, so no accessories or external hardware are required to use this accessibility technology.
Something the group seems to have understood quite well, unlike similar technologies in the past, is that not all control can fall to the eyes; otherwise we couldn’t do very much. That’s why EyeMU offers a series of gestures to select, zoom, or switch, while gaze is used mostly as a selection indicator.
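To make that division of labor concrete, here is a minimal hypothetical sketch of how gaze-as-selector plus motion-as-action could be wired together on Android. It is not EyeMU’s actual code: every class and function name below is invented for illustration, standing in for a gaze estimator fed by the front camera and a gesture detector fed by the motion sensors.

```kotlin
// Hypothetical sketch: gaze only marks the target, a motion gesture triggers the action.

enum class Gesture { FLICK_LEFT, FLICK_RIGHT, PULL_CLOSER, NONE }

data class GazePoint(val x: Float, val y: Float)

class EyePlusMotionController(
    private val gazeEstimator: (cameraFrame: ByteArray) -> GazePoint,
    private val gestureDetector: (accelSample: FloatArray) -> Gesture
) {
    private var selected: GazePoint? = null

    fun onCameraFrame(frame: ByteArray) {
        // Gaze alone never triggers an action; it only keeps track of what the
        // user is looking at, avoiding the "Midas touch" problem of eye-only UIs.
        selected = gazeEstimator(frame)
    }

    fun onMotionSample(accelSample: FloatArray): String {
        val target = selected ?: return "nothing selected"
        // The motion gesture is the confirmation step: flicking or pulling the
        // phone applies an action to whatever the gaze currently selects.
        return when (gestureDetector(accelSample)) {
            Gesture.FLICK_LEFT  -> "previous filter on item at (${target.x}, ${target.y})"
            Gesture.FLICK_RIGHT -> "next filter on item at (${target.x}, ${target.y})"
            Gesture.PULL_CLOSER -> "zoom in on item at (${target.x}, ${target.y})"
            Gesture.NONE        -> "waiting for a gesture"
        }
    }
}
```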

Of course, EyeMU is still at the experimental stage. Even so, it promises a bright future in the world of technology, and not just for people who need more accessibility on their phones, but for everyone else as well.
Source: Hiper Textual
