As part of Global Accessibility Awareness Day 2024, Google is showing off some Android updates that should be helpful for people with mobility or vision issues.
Project Gameface, which lets players use their faces to move the cursor and perform common click-style actions on the desktop, is now coming to Android.
The project allows people with limited mobility to use facial movements, such as raising an eyebrow, moving their mouth, or turning their head, to trigger a variety of functions. The basics include a virtual cursor, but there are also compound gestures: you can, for example, define the start and end of a swipe by opening your mouth, moving your head, and then closing your mouth.
It can be customized to each person's capabilities, and Google researchers are working with Incluzza in India to test and improve the tool. Certainly, for many people, the ability to simply and easily play the thousands of games on Android (well, probably millions, but thousands of good ones) will be more than welcome.
There is a great video that shows the product in action and being personalized; in it, Jeeja, the woman in the preview image, talks about adjusting how far she needs to move her head to trigger a gesture.
That kind of granular adjustment is as important as someone being able to set the sensitivity of their mouse or trackpad.
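The per-gesture sensitivity idea can be sketched in a few lines. To be clear, this is purely illustrative and not Gameface's actual code; every name, gesture, and threshold value below is invented for the example:

```python
# Hypothetical sketch: map per-frame facial-gesture intensities (0.0-1.0,
# as a face-tracking model might report them) to user actions, where each
# user sets their own trigger threshold per gesture.

from dataclasses import dataclass

@dataclass
class GestureBinding:
    action: str       # e.g. "swipe_start", "cursor_click" (invented names)
    threshold: float  # 0.0-1.0; a lower threshold means higher sensitivity

def triggered_actions(intensities: dict[str, float],
                      bindings: dict[str, GestureBinding]) -> list[str]:
    """Return actions whose gesture intensity meets the user's threshold."""
    return [b.action for gesture, b in bindings.items()
            if intensities.get(gesture, 0.0) >= b.threshold]

# A user with limited head movement lowers the head-turn threshold so a
# small turn is enough, while mouth gestures require a deliberate motion.
bindings = {
    "mouth_open": GestureBinding("swipe_start", threshold=0.6),
    "head_turn_right": GestureBinding("cursor_right", threshold=0.2),
    "eyebrow_raise": GestureBinding("cursor_click", threshold=0.5),
}

frame = {"mouth_open": 0.7, "head_turn_right": 0.25, "eyebrow_raise": 0.1}
print(triggered_actions(frame, bindings))  # ['swipe_start', 'cursor_right']
```

The point of the sketch is that the same physical movement produces different intensities for different people, so the trigger point has to be a per-user setting rather than a hard-coded constant.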
Another feature is aimed at people who can't easily operate a keyboard, whether on-screen or physical: a new text-free "Look to Speak" mode that lets people select and send emoji, either on their own or as stand-ins for a phrase or action.
You can also add your own images, so someone can keep common phrases and emoji on speed dial, alongside frequent contacts represented by their photos, all accessible with a few glances.
For people with vision problems, there are a variety of tools (of varying effectiveness, no doubt) that let the user identify what the phone's camera sees. The use cases are countless, so sometimes it's best to start with something simple, like finding an empty chair or recognizing the person's keychain and pointing them to it.
Users will be able to add custom objects or location awareness so that the instant description feature gives them what they need and not just a list of generic objects like “a cup and plate on a table.” What cup?!
Apple also showed off some accessibility features yesterday, and Microsoft has some too. Take a minute to examine these projects, which rarely get the mainstream treatment (although Gameface did) but are of great importance to those for whom they are designed.