With raised eyebrows or a smile, people with speech or physical impairments can now use their Android smartphones hands-free, Google said Thursday. Two new tools put machine learning and smartphones' front-facing cameras to work detecting facial and eye movements. Users can scan their phone screen and select a task by smiling, raising their eyebrows, opening their mouths, or looking left, right, or up.
“To make Android more accessible to everyone, we’re launching new tools that make it easier to control your phone and communicate using facial gestures,” Google said.
The Centers for Disease Control and Prevention estimates that 61 million adults in the United States live with a disability, which has prompted Google and its competitors Apple and Microsoft to make products and services more accessible.
“Every day, people use voice commands, like ‘Hey Google’, or their hands to navigate their phones,” the tech giant said in a blog post.
“However, this is not always possible for people with severe motor and speech disorders.”
The changes come from two new features. The first, called "Camera Switches," allows people to use their faces instead of swipes and taps to interact with their smartphones.
The other is Project Activate, a new Android app that lets people use these gestures to trigger an action, like playing a recorded phrase on a phone, sending a text, or making a call.
“Now, it’s possible for anyone to use eye movements and facial gestures tailored to their range of motion to navigate their phone, without hands or voices,” Google said.
The free Activate app is available in Australia, Great Britain, Canada and the United States from the Google Play store.
Apple, Google, and Microsoft have consistently rolled out innovations that make Internet technology more accessible to people with disabilities, or to those who find that age has made tasks such as reading more difficult.
Voice-activated digital assistants built into speakers and smartphones can allow people with sight or movement problems to tell computers what to do.
There is software that identifies text on web pages or in images and then reads it aloud, as well as automatic generation of captions that display what is said in videos.
An "AssistiveTouch" feature that Apple has built into its smartwatch software lets users control the touch screen without touching it, by detecting movements such as finger pinches or hand clenches.
"This feature also works with VoiceOver so you can navigate Apple Watch with one hand while using a cane or leading a service animal," Apple said in a post.
Microsoft, meanwhile, describes accessibility as essential to enabling everyone to use technology.
"To enable transformative change, accessibility must be a priority," Microsoft said in a post.
“We aim to integrate it into what we design for every team, organization, classroom and home.”