With raised eyebrows or a smile, people with speech or physical impairments can now use their Android smartphones hands-free, Google said Thursday.
Two new tools use machine learning and smartphones' front-facing cameras to detect facial and eye movements.
Users can scan their phone screen and select a task by smiling, raising eyebrows, opening their mouths, or looking left, right, or up.
“To make Android more accessible to everyone, we’re launching new tools that make it easier to control your phone and communicate using facial gestures,” Google said.
The Centers for Disease Control and Prevention estimates that 61 million adults in the United States live with a disability, which has prompted Google and its competitors Apple and Microsoft to make products and services more accessible.
“Every day, people use voice commands, like ‘Hey Google’, or their hands to navigate their phones,” the tech giant said in a blog post.
“However, this is not always possible for people with severe motor and speech disorders.”
The changes come from two new features. One, called Camera Switches, lets people use facial gestures instead of swiping and tapping to interact with their smartphones.
The other is Project Activate, a new Android app that lets people use the same gestures to trigger an action, such as playing a recorded phrase, sending a text, or making a call.
“Now, it’s possible for anyone to use eye movements and facial gestures tailored to their range of motion to navigate their phone – without hands or voices,” Google said.
The free Project Activate app is available from the Google Play store in Australia, Canada, Great Britain, and the United States.