Two new tools put machine learning and front-facing cameras on smartphones to work detecting face and eye movements.
Users can scan their phone screen and select a task by smiling, raising their eyebrows, opening their mouth, or looking to the left, right, or up.
"To make Android more accessible for everyone, we're
launching new tools that make it easier to control your phone and communicate
using facial gestures," Google said.
The Centers for Disease Control and Prevention estimates
that 61 million adults in the United States live with disabilities, which has
pushed Google and rivals Apple and Microsoft to make products and services more
accessible to them.
"Every day, people use voice commands, like 'Hey
Google', or their hands to navigate their phones," the tech giant said in
a blog post.
"However, that's not always possible for people with
severe motor and speech disabilities."
The changes are the result of two new features. The first,
called "Camera Switches," lets people use their faces instead
of swipes and taps to interact with their smartphones.
The other is Project Activate, a new Android application
that allows people to use those gestures to trigger an action, such as having
the phone play a recorded phrase, send a text, or make a call.
"Now it's possible for anyone to use eye movements and
facial gestures that are customised to their range of movement to navigate
their phone — sans hands and voice," Google said.
The free Activate app is available in Australia, Britain,
Canada, and the United States on the Google Play Store.
Apple, Google, and Microsoft have consistently rolled out
innovations that make Internet technology more accessible to people with
disabilities, or to those for whom age has made some tasks, such as reading,
more difficult.
Voice-commanded digital assistants built into speakers and
smartphones can enable people with sight or movement challenges to tell
computers what to do.
There is software that identifies text on web pages or in
images and reads it aloud, as well as software that automatically generates
captions displaying what is said in videos.
An "AssistiveTouch" feature that Apple built into
the software powering its smart watch lets touchscreen displays be controlled
by sensing movements such as finger pinches or hand clenches.
"This feature also works with VoiceOver so you can
navigate Apple Watch with one hand while using a cane or leading a service
animal," Apple said in a post.
Computing colossus Microsoft describes accessibility as essential
to empowering everyone with technology tools.
"To enable transformative change accessibility needs to
be a priority," Microsoft said in a post.
"We aim to build it into what we design for every team,
organisation, classroom, and home."