An Android Accessibility Feature In Beta Lets You Control Your Phone With Facial Expressions

Wednesday, September 1, 2021

It has been announced that Google is working on a new accessibility feature. Reportedly, the plan is to let Android users control their phone using facial expressions, such as a smile, a frown, or raised eyebrows.

This is part of a ‘Camera Switches’ feature in the Accessibility Suite app, which has been released with Android 12’s fourth beta. However, before you rush to Google Play to download it, note that it isn’t available there yet. If you want to try it, there is an APK you can sideload.

There are a number of other facial expressions that can be used to control the phone, including looking in different directions: left, right, and up. These expressions let the user navigate the phone by scrolling, going to the home screen, or opening notifications and Quick Settings.

Alongside this, you will be able to adjust the software’s sensitivity. If it misses your movements, or triggers too often, you can tune the settings to reduce accidental activation. The feature is reportedly quite power-hungry, so it is recommended that the phone be plugged in while it’s in use.

These facial commands can be a great addition for people who find touch controls difficult to use. Voice commands have been around for some time, but people are less inclined to use them in quiet places or in public. Facial commands can fill this gap because they are silent.

Android has had an Accessibility API for a while now; its purpose is to encourage and support developers in creating apps that help people with disabilities. Google intends for these apps to fall into a number of categories, including screen readers, switch-based input systems, and input systems based on voice commands.
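
As a rough illustration of how a third-party tool plugs into this API, here is a minimal sketch of a custom accessibility service in Kotlin. The class name is made up for the example, and a real switch-input service would do far more, but the overridden callbacks and the global actions are the standard entry points.

```kotlin
import android.accessibilityservice.AccessibilityService
import android.view.accessibility.AccessibilityEvent

// Illustrative only: a bare-bones accessibility service. It must also be
// declared in AndroidManifest.xml with the BIND_ACCESSIBILITY_SERVICE
// permission before the user can enable it in Settings.
class ExampleSwitchService : AccessibilityService() {

    override fun onAccessibilityEvent(event: AccessibilityEvent?) {
        // Called when the system reports a UI event (focus changes, window
        // changes, clicks, ...). A switch-based service typically tracks
        // focus here so it knows what "Select" should act on.
    }

    override fun onInterrupt() {
        // Called when the system wants the service to stop its feedback.
    }

    // System-level navigation is exposed through global actions, which is
    // roughly what commands such as Home, Back and Notifications map onto.
    private fun goHome() {
        performGlobalAction(AccessibilityService.GLOBAL_ACTION_HOME)
    }
}
```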

Google has its own app, the ‘Android Accessibility Suite’, designed so that people can use their devices no matter what physical limitations they might have. A beta version of the suite, included in the Android 12 beta, was released onto Pixel phones on August 11, and it is this update that rolled out ‘Camera Switches’. It sits within Switch Access, which lets a user connect another device through Bluetooth or USB to choose items, scroll, and type, among other things.

At the moment, Camera Switches works with a small number of gestures and controls, though there is a good chance this list will grow over time. You have control over which expression initiates each command, so you can, for example, set the app to take you to the home screen when you smile (a simple sketch of such a mapping follows the two lists below).

These are the face gestures that the app can analyse:

  • Open Mouth
  • Smile
  • Raise Eyebrows
  • Look Left
  • Look Right
  • Look Up

These are the actions that can be initiated with Camera Switch:

  • Pause Camera Switch
  • Toggle auto-scan (disabled)
  • Reverse auto-scan 
  • Select
  • Next
  • Previous
  • Touch & Hold
  • Scroll forward
  • Scroll backward
  • Home
  • Back
  • Notifications
  • Quick Settings
  • Overview
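
To make the idea of user-assignable bindings concrete, here is a purely hypothetical sketch of mapping gestures to actions. Camera Switches’ actual configuration screens and internals are not public, so the names and structure below are assumptions for illustration only; the gesture and action names simply mirror the lists above.

```kotlin
// Hypothetical sketch: this is not Camera Switches' real code or API.
enum class FaceGesture { OPEN_MOUTH, SMILE, RAISE_EYEBROWS, LOOK_LEFT, LOOK_RIGHT, LOOK_UP }
enum class SwitchAction { SELECT, NEXT, PREVIOUS, SCROLL_FORWARD, SCROLL_BACKWARD, HOME, BACK, NOTIFICATIONS }

// User-configurable bindings, e.g. "smile -> go to the home screen".
val bindings = mutableMapOf(
    FaceGesture.SMILE to SwitchAction.HOME,
    FaceGesture.OPEN_MOUTH to SwitchAction.SELECT,
    FaceGesture.LOOK_RIGHT to SwitchAction.NEXT,
    FaceGesture.LOOK_LEFT to SwitchAction.PREVIOUS,
)

// When the camera pipeline reports a recognised gesture, look up the action.
fun onGestureDetected(gesture: FaceGesture): SwitchAction? = bindings[gesture]
```

The point of the sketch is simply that the binding layer is data rather than fixed behaviour, which is why the app can let you reassign expressions freely.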

Unfortunately, not every application built with the Accessibility Service API has been so helpful. The API is extremely powerful and exists to help people with disabilities use Android devices, but malicious apps have abused it in the past to spy on people, inserting fake overlays and intercepting input, among other tactics. Because of this abuse, Google moved to restrict access to the API back in 2017.

There was a lot of backlash about this from developers whose legitimate applications needed access to the API, and as a result Google removed the restrictions it had put in place. More recently, though, it added this to the Google Play policy guidelines: ‘only services that are designed to help people with disabilities access their device or otherwise overcome challenges stemming from their disabilities are eligible to declare that they are accessibility tools.’

Despite the misuse that the API’s openness has made possible, it is still a fantastic resource that helps people all over the world. It’s exciting to see technology this sophisticated, particularly when it is used for good. While the facial recognition commands might be slightly limited at present, there is plenty of scope for them to become far more powerful.